TL;DR: Approximate Inference methods made easy
<p>A closed-form solution to a machine learning model is one that can be written down using a finite number of standard mathematical operations. For example, linear regression has a closed-form solution if the Gram matrix of the design matrix (X&#7488;X) is invertible; otherwise, we obtain a solution via iterative optimisation.</p>
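<p>As a minimal sketch of that closed-form solution, the ordinary least squares estimate can be computed directly from the normal equations (the variable names and synthetic data below are illustrative, not from the article):</p>

```python
import numpy as np

# Synthetic regression problem with a known coefficient vector
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_beta = np.array([1.5, -2.0, 0.5])
y = X @ true_beta + 0.1 * rng.normal(size=100)

# Closed form: beta_hat = (X^T X)^{-1} X^T y.
# Using solve() avoids forming the explicit inverse and assumes
# X^T X is invertible, as the text requires.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
```

<p>When X&#7488;X is singular (e.g. collinear features), <code>solve</code> raises an error, which is exactly the situation where one falls back to iterative optimisation or regularisation.</p>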
<p><img alt="" src="https://miro.medium.com/v2/1*gOOz91wskRFGwRqOMTgUtA.png" style="width:700px" /></p>
<p>Bayesian models do not typically have exact closed-form solutions for their posterior distributions. What typically helps is choosing simple models, Gaussian likelihood functions, and conjugate priors. A prior distribution is said to be <em>conjugate</em> to a likelihood function if the resulting posterior belongs to the same distribution family as the prior.</p>
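<p>A classic instance of conjugacy, sketched here as an assumed example (the article itself does not use coin flips): a Beta prior on a coin's bias is conjugate to the Bernoulli likelihood, so the posterior update is just pseudo-count arithmetic.</p>

```python
import numpy as np

# Beta(a, b) prior on the probability of heads (assumed prior values)
a, b = 2.0, 2.0
data = np.array([1, 0, 1, 1, 0, 1])  # observed coin flips
heads = int(data.sum())
tails = len(data) - heads

# Conjugacy: posterior is Beta(a + heads, b + tails),
# the same family as the prior.
post_a, post_b = a + heads, b + tails
posterior_mean = post_a / (post_a + post_b)  # mean of a Beta distribution
```
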
<p><img alt="" src="https://miro.medium.com/v2/resize:fit:700/1*xrCq2K9jbBp84qYU5UD5oQ.png" style="height:251px; width:700px" /></p>
<p>Bayesian linear regression typically assumes a Gaussian prior over the regression coefficients together with a Gaussian likelihood. When we update the prior with the observed data (using Bayes’ theorem), the resulting posterior distribution over the coefficients is again Gaussian. It can be written down analytically and sampled using standard methods in Python.</p>
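<p>The analytic posterior described above can be sketched as follows, assuming an isotropic N(0, &tau;&sup2;I) prior on the weights and a known noise variance &sigma;&sup2; (both values below are illustrative assumptions, not taken from the article):</p>

```python
import numpy as np

# Synthetic data from a known weight vector
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 2))
w_true = np.array([0.8, -1.2])
sigma, tau = 0.1, 1.0  # noise std and prior std (assumed known)
y = X @ w_true + sigma * rng.normal(size=50)

# Gaussian posterior N(mu, S) over the weights:
#   S^{-1} = X^T X / sigma^2 + I / tau^2
#   mu     = S X^T y / sigma^2
S_inv = X.T @ X / sigma**2 + np.eye(2) / tau**2
S = np.linalg.inv(S_inv)
mu = S @ X.T @ y / sigma**2

# Sampling the posterior with standard NumPy routines
samples = rng.multivariate_normal(mu, S, size=1000)
```

<p>Because the prior and likelihood are conjugate, no approximate inference is needed here; the posterior mean and covariance come straight from linear algebra.</p>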
<p><a href="https://chleon.medium.com/tl-dr-approximate-inference-methods-made-easy-c9652947bc04"><strong>Visit Now</strong></a></p>