<h1>Probability concepts explained: Bayesian inference for parameter estimation</h1>
<p>In the previous blog post I covered the <a href="https://towardsdatascience.com/probability-concepts-explained-maximum-likelihood-estimation-c7b4342fdbb1" rel="noopener" target="_blank">maximum likelihood method for parameter estimation</a> in machine learning and statistical models. In this post we’ll go over another method for parameter estimation: Bayesian inference. I’ll also show how this method can be viewed as a generalisation of maximum likelihood, and under what conditions the two methods are equivalent.</p>
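<p>As a quick preview of the equivalence mentioned above, here is a minimal sketch (not taken from the post itself) using a coin-flipping example: with a Beta(a, b) prior on the heads probability, the posterior after observing k heads in n flips is Beta(a+k, b+n-k), whose mode gives the MAP estimate. When the prior is uniform (a = b = 1), the MAP estimate collapses to the maximum likelihood estimate k/n.</p>

```python
# Sketch: Bayesian MAP estimation of a coin's bias reduces to maximum
# likelihood when the prior is uniform. (Illustrative example; the
# function names and the coin-flip scenario are assumptions, not the
# post's own code.)

def map_estimate(k, n, a=1.0, b=1.0):
    """MAP estimate of the heads probability with a Beta(a, b) prior.

    The posterior is Beta(a + k, b + n - k); its mode is
    (a + k - 1) / (a + b + n - 2).
    """
    return (a + k - 1) / (a + b + n - 2)

def mle_estimate(k, n):
    """Maximum likelihood estimate: the observed frequency of heads."""
    return k / n

k, n = 7, 10  # observed 7 heads in 10 flips
print(map_estimate(k, n))            # uniform Beta(1, 1) prior -> 0.7, same as the MLE
print(mle_estimate(k, n))            # 0.7
print(map_estimate(k, n, a=2, b=2))  # an informative prior pulls the estimate toward 0.5
```

<p>With an informative prior such as Beta(2, 2), the estimate is pulled toward 0.5, which is exactly the "prior acts as regularisation" behaviour that distinguishes Bayesian estimation from plain maximum likelihood.</p>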
<p>Some fundamental knowledge of probability theory is assumed, e.g. marginal and conditional probability. These concepts are explained in my <a href="https://medium.com/@jonnybrooks04/probability-concepts-explained-introduction-a7c0316de465" rel="noopener">first post in this series</a>. It also helps to have some basic knowledge of the Gaussian distribution, but it isn’t necessary.</p>
<p><a href="https://towardsdatascience.com/probability-concepts-explained-bayesian-inference-for-parameter-estimation-90e8930e5348"><strong>Website</strong></a></p>