Probability concepts explained: Bayesian inference for parameter estimation.

In the previous blog post I covered the [maximum likelihood method for parameter estimation](https://towardsdatascience.com/probability-concepts-explained-maximum-likelihood-estimation-c7b4342fdbb1) in machine learning and statistical models. In this post we'll go over another method for parameter estimation: Bayesian inference. I'll also show how this method can be viewed as a generalisation of maximum likelihood, and in which case the two methods are equivalent.

Some fundamental knowledge of probability theory is assumed, e.g. marginal and conditional probability. These concepts are explained in the [first post in this series](https://medium.com/@jonnybrooks04/probability-concepts-explained-introduction-a7c0316de465). Some basic knowledge of the Gaussian distribution also helps, but it isn't necessary.

[Original article](https://towardsdatascience.com/probability-concepts-explained-bayesian-inference-for-parameter-estimation-90e8930e5348)
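As a quick preview of where we're heading, the core of Bayesian parameter estimation is Bayes' theorem applied to a parameter and some observed data. The sketch below uses generic symbols $\theta$ for the parameter and $D$ for the data (not necessarily the exact notation used later in the post):

$$
P(\theta \mid D) \;=\; \frac{P(D \mid \theta)\,P(\theta)}{P(D)}
$$

Here $P(D \mid \theta)$ is the likelihood (the quantity that maximum likelihood estimation maximises on its own), $P(\theta)$ is the prior and $P(\theta \mid D)$ is the posterior. If the prior is uniform, so that every parameter value is considered equally plausible before seeing any data, then the parameter value that maximises the posterior is the same one that maximises the likelihood. That is the sense in which Bayesian inference generalises maximum likelihood, and the uniform-prior case is when the two methods give equivalent point estimates.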