Regularization and Geometry

I. Bias-Variance Tradeoff

When we perform statistical modeling, the goal is not to select a model that fits every training point and achieves the smallest possible training error; the objective is a model that generalizes well to new, unseen data.

[Figure: Bias and variance contributing to generalization error [1] (https://miro.medium.com/v2/resize:fit:491/1*EaLwMYXiXdB8iFQQigZPzw.png)]

As more parameters are added to a model, its complexity increases: it can fit more of the noise in the training data, which leads to overfitting. In other words, we are increasing the variance and decreasing the bias of the model. As the figure shows, if we keep raising the model's complexity, the generalization error eventually passes the optimal point and begins to rise again.
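The U-shaped curve in the figure follows from the standard decomposition of expected squared error at a point x (this derivation is textbook material, not taken from the article itself): here f is the true function, \hat{f} the model fitted on a random training sample, and \sigma^2 the irreducible noise variance.

```latex
% Bias-variance decomposition of expected squared error for y = f(x) + \varepsilon,
% with \mathbb{E}[\varepsilon] = 0 and \operatorname{Var}(\varepsilon) = \sigma^2.
\mathbb{E}\!\left[\big(y - \hat{f}(x)\big)^2\right]
  = \underbrace{\big(\mathbb{E}[\hat{f}(x)] - f(x)\big)^2}_{\text{Bias}^2}
  + \underbrace{\mathbb{E}\!\left[\big(\hat{f}(x) - \mathbb{E}[\hat{f}(x)]\big)^2\right]}_{\text{Variance}}
  + \underbrace{\sigma^2}_{\text{irreducible error}}
```

Raising complexity shrinks the bias term but inflates the variance term; their sum is what passes through an optimum and then climbs again.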
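To see the tradeoff numerically, here is a minimal sketch using polynomial regression, where the polynomial degree stands in for model complexity. The synthetic sine dataset, noise level, and degree choices are illustrative assumptions, not from the article:

```python
# Sketch: training error keeps falling with complexity, test error turns back up.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=200)  # noisy ground truth

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0
)

# Degree controls how much noise the model can absorb.
for degree in [1, 3, 9, 15]:
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_train, y_train)
    train_err = mean_squared_error(y_train, model.predict(X_train))
    test_err = mean_squared_error(y_test, model.predict(X_test))
    print(f"degree={degree:2d}  train MSE={train_err:.3f}  test MSE={test_err:.3f}")
```

Running this, training MSE decreases monotonically with degree, while test MSE bottoms out at a moderate degree and then rises, tracing the same generalization-error curve as the figure above.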