Overfitting, Underfitting, and Regularization
<p>Let’s say you have a <a href="http://bit.ly/quaesita_emperorm" rel="noopener ugc nofollow" target="_blank">model</a> that is as good as you’re going to get for the information you have.</p>
<p>To have an even better model, you need better <a href="http://bit.ly/quaesita_hist" rel="noopener ugc nofollow" target="_blank">data</a>. In other words, more data (quantity) or <em>more relevant</em> data (quality).</p>
<p>When I say as <em>good</em> as you’re going to get, I mean in terms of <a href="http://bit.ly/quaesita_msefav" rel="noopener ugc nofollow" target="_blank">MSE</a> performance on data your model hasn’t seen before. (It’s supposed to <a href="http://bit.ly/quaesita_parrot" rel="noopener ugc nofollow" target="_blank"><em>pre</em>dict</a>, not <em>post</em>dict.) You’ve done a perfect job of getting what you can from the information you have — the rest is error you can’t do anything about with your information.</p>
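<p>To make “MSE performance on data your model hasn’t seen before” concrete, here’s a minimal sketch. (The synthetic data, the polynomial degrees, and the scikit-learn workflow are my own illustration, not something prescribed in this article.) It holds out a test set and compares training MSE to test MSE: an overfit model posts a flattering training score and a poor test score, and even the best model can’t beat the noise floor baked into the data.</p>
<pre>
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_squared_error

# Hypothetical noisy data (an assumption for illustration):
# the noise term is the error no model can remove.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=200)

# Hold out half the data so we can measure pre-diction, not post-diction.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0
)

# Low degree tends to underfit; very high degree tends to overfit.
for degree in (1, 4, 15):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_train, y_train)
    train_mse = mean_squared_error(y_train, model.predict(X_train))
    test_mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"degree={degree:2d}  train MSE={train_mse:.3f}  test MSE={test_mse:.3f}")
</pre>
<p>The number to watch is the test MSE: the training MSE keeps shrinking as the model gets more flexible, but past a point the test MSE climbs back up, and no amount of flexibility pushes it below the variance of the noise.</p>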