Entropy: How Decision Trees Make Decisions
<p>You’re a Data Scientist in training. You’ve come a long way from writing your first line of Python or R code. You know your way around Scikit-Learn like the back of your hand. You spend more time on Kaggle than Facebook now. You’re no stranger to building awesome random forests and other tree-based ensemble models that get the job done. However, you’re nothing if not thorough. You want to dig deeper and understand some of the intricacies and concepts behind popular machine learning models. Well, so do I.</p>
<p>In this blog post, I will introduce the concept of entropy as a general topic in statistics. That will let me introduce information gain, and then explain why these two fundamental concepts form the basis of how decision trees build themselves from the data we supply them.</p>
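<p>Before diving in, here is a minimal sketch of the quantity at the heart of it all. The snippet below is my own illustration (not code from the article): it computes the Shannon entropy of a set of class labels, the same measure a decision tree uses to judge how "impure" a node is.</p>

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a collection of class labels."""
    total = len(labels)
    counts = Counter(labels)
    # Sum -p * log2(p) over the observed class proportions.
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A pure node (all one class) has zero entropy;
# a perfect 50/50 split has the maximum entropy of 1 bit.
print(entropy(["yes", "yes", "yes", "yes"]))  # 0.0
print(entropy(["yes", "yes", "no", "no"]))    # 1.0
```

<p>Information gain, which the post builds toward, is just the drop in this quantity when a split partitions the labels into purer subsets.</p>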