<h1>(R-Tutorial) Boosted Regression Trees</h1>
<p>Boosted Regression Trees (BRTs from now on) are a regression methodology based on <em>Machine Learning</em>. Unlike conventional regression methods (GLMs, GAMs), BRTs combine many simple decision trees to improve predictive performance. BRTs can handle complex relationships and interactions among predictors, and they are considered a robust technique that copes well with outliers and nonlinearity. They are useful for several kinds of analyses:</p>
<ul>
<li><strong>Variable selection: </strong>identify variables with the most explanatory power.</li>
<li><strong>Assess optimal conditions.</strong></li>
<li><strong>Predict new cases.</strong></li>
<li><strong>Explore morphological relationships and population trends.</strong></li>
</ul>
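<p>As a minimal sketch of how the analyses above can be run in R, the example below fits a BRT with the <code>gbm</code> package. The dataset (<code>mtcars</code>) and all tuning values are illustrative assumptions, not settings from the tutorial:</p>
<pre><code class="language-r"># Illustrative BRT fit with the gbm package; mtcars is a stand-in dataset
# and the hyperparameters are assumptions, not the tutorial's settings.
library(gbm)

set.seed(42)
brt <- gbm(
  mpg ~ .,                     # predict mpg from all other variables
  data = mtcars,
  distribution = "gaussian",   # squared-error loss for a continuous response
  n.trees = 1000,              # number of boosting iterations (trees)
  interaction.depth = 3,       # tree complexity; allows up to 3-way interactions
  shrinkage = 0.01,            # learning rate: smaller values need more trees
  bag.fraction = 0.75,         # stochastic boosting: subsample rows each iteration
  cv.folds = 5                 # cross-validation to choose the tree count
)

# Optimal number of trees according to cross-validation
best_iter <- gbm.perf(brt, method = "cv")

# Predict new cases using the optimal number of trees
preds <- predict(brt, newdata = mtcars, n.trees = best_iter)
</code></pre>
<p>A small learning rate (<code>shrinkage</code>) combined with many trees is the usual trade-off: each tree contributes only a small correction, which tends to generalise better than a few large steps.</p>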
<p>In contrast to classical regression models, BRTs do not yield a &ldquo;single best model&rdquo;. BRTs imply a shift away from the paradigm of the <em>p-value</em> and <em>finding the best model</em>. For example, when BRTs are used for variable selection (the focus of this tutorial), we can quantify how much each predictor contributes to explaining the dependent variable. This new approach has some pros and cons that must be considered:</p>
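<p>The variable-selection idea can be sketched with the relative-influence measure reported by <code>gbm</code>: each predictor is scored by the share of the loss reduction attributable to splits on that variable. The data and settings below are assumptions for illustration only:</p>
<pre><code class="language-r"># Sketch of BRT-based variable selection with gbm (hypothetical settings;
# mtcars is used only as a stand-in dataset).
library(gbm)

set.seed(42)
brt <- gbm(mpg ~ ., data = mtcars, distribution = "gaussian",
           n.trees = 500, interaction.depth = 2, shrinkage = 0.01)

# summary.gbm returns each predictor's relative influence: the percentage
# of the loss-function reduction attributable to splits on that variable.
# Predictors with negligible rel.inf are candidates for removal.
infl <- summary(brt, plotit = FALSE)
print(infl)   # columns: var (predictor name), rel.inf (percent influence)
</code></pre>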
<p><a href="https://blog.devgenius.io/r-tutorial-boosted-regression-trees-9f243d88a921"><strong>Read More</strong></a></p>