First Step in Demystifying the Support Vector Machine (SVM): Learning and Implementing the Maximal Margin Classifier (MMC)
<p>Suppose we have a binary classification problem in which the data is linearly separable. We can then define an infinite number of hyperplanes that perfectly separate the two classes. But how do we pick the optimal one?</p>
<p><img alt="" src="https://miro.medium.com/v2/resize:fit:700/0*WvCe2UMt86nKWq0H" style="height:1050px; width:700px" /></p>
<p>A peregrine falcon resting on top of a tree branch. Photo by <a href="https://unsplash.com/@delaneyvan?utm_source=medium&utm_medium=referral" rel="noopener ugc nofollow" target="_blank">Delaney Van</a>.</p>
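<p>To make the question concrete, here is the standard formulation of the maximal margin problem in LaTeX notation (the symbols below are conventional choices, not definitions carried over from the text above). A hyperplane in <em>p</em>-dimensional space is the set of points satisfying</p>
<pre>
\beta_0 + \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_p x_p = 0

% With class labels y_i in {-1, +1}, a separating hyperplane satisfies
% y_i(\beta_0 + \beta_1 x_{i1} + \dots + \beta_p x_{ip}) > 0 for every i.
% The maximal margin hyperplane is the solution of:

\max_{\beta_0, \beta_1, \dots, \beta_p, \, M} \; M
\quad \text{subject to} \quad
\sum_{j=1}^{p} \beta_j^2 = 1, \qquad
y_i\!\left(\beta_0 + \sum_{j=1}^{p} \beta_j x_{ij}\right) \ge M \;\; \text{for all } i.
</pre>
<p>With the coefficients normalized to unit length, the left-hand side of the last constraint is exactly the signed distance of observation <em>i</em> from the hyperplane, so maximizing <em>M</em> maximizes the smallest such distance, that is, the margin.</p>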
<p>The support vector machine is a generalization of a simple and intuitive classifier called the maximal margin classifier. The maximal margin classifier cannot be applied to most data sets, because it requires the classes to be separable by a linear boundary. The support vector classifier (SVC) is a more flexible extension of the maximal margin classifier that covers a broader range of cases by allowing some observations to fall on the wrong side of the margin, or even of the hyperplane. Finally, the support vector machine is a further extension of the support vector classifier that accommodates non-linear class boundaries. There are fine distinctions among these three methods, even though people often loosely and mistakenly use the names interchangeably.</p>
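<p>Since this post is about implementing the maximal margin classifier, here is a minimal sketch of the idea in Python. It is a sketch under assumptions, not the implementation from the full article: it assumes scikit-learn is available and approximates the hard-margin solution with a linear SVC whose cost parameter C is set very large, fit on a made-up toy data set.</p>
<pre>
import numpy as np
from sklearn.svm import SVC

# Toy linearly separable data (two Gaussian blobs), purely illustrative.
rng = np.random.default_rng(0)
X_pos = rng.normal(loc=[2, 2], scale=0.5, size=(20, 2))
X_neg = rng.normal(loc=[-2, -2], scale=0.5, size=(20, 2))
X = np.vstack([X_pos, X_neg])
y = np.array([1] * 20 + [-1] * 20)

# A linear SVC with a very large C approximates the hard-margin
# (maximal margin) classifier: violations of the margin are penalized
# so heavily that, on separable data, none are tolerated.
clf = SVC(kernel="linear", C=1e10)
clf.fit(X, y)

print("coefficients (beta_1, beta_2):", clf.coef_[0])
print("intercept (beta_0):", clf.intercept_[0])
print("support vectors:\n", clf.support_vectors_)
</pre>
<p>Only the observations that lie closest to the boundary, the support vectors, determine the fitted hyperplane; moving any other point would leave the solution unchanged. This is why the method is named after them.</p>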
<p><a href="https://medium.com/@manny.rooster/first-step-in-demystifying-the-support-vector-machine-svm-learning-and-implementing-the-maximal-9bd9a9b60e4d"><strong>Visit Now</strong></a></p>