Hyperparameter Optimization With Hyperopt — Intro & Implementation

[Hyperopt](https://github.com/hyperopt/hyperopt) is an open-source hyperparameter optimization tool that I personally use to improve my machine learning projects and have found quite easy to implement. Hyperparameter optimization is the process of identifying the best combination of hyperparameters for a machine learning model with respect to an objective function (by convention, the objective is usually framed as something to be "minimized"). To use an analogy, every machine learning model comes with various knobs and levers that we can tune until the model produces the outcome we are looking for; finding the right combination of those settings is hyperparameter optimization. Examples of such hyperparameters include the learning rate, the architecture of a neural network (e.g., the number of hidden layers), and the choice of optimizer.

[**Read More**](https://medium.com/towards-data-science/hyperparameter-optimization-with-hyperopt-intro-implementation-dfc1c54d0ba7)
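As a rough sketch of what this looks like in practice, the snippet below uses Hyperopt's `fmin` with TPE search to tune two hypothetical hyperparameters (a learning rate and a number of hidden layers). The `objective` function here is a stand-in assumption, not code from the linked article; in a real project it would train and validate a model.

```python
from hyperopt import fmin, tpe, hp, Trials, STATUS_OK

# Hypothetical objective: in practice this would train a model with the given
# hyperparameters and return a validation loss to minimize. A toy stand-in is used here.
def objective(params):
    # params is a dict sampled from the search space below,
    # e.g. {"learning_rate": 0.003, "n_hidden_layers": 2}
    validation_error = (params["learning_rate"] - 0.01) ** 2 + 0.1 * params["n_hidden_layers"]
    return {"loss": validation_error, "status": STATUS_OK}

# Search space: the "knobs and levers" to tune.
space = {
    "learning_rate": hp.loguniform("learning_rate", -7, 0),      # roughly 1e-3 to 1
    "n_hidden_layers": hp.choice("n_hidden_layers", [1, 2, 3]),  # discrete choice
}

trials = Trials()
best = fmin(
    fn=objective,
    space=space,
    algo=tpe.suggest,   # Tree-structured Parzen Estimator search
    max_evals=50,
    trials=trials,
)
print(best)  # best hyperparameters found (hp.choice values are reported as indices)
```

The full article walks through a more complete workflow; this sketch only shows the core loop of defining a search space, an objective, and calling `fmin`.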
Tags: optimization