Main take-away#
Wrap-up#
Hyperparameters have an impact on the models' performance and should be chosen wisely;
The search for the best hyperparameters can be automated with a grid-search approach or a randomized search approach;
A grid-search can be computationally expensive and becomes less attractive as the number of hyperparameters to explore increases. Moreover, the combinations are sampled on a fixed, regular grid.
A randomized-search allows exploring within a fixed budget, even as the number of hyperparameters increases. In this case, combinations can be sampled either on a regular grid or from a given distribution.
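The two strategies above can be sketched with scikit-learn's `GridSearchCV` and `RandomizedSearchCV`. The estimator, parameter ranges, and budget below are illustrative choices, not prescribed by this module:

```python
# Minimal sketch comparing grid-search and randomized-search on a toy
# dataset; the SVC estimator and the parameter ranges are illustrative.
from scipy.stats import loguniform
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Grid-search: every combination on a fixed, regular grid is evaluated
# (here 3 x 3 = 9 candidates per cross-validation split).
grid_search = GridSearchCV(
    SVC(), param_grid={"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]}, cv=5
)
grid_search.fit(X, y)

# Randomized-search: candidates are drawn from distributions, so the
# budget (n_iter) stays fixed even if more hyperparameters are added.
random_search = RandomizedSearchCV(
    SVC(),
    param_distributions={
        "C": loguniform(1e-2, 1e2),
        "gamma": loguniform(1e-3, 1e1),
    },
    n_iter=9,
    cv=5,
    random_state=0,
)
random_search.fit(X, y)

print(grid_search.best_params_)
print(random_search.best_params_)
```

Note that both searches evaluate 9 candidates here, but only the randomized version keeps that budget constant when more hyperparameters are added to the search space.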
To go further#
You can refer to the following scikit-learn examples, which are related to the concepts covered in this module: