# Main take-away
## Wrap-up
- Hyperparameters have an impact on a model's performance and should be
chosen wisely;
- The search for the best hyperparameters can be automated with a grid-search
approach or a randomized-search approach;
- A grid-search is expensive and does not scale when the number of
hyperparameters to optimize increases. Besides, the combinations are sampled
only on a regular grid;
- A randomized-search allows a search with a fixed budget even with an
increasing number of hyperparameters. Besides, the combinations are sampled
on a non-regular grid.
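The contrast between the two strategies can be sketched with scikit-learn's
`GridSearchCV` and `RandomizedSearchCV`. The estimator, dataset, and parameter
ranges below are illustrative choices, not ones prescribed by this module:

```python
# A minimal sketch contrasting grid-search and randomized-search.
# The model and parameter ranges are illustrative assumptions.
from scipy.stats import loguniform
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1_000)

# Grid-search: every combination on a regular grid is evaluated,
# here 3 x 2 = 6 candidates; the cost grows multiplicatively with
# each added hyperparameter.
grid_search = GridSearchCV(
    model,
    param_grid={"C": [0.01, 1.0, 100.0], "fit_intercept": [True, False]},
    cv=5,
).fit(X, y)

# Randomized-search: a fixed budget of candidates (n_iter) drawn from
# distributions, independent of how many hyperparameters are searched.
randomized_search = RandomizedSearchCV(
    model,
    param_distributions={
        "C": loguniform(1e-3, 1e3),  # non-regular sampling of C
        "fit_intercept": [True, False],
    },
    n_iter=6,
    cv=5,
    random_state=0,
).fit(X, y)

print(grid_search.best_params_)
print(randomized_search.best_params_)
```

Note that increasing the grid to, say, 5 values per hyperparameter multiplies
the number of grid-search fits, while the randomized-search budget stays at
`n_iter` candidates.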
## To go further
You can refer to the following scikit-learn examples, which are related to
the concepts covered in this module:
- [Example of a grid-search](https://scikit-learn.org/stable/auto_examples/model_selection/plot_grid_search_digits.html#sphx-glr-auto-examples-model-selection-plot-grid-search-digits-py)
- [Example of a randomized-search](https://scikit-learn.org/stable/auto_examples/model_selection/plot_randomized_search.html#sphx-glr-auto-examples-model-selection-plot-randomized-search-py)
- [Example of a nested cross-validation](https://scikit-learn.org/stable/auto_examples/model_selection/plot_nested_cross_validation_iris.html#sphx-glr-auto-examples-model-selection-plot-nested-cross-validation-iris-py)