# Main take-away
## Wrap-up
In this module, we presented decision trees in detail. We saw that they:
- are suited for both regression and classification problems;
- are non-parametric models;
- are not able to extrapolate;
- are sensitive to hyperparameter tuning.
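The inability to extrapolate deserves a concrete illustration: a tree's predictions are piecewise constant, so outside the range of the training data it simply repeats the value of the nearest leaf. The snippet below is a minimal sketch of this behavior on a toy 1D regression problem (the data and hyperparameters are arbitrary choices for illustration):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Toy 1D regression problem: the target is simply y = x on [0, 10]
rng = np.random.default_rng(0)
X_train = rng.uniform(0, 10, size=(100, 1))
y_train = X_train.ravel()

tree = DecisionTreeRegressor(max_depth=3, random_state=0)
tree.fit(X_train, y_train)

# Inside the training range, the tree roughly follows the data...
print(tree.predict([[5.0]]))

# ...but any input beyond the training range falls in the same
# outermost leaf, so the prediction is constant: no extrapolation.
print(tree.predict([[20.0]]), tree.predict([[100.0]]))
```

A linear model fitted on the same data would instead continue the trend beyond the training range; this contrast is the reason trees are a poor choice when the deployment data drifts outside the domain seen during training.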
## To go further
You can refer to the following scikit-learn examples, which illustrate the
concepts covered in this module:
- [Example of decision tree regressor](https://scikit-learn.org/stable/auto_examples/tree/plot_tree_regression.html#sphx-glr-auto-examples-tree-plot-tree-regression-py)
- [Example of decision tree classifier](https://scikit-learn.org/stable/auto_examples/tree/plot_iris_dtc.html#sphx-glr-auto-examples-tree-plot-iris-dtc-py)
- [Understanding the tree structure in scikit-learn](https://scikit-learn.org/stable/auto_examples/tree/plot_unveil_tree_structure.html#sphx-glr-auto-examples-tree-plot-unveil-tree-structure-py)
- [Post-pruning decision trees](https://scikit-learn.org/stable/auto_examples/tree/plot_cost_complexity_pruning.html#sphx-glr-auto-examples-tree-plot-cost-complexity-pruning-py)