πŸ“ Exercise M6.01ΒΆ

The aim of this notebook is to investigate whether we can tune the hyperparameters of a bagging regressor and evaluate the gain obtained.

We will load the California housing dataset and split it into a training and a testing set.

from sklearn.datasets import fetch_california_housing
from sklearn.model_selection import train_test_split

data, target = fetch_california_housing(as_frame=True, return_X_y=True)
target *= 100  # rescale the target in k$
data_train, data_test, target_train, target_test = train_test_split(
    data, target, random_state=0, test_size=0.5)

Note

If you want a deeper overview regarding this dataset, you can refer to the Appendix - Datasets description section at the end of this MOOC.

Create a BaggingRegressor and provide a DecisionTreeRegressor to its parameter base_estimator. Train the regressor and evaluate its generalization performance on the testing set using the mean absolute error.

# Write your code here.
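One possible sketch for this step is shown below (not the official solution). It assumes scikit-learn >= 1.2, where the parameter is named `estimator`; in earlier releases, including the one this exercise was originally written for, it was called `base_estimator`. The exact error you obtain may differ.

from sklearn.ensemble import BaggingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.tree import DecisionTreeRegressor

# Bag decision trees with the default bagging hyperparameters.
# Note: `estimator` is the scikit-learn >= 1.2 name; older versions use `base_estimator`.
bagging = BaggingRegressor(estimator=DecisionTreeRegressor(), n_jobs=2)
bagging.fit(data_train, target_train)

# Evaluate the generalization performance on the testing set.
target_predicted = bagging.predict(data_test)
print(
    "Mean absolute error of the default bagging regressor: "
    f"{mean_absolute_error(target_test, target_predicted):.2f} k$"
)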

Now, create a RandomizedSearchCV instance using the previous model and tune the important parameters of the bagging regressor. Find the best parameters and check whether you can find a set of parameters that improves on the default regressor, still using the mean absolute error as the metric.

Tip

You can list the bagging regressor’s parameters using the get_params method.
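For instance, a minimal illustration of this tip, assuming the `bagging` model defined in the sketch above:

# List the parameter names that can be tuned; parameters of the underlying
# decision tree are prefixed with "estimator__" (or "base_estimator__" in
# older scikit-learn versions).
for name in bagging.get_params():
    print(name)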

# Write your code here.
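One possible way to set up the search is sketched below. The parameter ranges are illustrative choices, not prescribed values, and the sketch reuses the `bagging` model and the `estimator` parameter name from the previous sketch.

from scipy.stats import randint
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import RandomizedSearchCV

# Illustrative distributions for the most influential bagging parameters.
param_distributions = {
    "n_estimators": randint(10, 30),
    "max_samples": [0.5, 0.8, 1.0],
    "max_features": [0.5, 0.8, 1.0],
    "estimator__max_depth": randint(3, 10),
}
search = RandomizedSearchCV(
    bagging,
    param_distributions=param_distributions,
    scoring="neg_mean_absolute_error",
    n_iter=20,
    random_state=0,
    n_jobs=2,
)
search.fit(data_train, target_train)

# The search refits the best model on the full training set by default,
# so we can predict with it directly.
target_predicted = search.predict(data_test)
print(
    "Mean absolute error after tuning of the bagging regressor: "
    f"{mean_absolute_error(target_test, target_predicted):.2f} k$"
)
print(f"Best parameters found: {search.best_params_}")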

We see that the predictor provided by the bagging regressor requires much less hyperparameter tuning than a single decision tree.