# Quiz M4.03

Question

Which of the following estimators can solve linear regression problems?

a) sklearn.linear_model.LinearRegression

b) sklearn.linear_model.LogisticRegression

c) sklearn.linear_model.Ridge

*Select all answers that apply*
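Both regressors above accept a continuous target; a minimal sketch on synthetic data (the `make_regression` dataset is an illustrative assumption, not part of the quiz):

```python
# Fit two linear-model estimators on a synthetic regression target.
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Ridge

X, y = make_regression(n_samples=100, n_features=3, random_state=0)

# Both LinearRegression and Ridge accept a continuous target and expose
# coef_ / intercept_ after fitting.
linreg_score = LinearRegression().fit(X, y).score(X, y)  # R^2 on train data
ridge_score = Ridge(alpha=1.0).fit(X, y).score(X, y)
```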

Question

Regularization allows:

a) to create a model robust to outliers (samples that differ widely from other observations)

b) to reduce overfitting by forcing the weights to stay close to zero

c) to reduce underfitting by making the problem linearly separable

*Select a single answer*
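The shrinking effect of the penalty can be observed directly by comparing weight norms; a small sketch on random data (the dataset here is an assumption, chosen so that plain least squares overfits):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.RandomState(0)
X = rng.randn(20, 10)  # few samples, many features: easy to overfit
y = rng.randn(20)

ols_norm = np.linalg.norm(LinearRegression().fit(X, y).coef_)
ridge_norm = np.linalg.norm(Ridge(alpha=10.0).fit(X, y).coef_)
# The l2 penalty forces the ridge weights to stay closer to zero.
```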

Question

A ridge model is:

a) the same as linear regression with penalized weights

b) the same as logistic regression with penalized weights

c) a linear model

d) a non-linear model

*Select all answers that apply*
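One way to see the relationship between the two models: as `alpha` vanishes, ridge coincides with plain linear regression. A sketch on an illustrative random dataset (an assumption, not from the quiz):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.RandomState(0)
X = rng.randn(50, 3)
y = X @ np.array([1.0, 2.0, 3.0]) + 0.1 * rng.randn(50)

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1e-8).fit(X, y)  # vanishing penalty -> plain least squares
coefs_match = np.allclose(ols.coef_, ridge.coef_, atol=1e-4)
```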

Question

Assume that a data scientist has prepared a train/test split and plans to use
the test set for the final evaluation of a `Ridge` model. The parameter
`alpha` of the `Ridge` model:

a) is internally tuned when calling `fit` on the train set

b) should be tuned by running cross-validation on a **train set**

c) should be tuned by running cross-validation on a **test set**

d) must be a positive number

*Select all answers that apply*
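A common way to tune `alpha` is a grid search with cross-validation fitted on the train set only, keeping the test set aside for the final evaluation; a sketch (the grid values and dataset are illustrative assumptions):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_regression(n_samples=200, n_features=5, noise=5.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Cross-validation runs on the train set only; the test set is kept aside.
search = GridSearchCV(Ridge(), {"alpha": np.logspace(-3, 3, 7)}, cv=5)
search.fit(X_train, y_train)
best_alpha = search.best_params_["alpha"]
test_score = search.score(X_test, y_test)  # test set used once, at the end
```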

Question

Scaling the data before fitting a model:

a) is often useful for regularized linear models

b) is always necessary for regularized linear models

c) may speed up fitting

d) has no impact on the optimal choice of the value of a regularization parameter

*Select all answers that apply*
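Scaling is typically combined with a regularized linear model through a pipeline, so that the scaler is fitted on the same data as the model; a minimal sketch (synthetic data is an illustrative assumption):

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_regression(n_samples=100, n_features=4, random_state=0)

# StandardScaler puts all features on the same scale, so the l2 penalty in
# Ridge treats every weight comparably.
model = make_pipeline(StandardScaler(), Ridge(alpha=1.0))
score = model.fit(X, y).score(X, y)
```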

Question

The effect of increasing the regularization strength in a ridge model is to:

a) shrink all weights towards zero

b) make all weights equal

c) set a subset of the weights to exactly zero

d) constrain all the weights to be positive

*Select all answers that apply*
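The shrinkage can be checked empirically: the norm of the ridge weights decreases as `alpha` grows, yet no weight is set exactly to zero (the dataset below is an illustrative assumption):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge

X, y = make_regression(n_samples=100, n_features=5, noise=10.0, random_state=0)

norms = [np.linalg.norm(Ridge(alpha=a).fit(X, y).coef_)
         for a in (0.1, 10.0, 1000.0)]
shrinks = norms[0] > norms[1] > norms[2]  # stronger penalty, smaller weights

# Unlike an l1 penalty (lasso), the l2 penalty does not zero out coefficients.
n_zero = int(np.sum(Ridge(alpha=1000.0).fit(X, y).coef_ == 0.0))
```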

Question

By default, a `LogisticRegression` in scikit-learn applies:

a) no penalty

b) a penalty that shrinks the magnitude of the weights towards zero (also called "l2 penalty")

c) a penalty that ensures all weights are equal

*Select a single answer*
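The default can be inspected directly on the estimator; the sketch below reflects the behaviour of recent scikit-learn versions:

```python
from sklearn.linear_model import LogisticRegression

# At the time of writing, scikit-learn applies an l2 penalty to logistic
# regression unless the user disables it explicitly.
default_penalty = LogisticRegression().penalty
```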

Question

The parameter `C` in a logistic regression is:

a) similar to the parameter `alpha` in a ridge regressor

b) similar to `1 / alpha` where `alpha` is the parameter of a ridge regressor

c) not controlling the regularization

*Select a single answer*
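The inverse relationship can be checked empirically: a smaller `C` behaves like a larger `alpha`, shrinking the weights more (the dataset and values below are illustrative assumptions):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

weak = np.linalg.norm(
    LogisticRegression(C=100.0, max_iter=1000).fit(X, y).coef_)
strong = np.linalg.norm(
    LogisticRegression(C=0.01, max_iter=1000).fit(X, y).coef_)
# Decreasing C strengthens the penalty, like increasing alpha in Ridge.
```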

Question

In logistic regression, increasing the regularization strength (by
decreasing the value of `C`) makes the model:

a) more likely to overfit to the training data

b) more confident: the values returned by `predict_proba` are closer to 0 or 1

c) less complex, potentially underfitting the training data

*Select a single answer*
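The effect on confidence can be observed through `predict_proba`; a sketch on synthetic data (an illustrative assumption), comparing a weakly and a strongly regularized model:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

proba_weak = (LogisticRegression(C=100.0, max_iter=1000)
              .fit(X, y).predict_proba(X))
proba_strong = (LogisticRegression(C=0.001, max_iter=1000)
                .fit(X, y).predict_proba(X))

# Mean distance of the predicted probabilities from 0.5: with strong
# regularization (small C), the probabilities cluster near 0.5.
spread_weak = np.abs(proba_weak[:, 1] - 0.5).mean()
spread_strong = np.abs(proba_strong[:, 1] - 0.5).mean()
```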