
Machine Learning week 3 quiz: Regularization

Regularization

5 questions

1.

You are training a classification model with logistic regression. Which of the following statements are true? Check all that apply. (For reference, the regularized cost function is written out after the options below.)

Adding many new features to the model helps prevent overfitting on the training set.

Introducing regularization to the model always results in equal or better performance on the training set.

Adding a new feature to the model always results in equal or better performance on the training set.

Introducing regularization to the model always results in equal or better performance on examples not in the training set.
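For reference (this block is not part of the original quiz text), the regularized cost function for logistic regression used in the course adds a penalty on the squared parameter values; by convention the bias term θ0 is not penalized:

J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\Big[\, y^{(i)}\log h_\theta(x^{(i)}) + \big(1-y^{(i)}\big)\log\big(1-h_\theta(x^{(i)})\big) \Big] + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2

Because the penalty is added on top of the training error, introducing regularization can only leave the cost on the training set equal or worse; any benefit shows up on examples outside the training set.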

2. 

Suppose you ran logistic regression twice, once with λ = 0, and once with λ = 1. One of the times, you got parameters θ = [23.4; 37.9], and the other time you got θ = [1.03; 0.28]. However, you forgot which value of λ corresponds to which value of θ. Which one do you think corresponds to λ = 1? (A small numeric sketch of this effect follows the options below.)

θ = [1.03; 0.28]

θ = [23.4; 37.9]
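To make the effect in question 2 concrete, here is a minimal sketch in plain NumPy (synthetic, linearly separable toy data; all names and parameter values are illustrative, not taken from the quiz). The run with λ = 1 typically ends up with much smaller parameters than the run with λ = 0:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lam, alpha=0.1, iters=20000):
    # Batch gradient descent on the regularized logistic regression cost.
    # Following the course convention, the bias term theta[0] is not regularized.
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        h = sigmoid(X @ theta)
        grad = (X.T @ (h - y)) / m
        grad[1:] += (lam / m) * theta[1:]
        theta -= alpha * grad
    return theta

# Hypothetical toy data: an intercept column plus two features, linearly separable.
rng = np.random.default_rng(0)
X = np.c_[np.ones(20), rng.normal(size=(20, 2))]
y = (X[:, 1] + X[:, 2] > 0).astype(float)

print("lambda = 0:", fit_logistic(X, y, lam=0.0))  # parameters free to grow large
print("lambda = 1:", fit_logistic(X, y, lam=1.0))  # non-bias parameters pulled toward zero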

3. 

Which of the following statements about regularization are true? Check all that apply. (A short sketch of the large-λ case follows the options below.)

Because logistic regression outputs values 0 ≤ hθ(x) ≤ 1, its range of output values can only be "shrunk" slightly by regularization anyway, so regularization is generally not helpful for it.

Using a very large value of λ cannot hurt the performance of your hypothesis; the only reason we do not set λ to be too large is to avoid numerical problems.

Using too large a value of λ can cause your hypothesis to overfit the data; this can be avoided by reducing λ.

Consider a classification problem. Adding regularization may cause your classifier to incorrectly classify some training examples (which it had correctly classified when not using regularization, i.e. when λ = 0).
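The claims above about a very large λ can be sanity-checked with a quick scikit-learn sketch on synthetic data (all names and values here are illustrative; scikit-learn's C parameter is roughly 1/λ). A very large λ drives the non-intercept parameters toward zero, i.e. toward underfitting rather than overfitting, and the cost on the training set itself can only get worse:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss

# Hypothetical noisy 2-D classification data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = (X[:, 0] + X[:, 1] + rng.normal(scale=1.0, size=100) > 0).astype(int)

# In scikit-learn, C plays the role of 1 / lambda.
almost_unreg = LogisticRegression(C=1e6, max_iter=1000).fit(X, y)   # lambda close to 0
heavy_reg = LogisticRegression(C=1e-4, max_iter=1000).fit(X, y)     # very large lambda

print("coefficients, lambda ~ 0:   ", almost_unreg.coef_)
print("coefficients, large lambda: ", heavy_reg.coef_)  # driven toward zero: underfitting, not overfitting
print("training log-loss, lambda ~ 0:  ", log_loss(y, almost_unreg.predict_proba(X)))
print("training log-loss, large lambda:", log_loss(y, heavy_reg.predict_proba(X)))

On data like this the heavily regularized model's training log-loss typically comes out noticeably higher, which is the behaviour the correct options describe.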

4. 

In which one of the following figures do you think the hypothesis has overfit the training set?

[Four candidate figures were shown here as the answer options; the images are not reproduced in this transcription.]

5. 

In which one of the following figures do you think the hypothesis has underfit the training set?

[Four candidate figures were shown here as the answer options; the images are not reproduced in this transcription. A small curve-fitting sketch contrasting underfitting and overfitting follows below.]
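Since the original figures for questions 4 and 5 do not survive this transcription, here is a minimal NumPy sketch (synthetic 1-D data, hypothetical polynomial degrees) of what the two situations look like numerically: an underfit straight line, a reasonable quadratic, and an overfit high-degree polynomial that chases the noise.

import numpy as np

# Hypothetical noisy quadratic data set.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 15)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(scale=0.1, size=x.size)

fits = {
    "underfit (degree 1)": np.polyfit(x, y, deg=1),    # a straight line misses the curvature (high bias)
    "reasonable (degree 2)": np.polyfit(x, y, deg=2),  # matches the underlying trend
    "overfit (degree 10)": np.polyfit(x, y, deg=10),   # wiggles through the noise (high variance)
}
for name, coeffs in fits.items():
    pred = np.polyval(coeffs, x)
    print(name, "training MSE:", np.mean((pred - y) ** 2))

The overfit curve has the lowest training error of the three but would generalize worst, which is the pattern these two questions are testing.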