1. You are training a classification model with logistic
regression. Which of the following statements are true? Check
all that apply.
A:Adding many new features to the model helps prevent overfitting on the training set.
B:Introducing regularization to the model always results in equal or better performance on the training set.
C:Adding a new feature to the model always results in equal or better performance on the training set.
D:Introducing regularization to the model always results in equal or better performance on examples not in the training set.
Answer: C. Adding a new feature can never hurt performance on the training set, since the optimizer can always set the new feature's weight to zero and recover the old hypothesis.
2.
Suppose you ran logistic regression twice, once with λ=0, and once with λ=1. One of the times, you got
parameters θ=[74.81, 45.05], and the other time you got
θ=[1.37, 0.51]. However, you forgot which value of
λ corresponds to which value of θ. Which one do you
think corresponds to λ=1?
Answer: θ=[1.37, 0.51]. The regularization term λ penalizes large parameters, so a larger λ shrinks θ; λ=1 therefore corresponds to the smaller vector θ=[1.37, 0.51].
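This shrinkage is easy to see numerically. Below is a minimal sketch (the dataset, learning rate, and iteration count are illustrative assumptions, not from the quiz) that trains L2-regularized logistic regression by gradient descent with λ=0 and λ=1 and compares the resulting parameter magnitudes:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, lam, alpha=0.1, iters=5000):
    """Gradient descent on the regularized logistic-regression cost."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        grad = X.T @ (sigmoid(X @ theta) - y) / m
        grad[1:] += (lam / m) * theta[1:]  # bias term theta[0] is not regularized
        theta -= alpha * grad
    return theta

# Tiny synthetic dataset (an assumption for illustration): intercept column
# plus two Gaussian features, with linearly separable labels.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), rng.normal(size=(100, 2))])
y = (X[:, 1] + X[:, 2] > 0).astype(float)

theta_0 = train(X, y, lam=0.0)  # no regularization: weights grow large
theta_1 = train(X, y, lam=1.0)  # regularized: weights stay small
print(np.linalg.norm(theta_0[1:]), np.linalg.norm(theta_1[1:]))
```

The norm of the weights trained with λ=1 comes out much smaller than with λ=0, mirroring the [74.81, 45.05] vs [1.37, 0.51] pattern in the question.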
3.
Which of the following statements about regularization are
true? Check all that apply.
A:Because logistic regression outputs values 0≤hθ(x)≤1, its range of output values can only be "shrunk" slightly by regularization anyway, so regularization is generally not helpful for it.
B:Using a very large value of λ cannot hurt the performance of your hypothesis; the only reason we do not set λ to be too large is to avoid numerical problems.
C:Using too large a value of λ can cause your hypothesis to overfit the data; this can be avoided by reducing λ.
D:Consider a classification problem. Adding regularization may cause your classifier to incorrectly classify some training examples (which it had correctly classified when not using regularization, i.e. when λ=0).
Answer: D. Regularization shrinks the parameters toward zero, which can move the decision boundary enough to misclassify some training examples that were classified correctly at λ=0; it trades training accuracy for better generalization.
4.
In which one of the following figures do you think the hypothesis has overfit the training set?
Overfitting means the model uses too many features or too complex a hypothesis, so the fitted curve twists to pass through (nearly) every training point; pick the figure whose curve contorts to hit each point individually.
5.
In which one of the following figures do you think the hypothesis has underfit the training set?
Underfitting means the hypothesis is too simple, so the fitted curve passes near only a few of the training points, e.g. a straight line through clearly nonlinear data.