# Is the error linear in terms of $\lambda$ in ridge regression?

by AspiringMat   Last Updated January 12, 2018 21:19 PM

I have a data set of $(X_n, t_n)$ that I am doing a ridge regression on.

I am also doing $10$-fold cross-validation to fine-tune the value of $\lambda$, sweeping it from $0$ to $4.0$ in increments of $0.1$. However, after the cross-validation I plotted $\lambda$ against the ridge error function $\sum_{n=1}^N(t_n-w^TX_n)^2 + \frac{\lambda}{2}\|w\|^2$, and the resulting curve looks essentially linear in $\lambda$ (plot not shown).

I also made a scatter plot of the data ($X_n \in \mathbb{R}^2$, so each $(X_n, t_n)$ is a point in $\mathbb{R}^3$), and the points looked VERY close to all lying on a single plane.

Could it really be that this is how the error varies with $\lambda$, or is there something wrong in my implementation?
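For what it's worth, here is a minimal sketch of the setup I have in mind, using synthetic near-planar data (the data, helper names, and noise level are my own assumptions, not from my actual data set). It fits the closed-form ridge solution for the objective above, $w = (X^TX + \frac{\lambda}{2}I)^{-1}X^Tt$, over the same $\lambda$ grid and evaluates the penalized error:

```python
import numpy as np

# Hypothetical stand-in data: points in R^2 whose targets lie almost
# exactly on a plane, mimicking the near-planar scatter described above.
rng = np.random.default_rng(0)
N = 100
X = rng.normal(size=(N, 2))
w_true = np.array([1.5, -2.0])          # assumed "true" plane coefficients
t = X @ w_true + 0.01 * rng.normal(size=N)  # tiny noise -> near-planar data

def ridge_fit(X, t, lam):
    """Closed-form minimizer of sum (t_n - w^T x_n)^2 + (lam/2)||w||^2,
    i.e. w = (X^T X + (lam/2) I)^{-1} X^T t."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + 0.5 * lam * np.eye(d), X.T @ t)

def ridge_objective(X, t, w, lam):
    """The penalized error plotted in the question."""
    resid = t - X @ w
    return resid @ resid + 0.5 * lam * (w @ w)

lams = np.arange(0.0, 4.01, 0.1)
errors = [ridge_objective(X, t, ridge_fit(X, t, lam), lam) for lam in lams]

# When the data are near-planar and lambda is small relative to the
# eigenvalues of X^T X, the fitted w barely moves, so the objective is
# approximately (constant) + (lam/2)||w||^2 -- i.e. nearly linear in lam.
```

Under these assumptions the curve of `errors` against `lams` does come out almost perfectly linear, because the $\frac{\lambda}{2}\|w\|^2$ penalty term dominates the (tiny, nearly constant) residual term, which may be exactly what is happening here.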
