Ridge Regression (L2 Penalty)


Definition and Ideas:

  • In ridge regression, the cost function is altered by adding a penalty equivalent to the square of the magnitude of the coefficients:

    $$\min_{\beta}\ \sum_{i=1}^{n}\Big(y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j\Big)^2 + \lambda \sum_{j=1}^{p} \beta_j^2$$

    • This is equivalent to minimizing the cost function in the equation above under the condition that $\sum_{j=1}^{p} \beta_j^2 \le c$ for some $c > 0$.
  • Ridge regression puts a constraint on the coefficients ($\beta_j$). The penalty term ($\lambda \sum_{j} \beta_j^2$) regularizes the coefficients so that the optimization function is penalized whenever the coefficients take large values.

  • We take the correlation matrix of the (standardized) predictors and add a constant $\epsilon > 0$ to its diagonal, so each diagonal entry $1$ becomes $1+\epsilon$ (see the numerical sketch below):

$$\begin{pmatrix} 1+\epsilon & & & \\ & 1+\epsilon & & \\ & & 1+\epsilon & \\ & & & 1+\epsilon \end{pmatrix}$$
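Adding a constant to the diagonal corresponds to the standard closed-form ridge solution $\hat{\beta} = (X^\top X + \lambda I)^{-1} X^\top y$. Below is a minimal NumPy sketch of that solve; the function name, toy data, and choice of $\lambda$ are illustrative assumptions, not from the original notes.

```python
import numpy as np

def ridge_closed_form(X, y, lam):
    """Solve (X^T X + lam * I) beta = X^T y for beta."""
    p = X.shape[1]
    A = X.T @ X + lam * np.eye(p)   # add the constant to the diagonal
    return np.linalg.solve(A, X.T @ y)

# toy illustration with made-up data
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 4))
beta_true = np.array([2.0, 0.0, -1.0, 0.5])
y = X @ beta_true + 0.1 * rng.standard_normal(100)
print(ridge_closed_form(X, y, lam=1.0))
```

Because $\lambda > 0$ makes $X^\top X + \lambda I$ strictly positive definite, the solve succeeds even when the predictors are collinear, which is one practical motivation for the penalty.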

Issues with Ridge Regression:

  • While ridge regression can achieve better prediction error than plain linear regression, it works best when there is a subset of true coefficients that are small or zero.

  • It never sets coefficients to zero exactly, and therefore cannot perform variable selection in the linear model (see the sketch after this list).
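To see the "no exact zeros" behaviour, one can fit ridge over a widening grid of penalties and watch every coefficient shrink toward, but never reach, zero. A small scikit-learn sketch; the data and the alpha grid are made up for illustration:

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
# only the first true coefficient is nonzero
y = 3.0 * X[:, 0] + 0.1 * rng.standard_normal(200)

for alpha in [0.01, 1.0, 100.0, 10000.0]:
    coefs = Ridge(alpha=alpha).fit(X, y).coef_
    # coefficients shrink toward zero as alpha grows, but none is exactly zero
    print(f"alpha={alpha:>8}: {np.round(coefs, 4)}")
```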


Apply Ridge Regression:

R Code:

Python Code:
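A minimal sketch of applying ridge regression with scikit-learn; the diabetes dataset, the alpha grid, and the train/test split are illustrative assumptions rather than the author's original code.

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Standardize the features so the L2 penalty treats every coefficient on
# the same scale, then choose the penalty strength by cross-validation.
model = make_pipeline(StandardScaler(), RidgeCV(alphas=np.logspace(-3, 3, 13)))
model.fit(X_train, y_train)

print("chosen alpha:", model.named_steps["ridgecv"].alpha_)
print("test R^2:", model.score(X_test, y_test))
```

Standardizing before fitting matters because the L2 penalty is not scale-invariant: without it, features measured on larger scales would be penalized less than the others.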