Awesome-ml-book
Ridge Regression


Last updated 4 years ago



Ridge regression addresses some of the problems of ordinary least squares by imposing a penalty on the size of the coefficients. The ridge coefficients minimize a penalized residual sum of squares:

    min_w ||Xw - y||²₂ + α||w||²₂

The complexity parameter α ≥ 0 controls the amount of shrinkage: the larger the value of α, the greater the amount of shrinkage, and thus the coefficients become more robust to collinearity.
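To illustrate the shrinkage effect, here is a small sketch (using scikit-learn and the same toy data as the example below, which is assumed purely for illustration) that fits the same data at several values of α and prints the coefficients, which shrink toward zero as the penalty grows:

```python
from sklearn import linear_model

# Toy data; only the penalty strength alpha changes between fits.
X = [[0, 0], [0, 0], [1, 1]]
y = [0, .1, 1]

for alpha in (0.01, 0.5, 10.0):
    reg = linear_model.Ridge(alpha=alpha)
    reg.fit(X, y)
    # Larger alpha -> more shrinkage -> smaller coefficients.
    print(alpha, reg.coef_)
```

The coefficient magnitudes decrease monotonically as α increases, which is the robustness-to-collinearity behaviour described above.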

As with other linear models, Ridge takes arrays X, y in its fit method and stores the coefficients w of the linear model in its coef_ member:

    >>> from sklearn import linear_model
    >>> reg = linear_model.Ridge(alpha=.5)
    >>> reg.fit([[0, 0], [0, 0], [1, 1]], [0, .1, 1])
    Ridge(alpha=0.5)
    >>> reg.coef_
    array([0.34545455, 0.34545455])
    >>> reg.intercept_
    0.13636...
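Once fitted, the model can be used through the usual estimator API; as a minimal continuation of the example above, a prediction is just X·coef_ + intercept_:

```python
from sklearn import linear_model

reg = linear_model.Ridge(alpha=0.5)
reg.fit([[0, 0], [0, 0], [1, 1]], [0, .1, 1])

# predict computes the linear combination X @ coef_ + intercept_
print(reg.predict([[1, 1]]))
```

For the data above this yields roughly 0.827, i.e. 2 × 0.34545455 + 0.13636.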