Linear Regression and Regularized Linear Regression

Why square the residuals instead of simply adding them up?

  1. If the residual error for one point is -1 and for another it is +1, simply adding them gives a total error of 0, which would suggest the line fits the points perfectly. This is not true.
  2. Squaring gives larger errors more weight than smaller ones, so intuitively the model weights update more aggressively to reduce large errors than small ones.
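A tiny numerical sketch of both points (the residual values here are made up purely for illustration):

```python
# Point 1: residuals of -1 and +1 cancel when simply added.
residuals = [-1, 1]
print(sum(residuals))                    # 0, wrongly suggesting a perfect fit
print(sum(r ** 2 for r in residuals))    # 2, the errors no longer cancel

# Point 2: squaring weights a large error far more than small ones.
residuals = [-1, 1, 3]
print([r ** 2 for r in residuals])       # [1, 1, 9]; the error of 3 dominates
```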
Linear regression relies on a few key assumptions:

  1. The independent variables have a linear relationship with the dependent variable.
  2. The variance of the dependent variable is uniform across all combinations of the independent variables (homoscedasticity).
  3. The error terms e, the differences between the observed values Y and the predicted values Y’, are independent and identically distributed (a quick check of these assumptions is sketched below).
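A minimal sketch of checking these assumptions, using a small synthetic dataset and scikit-learn (both chosen here purely for illustration), is to fit a line and inspect the residuals:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic data: one feature with a roughly linear relationship to y.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 1.0, size=200)

model = LinearRegression().fit(X, y)
residuals = y - model.predict(X)

# Linearity: residuals should scatter around 0 with no systematic pattern.
print("mean residual:", residuals.mean())

# Uniform variance (homoscedasticity): the spread of the residuals should be
# roughly the same in different regions of X.
print("residual std (low X): ", residuals[X[:, 0] < 5].std())
print("residual std (high X):", residuals[X[:, 0] >= 5].std())
```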

Linear Relationship

The variance of the dependent variable is uniform across all combinations of Xs

The error term e associated with Y and Y’ is independent and identically distributed

Evaluating a Model
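A linear model is commonly evaluated with metrics such as R² and mean squared error on held-out data. A minimal sketch (again using a synthetic dataset chosen only to make the example runnable) might look like this:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Synthetic data, chosen only for illustration.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 1.0, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_train, y_train)
y_pred = model.predict(X_test)

print("R^2:", r2_score(y_test, y_pred))           # closer to 1 is better
print("MSE:", mean_squared_error(y_test, y_pred)) # closer to 0 is better
```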

Motivation

Least Absolute Shrinkage and Selector Operator (LASSO)
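As a minimal sketch of LASSO (L1-regularized) regression with scikit-learn, where the synthetic data and the alpha value are arbitrary choices for illustration:

```python
import numpy as np
from sklearn.linear_model import Lasso

# Synthetic data: five features, only the first two actually matter.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 4.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(0, 0.5, size=200)

# The L1 penalty can shrink some coefficients exactly to zero,
# effectively performing feature selection.
lasso = Lasso(alpha=0.1).fit(X, y)
print(lasso.coef_)
```

Larger values of alpha shrink more coefficients to exactly zero; smaller values approach ordinary least squares.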

Ridge
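A comparable sketch for Ridge (L2-regularized) regression, under the same illustrative assumptions:

```python
import numpy as np
from sklearn.linear_model import Ridge

# Same synthetic setup as the LASSO sketch above.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 4.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(0, 0.5, size=200)

# The L2 penalty shrinks coefficients toward zero but rarely to exactly zero,
# so all features are kept while their influence is dampened.
ridge = Ridge(alpha=1.0).fit(X, y)
print(ridge.coef_)
```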

Analysis

Conclusion
