Story on Linear Regression

Linear Regression fits a linear model of the form y = mx + c, where m is the coefficient (slope) and c is the intercept. It does so by minimizing the residual sum of squares between the observed targets in the dataset and the targets predicted by the linear approximation.

The linear model is fitted using the Ordinary Least Squares method. This method minimizes the sum of squared residuals, which are the differences between the observed and predicted values. The coefficients are calculated using the formula:

β = (X^T * X)^(-1) * X^T * y

where β is the vector of coefficients, X is the matrix of input features, and y is the vector of target values. The intercept c is calculated as:

c = ȳ - β * x̄

where ȳ is the mean of the target values and x̄ is the vector of means of the input features.
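The closed-form solution above can be sketched in NumPy. This is an illustrative toy example (variable names and data are made up, not from the original post): the features are centered, β is solved from the normal equations, and the intercept is recovered from the means.

```python
import numpy as np

# Toy data: y = 2x + 1 plus a little noise
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2 * x + 1 + rng.normal(0, 0.1, size=x.shape)

# Center the data, then solve beta = (X^T X)^(-1) X^T y
# (via np.linalg.solve rather than an explicit inverse, which
# is numerically safer but mathematically equivalent)
X = x.reshape(-1, 1)
x_mean, y_mean = X.mean(axis=0), y.mean()
Xc, yc = X - x_mean, y - y_mean
beta = np.linalg.solve(Xc.T @ Xc, Xc.T @ yc)

# Intercept: c = y_mean - beta . x_mean
c = y_mean - x_mean @ beta

print(beta, c)  # should be close to [2.] and 1.0
```

With the noise level used here, β recovers the true slope of 2 and c the true intercept of 1 to within a small error.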

Building a linear regression model involves understanding these concepts and applying them to your data. With this foundation, you can effectively use linear regression for predictive modeling and data analysis.

Linear regression is a supervised machine learning and statistical technique used to model the relationship between a dependent variable (output) and one or more independent variables (inputs) by fitting a linear equation to observed data.
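In practice, this supervised workflow is usually run through a library rather than the normal equations by hand. Here is a minimal sketch using scikit-learn (assuming it is installed; the house-size data is invented for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative data: house sizes in m^2 (inputs) vs. prices (outputs)
X = np.array([[50], [60], [80], [100], [120]])  # independent variable
y = np.array([150, 180, 240, 300, 360])         # dependent variable

model = LinearRegression()
model.fit(X, y)  # fits y = mx + c by ordinary least squares

print(model.coef_, model.intercept_)  # m and c
print(model.predict([[90]]))          # predicted price for a 90 m^2 house
```

`model.coef_` and `model.intercept_` correspond to m and c in the equation above, so the fitted model can be read off directly after training.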

When to use Linear Regression?


Do check out my GitHub link here for my project on Linear Regression.