Intro
Linear regression is a fundamental method in machine learning and statistics that models the relationship between a dependent variable and one or more independent variables by fitting a straight line to the data. The simplest form is represented by y = mx + b, where y is the dependent variable, x is the independent variable, m is the slope, and b is the y-intercept.
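To make the formula concrete, here is a minimal sketch in Python of the model as a function of x (the slope, intercept, and input below are made-up numbers, used only for illustration):

```python
def predict(x, m, b):
    """Predicted y for input x under the line y = m*x + b."""
    return m * x + b

# With slope m = 2 and intercept b = 1, an input of x = 3 gives y = 7.
print(predict(3, m=2, b=1))  # 7
```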
The goal of linear regression is to find a straight line that minimizes the error (the difference) between the observed data points and the predicted values. This line helps us predict the dependent variable for new, unseen data.
Concretely, fitting the model means finding the values of m (the slope, also called the weight or coefficient) and b (the intercept, also called the bias) that produce the best-fit line. The standard way to find it is the least-squares method: choose the line that minimizes the sum of the squared differences between the actual data points and the values predicted by the line.
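The sketch below shows how this could look in practice for simple (one-variable) linear regression, using the closed-form least-squares estimates for m and b. It assumes NumPy is available, and the x and y arrays are hypothetical example data, not values from this article:

```python
import numpy as np

# Hypothetical example data for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 6.2, 8.1, 9.8])

# Closed-form least-squares estimates:
#   m = sum((x - x_mean) * (y - y_mean)) / sum((x - x_mean)^2)
#   b = y_mean - m * x_mean
x_mean, y_mean = x.mean(), y.mean()
m = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean) ** 2)
b = y_mean - m * x_mean

print(f"slope m = {m:.3f}, intercept b = {b:.3f}")

# Use the fitted line to predict the dependent variable for new, unseen data.
x_new = 6.0
y_pred = m * x_new + b
print(f"prediction for x = {x_new}: {y_pred:.3f}")
```

These formulas minimize the sum of squared differences directly; libraries such as scikit-learn or NumPy's polyfit arrive at the same best-fit line for this one-variable case.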




