The Linear Regression Cost Function
In any optimization problem, you need a measure that lets you compare one candidate solution with another. This measure is typically called a cost function.
In the case of linear regression, the cost function is the sum of the squares of the residuals, a residual being the difference between an observed value of the dependent variable and the value predicted by the model. The fitting procedure finds the intercept and slope(s) that minimize this cost function, as written out below.
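Written out for the single-predictor case (the notation here is ours, since the text introduces no symbols): with $y_i$ the observed values, $x_i$ the inputs, and $b_0$, $b_1$ the candidate intercept and slope, the cost function over $n$ data points is

$$J(b_0, b_1) = \sum_{i=1}^{n} \bigl(y_i - (b_0 + b_1 x_i)\bigr)^2.$$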
This is why linear regression is also called “least-squares regression”.
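As a minimal sketch of the idea, the following Python snippet (using NumPy, an assumption since the text names no library; the data and variable names are invented for illustration) evaluates the sum-of-squares cost for candidate lines and computes the intercept and slope via the standard closed-form least-squares formulas:

```python
import numpy as np

# Synthetic data for illustration: y is roughly 2x + 1 plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=1.0, size=x.size)

def cost(intercept, slope):
    """Sum of squared residuals for a candidate line."""
    residuals = y - (intercept + slope * x)
    return np.sum(residuals ** 2)

# Closed-form least-squares solution for a single predictor.
slope = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
intercept = y.mean() - slope * x.mean()

print(f"fitted line: intercept={intercept:.3f}, slope={slope:.3f}")
print(f"cost at fitted line:     {cost(intercept, slope):.2f}")
print(f"cost at an arbitrary line: {cost(0.0, 1.0):.2f}")  # any other line costs more
```

Running this shows that the fitted intercept and slope yield a strictly lower cost than any other line, which is exactly what “minimizing the sum of squared residuals” means in practice.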