Regression means finding the best-fit line or curve for your numerical data — a functional approximation of the data. In other words, you want a mapping function from your input data to the output data (the target). This mapping function is written as:
Ŷ = W*X + B
where B is the intercept, W is the slope of the line, and Ŷ is the predicted output. The optimal values of W and B need to be found to obtain the best-fit line.
Ordinary Least Squares (OLS) is an analytical solution to this linear regression model. By analytical, we mean the exact solution is obtained in closed form from a formula, rather than by iterative approximation.
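As a sketch of the closed-form OLS solution, the snippet below fits W and B for the model Ŷ = W*X + B using the normal equation. The synthetic data (true slope 3, intercept 2, and the noise level) is made up purely for illustration.

```python
import numpy as np

# Synthetic 1-D data: y = 3x + 2 plus noise (illustrative values only)
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=50)
y = 3.0 * X + 2.0 + rng.normal(0, 0.5, size=50)

# Design matrix with a column of ones so the intercept B is learned too
A = np.column_stack([X, np.ones_like(X)])

# Normal equation: solve (A^T A) [W, B] = A^T y for the exact least-squares fit
W, B = np.linalg.solve(A.T @ A, A.T @ y)
print(W, B)  # recovers values close to the true slope 3 and intercept 2
```

Solving the linear system with `np.linalg.solve` is preferred over explicitly inverting A^T A, since it is cheaper and numerically more stable.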
However, OLS is often impractical at scale: the closed-form solution requires building and solving a system involving all the data at once, which becomes expensive for very large datasets, and many models beyond linear regression have no closed-form solution at all. Therefore we need to approximate the OLS solution with an iterative method, such as gradient descent, which moves toward the optimal solution step by step.