How To Minimize The Cost Function Without Gradient Descent (Using the Normal Equation) - By Anish Arya (Machine Learning and Reinforcement Learning Consultant)

Are you tired of tuning the Learning Rate (alpha) when minimizing the Cost Function via Gradient Descent?

Here’s Octave code that calculates the theta values which minimize the Cost Function using the Normal Equation method, without worrying about alpha. The Normal Equation gives theta in closed form as theta = pinv(X' * X) * X' * y, so no iterations and no learning rate are needed.

function theta = calThetaForNormalEquation(X, y)
  % Normal Equation: theta = pinv(X' * X) * X' * y (pinv also handles a singular X' * X)
  theta = pinv(X' * X) * X' * y;
end

Note: Here X is the Design Matrix of size m x (n+1), with m being the number of training examples and n being the number of attributes (the extra column is typically a column of ones for the intercept term), and y is the Column Vector of Labels of size m x 1.

The result “theta” will be a Column Vector of size (n+1) x 1.
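
For context, here is a minimal usage sketch showing how the Design Matrix can be built and passed to the function above. The feature values and variable names (such as X_raw) are illustrative, not part of the original post:

% Two training examples with one feature each (illustrative values)
X_raw = [2104; 1416];                    % m x n matrix of raw features (m = 2, n = 1)
y     = [400; 232];                      % m x 1 Column Vector of Labels
X = [ones(size(X_raw, 1), 1), X_raw];    % Design Matrix: prepend a column of ones, size m x (n+1)
theta = calThetaForNormalEquation(X, y)  % (n+1) x 1 Column Vector of fitted parameters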