Maximum likelihood estimation

Maximum likelihood estimation is a method for estimating the parameters of a model: the parameter values are chosen to maximize the likelihood that the process described by the model produced the data that was actually observed.

Likelihood is loosely related to probability, but the two are not interchangeable: probability is a function of the data given a fixed parameter value, whereas likelihood is a function of the parameter given the observed data.
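The distinction can be made concrete with a binomial model. As a sketch (using the coin example below with an assumed 6 heads in 10 tosses): the same formula, read as a function of the data with the parameter fixed, is a probability distribution; read as a function of the parameter with the data fixed, it is the likelihood, and it need not sum to 1.

```python
from math import comb

def binom_pmf(k, n, p):
    # Probability of observing k heads in n tosses when P(heads) = p
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Probability: parameter fixed (p = 0.5), data varies.
# Summing over all possible outcomes gives 1.
probs = [binom_pmf(k, 10, 0.5) for k in range(11)]
print(sum(probs))

# Likelihood: data fixed (6 heads observed), parameter varies.
# These values do not sum to 1 over parameter values.
liks = {p: binom_pmf(6, 10, p) for p in (0.2, 0.4, 0.6, 0.8)}
print(liks)
```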

Maximum likelihood is a principle: the parameter value that maximizes the likelihood function is taken to be the most plausible, or "likely", value of the parameter.

For instance, suppose you have a coin with probability "p" of showing heads (the parameter). You toss it 10 times and observe 6 heads (the data). The sample proportion of heads, 6/10 = 0.6, is the maximum likelihood estimate of "p".
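The coin example above can be verified numerically. This is a minimal sketch: it maximizes the binomial log-likelihood over a grid of candidate values of p (the combinatorial term is dropped because it does not depend on p), and the maximizer matches the sample proportion 0.6.

```python
import math

def log_likelihood(p, heads=6, tosses=10):
    # Binomial log-likelihood for the observed data; the binomial
    # coefficient is omitted since it does not depend on p.
    return heads * math.log(p) + (tosses - heads) * math.log(1 - p)

# Grid search over candidate values of p in (0, 1)
grid = [i / 1000 for i in range(1, 1000)]
p_hat = max(grid, key=log_likelihood)
print(p_hat)  # 0.6, matching the sample proportion 6/10
```

In this simple case the same answer follows analytically: setting the derivative of the log-likelihood to zero gives p = heads/tosses.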

This maximum likelihood principle breaks down in many complex models (for example, the likelihood can be unbounded, or can have many local maxima). Hence, it is just one of many estimation principles in statistics.