Describe Markov chains?

ref: https://towardsdatascience.com/introduction-to-markov-chains-50da3645a50d

A Markov chain is a stochastic process that satisfies the Markov property: given the present state, the past is irrelevant for predicting the future.
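Formally, for a discrete-time chain with states X_0, X_1, X_2, ..., the Markov property reads:

P(X_{n+1} = x | X_n = x_n, X_{n-1} = x_{n-1}, ..., X_0 = x_0) = P(X_{n+1} = x | X_n = x_n)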

In other words, when applying the law of total probability and conditioning on previous events, you can drop everything that happened in the past as long as you know the present state. This reasoning is useful in many applications such as random walks, queueing theory, and state-based transition models used in robotic systems; a small simulation sketch follows below.
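As an illustration, here is a minimal sketch of simulating a Markov chain from a transition matrix. The two-state "weather" chain and its probabilities are made-up toy numbers, not anything from the reference above:

```python
import numpy as np

# Hypothetical two-state chain: 0 = "sunny", 1 = "rainy" (toy numbers).
# Row i holds P(next state = j | current state = i); each row sums to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

def simulate(P, start, n_steps, rng=np.random.default_rng(0)):
    """Generate a trajectory; the next state depends only on the current one."""
    states = [start]
    for _ in range(n_steps):
        current = states[-1]
        states.append(rng.choice(len(P), p=P[current]))
    return states

print(simulate(P, start=0, n_steps=10))
```

Note that the sampling step only ever looks at `states[-1]`, which is exactly the Markov property in code: the whole history is available, but none of it beyond the current state is used.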

I talked to a girl the other day about this. She said that one of the most practical real-life applications is in relationships: when you meet a possible soul-mate (girlfriend/boyfriend), the only thing you should keep in mind when making decisions about the future is the present, since getting into the other person's past (or your own) can sometimes be painful for both! (Then again, we're both young and foolish, so we have a bias.)

A more concrete example is the memoryless property, for the casino fans out there who like to play roulette. No matter how many times the color red has appeared in a row, red and black remain equally likely on the next spin, assuming the mechanical properties of the wheel are fair and the wheel turner induces no rigged, predictable forces. Ignoring this memoryless property, people are inclined to keep betting on red after a streak, thinking they are more likely to win.
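To make the point concrete, here is a quick simulation sketch under the same assumptions as above (a fair wheel with red and black equally likely, the green zero ignored); it estimates the chance of red given that the previous five spins were all red:

```python
import numpy as np

rng = np.random.default_rng(42)
spins = rng.integers(0, 2, size=1_000_000)  # 1 = red, 0 = black (fair wheel, no zero)

# Empirical probability of red given the previous 5 spins were all red.
streak = 5
after_streak = [spins[i] for i in range(streak, len(spins))
                if spins[i - streak:i].all()]
print(np.mean(after_streak))  # ~0.5: past spins carry no information
```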