Markov Chain


To understand Markov chains, we need some probability theory background:

Some Probability Theory

A Famous Example

The Drunkard's Walk is a Markov chain: starting at 0 on a number line, you repeatedly step +1 or -1 with equal probability (0.5 each). The next step does not depend on how you got to the current number; the probability of +1 or -1 is always the same.
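A minimal Python sketch of the Drunkard's Walk (function name and seed are illustrative choices, not from the original):

```python
import random

def drunkards_walk(steps, seed=None):
    """Simulate a simple random walk: +1 or -1 with equal probability."""
    rng = random.Random(seed)
    position = 0
    path = [position]
    for _ in range(steps):
        # Memoryless: each step is drawn fresh, ignoring how we got here.
        position += rng.choice((1, -1))
        path.append(position)
    return path

path = drunkards_walk(10, seed=42)
print(path)
```

Every entry differs from the previous one by exactly 1, and the distribution of the next step never depends on the path so far.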

Markov Chains

A Markov chain is a set of states, with transitions between states governed by stochastic (probabilistic) rules. For a game of snakes and ladders with n states, you can build an n x n transition matrix. The process must be memoryless to create an accurate model.
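A sketch of a transition matrix in plain Python, using a small hypothetical 3-state chain rather than full snakes and ladders (the states and probabilities are made up for illustration). Row i gives the probabilities of moving from state i to each other state, so each row sums to 1:

```python
# Hypothetical 3-state transition matrix: rows = current state,
# columns = next state; each row sums to 1.
P = [
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.5, 0.3],
]

def step(dist, P):
    """Advance one step: multiply the distribution row vector by P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0, 0.0]  # start in state 0 with certainty
for _ in range(3):
    dist = step(dist, P)
print(dist)  # distribution over states after 3 steps
```

Repeatedly multiplying by the matrix gives the distribution over states after any number of steps, which is why the n x n matrix fully describes the chain.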

Formal Definition

A Markov chain is a sequence of random variables X1, X2, X3 ... with the Markov property (the probability of moving to the next state depends only on the present state and not previous states):

Pr(Xn+1 = x | X1 = x1, X2 = x2, ..., Xn = xn) = Pr(Xn+1 = x | Xn = xn)
In other words, state n+1 depends only on state n.
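The Markov property above can be sketched as a sampler: each draw of the next state consults only the current state's row of the transition matrix, never the earlier history. The 2-state matrix here is a made-up example:

```python
import random

# Hypothetical 2-state transition matrix (each row sums to 1).
P = [
    [0.9, 0.1],
    [0.5, 0.5],
]

def sample_chain(P, start, steps, seed=None):
    """Sample a trajectory X0, X1, ..., Xsteps.

    Pr(Xn+1 = x | Xn) is read from row Xn of P; the history
    X0..Xn-1 is never consulted, which is the Markov property.
    """
    rng = random.Random(seed)
    state = start
    states = [state]
    for _ in range(steps):
        state = rng.choices(range(len(P)), weights=P[state])[0]
        states.append(state)
    return states

chain = sample_chain(P, start=0, steps=20, seed=1)
print(chain)
```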