January 29, 2013

## Markov Chain

In mathematics, a Markov chain [mahr-kahv], named after the Russian mathematician Andrey Markov (1856–1922), is a discrete (finite or countable) random process with the Markov property (the memoryless property of a stochastic [random] process). A discrete random process means a system that can be in various states and that changes randomly in discrete steps. It can be helpful to think of the system as evolving through discrete steps in time, although strictly speaking the ‘step’ may have nothing to do with time.
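As a minimal sketch of such a system, here is a hypothetical two-state weather chain in Python. The states (`"sunny"`, `"rainy"`) and the transition probabilities are invented for illustration; the point is that each step depends only on the current state.

```python
import random

# Hypothetical two-state Markov chain: transitions[state] maps each
# possible next state to its probability, given the current state.
transitions = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state):
    """Pick the next state at random, using only the current state."""
    states = list(transitions[state])
    weights = [transitions[state][s] for s in states]
    return random.choices(states, weights=weights)[0]

def simulate(start, n_steps):
    """Run the chain for n_steps discrete steps and return the trajectory."""
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 5))
```

Note that `step` receives nothing but the present state: no history is stored or consulted, which is exactly the memorylessness described above.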

A stochastic process has the Markov property if the conditional probability distribution of future states of the process depends only upon the present state, not on the sequence of events that preceded it. (Given two jointly distributed random variables X and Y, the conditional probability distribution of Y given X is the probability distribution of Y when X is known to be a particular value.) The term ‘Markov assumption’ is used to describe a model where the Markov property is assumed to hold, such as a hidden Markov model (in which the system being modeled is assumed to be a Markov process with unobserved [hidden] states). 
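The Markov property can also be checked empirically: in a simulated chain, the estimated distribution of the next state given the current state should be the same no matter what the previous state was. The sketch below does this for a hypothetical two-state chain (states and probabilities are invented for illustration).

```python
import random
from collections import Counter

random.seed(0)

# Hypothetical two-state chain: from "A" go to "A" with probability 0.7;
# from "B" go to "A" with probability 0.4.
P = {"A": {"A": 0.7, "B": 0.3}, "B": {"A": 0.4, "B": 0.6}}

def step(state):
    options = list(P[state])
    return random.choices(options, weights=[P[state][s] for s in options])[0]

# Simulate a long trajectory.
traj = ["A"]
for _ in range(200_000):
    traj.append(step(traj[-1]))

# Estimate P(next = "A" | current = "A", previous = p) separately for each
# previous state p. If the Markov property holds, both estimates should be
# close to P["A"]["A"] = 0.7: the step before last carries no information.
counts = {"A": Counter(), "B": Counter()}
for prev, cur, nxt in zip(traj, traj[1:], traj[2:]):
    if cur == "A":
        counts[prev][nxt] += 1

estimates = {}
for prev in ("A", "B"):
    total = sum(counts[prev].values())
    estimates[prev] = counts[prev]["A"] / total
    print(f"P(next=A | cur=A, prev={prev}) ≈ {estimates[prev]:.3f}")
```

Both conditional estimates converge to the same value, illustrating that once the present state is known, the earlier history adds nothing.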