Markov chain
Definition
We begin with a probability space $(\Omega,\mathcal{F},\mathbb{P})$. Let $I$ be a countable set, $(X_n : n \ge 0)$ be a collection of random variables taking values in $I$, $\mathbf{T}=(t_{ij}: i,j\in I)$ be a stochastic matrix, and $\lambda$ be a distribution on $I$. We call $(X_n)_{n\ge 0}$ a Markov chain with initial distribution $\lambda$ and transition matrix $\mathbf{T}$ if:

1.
${X}_{0}$ has distribution $\lambda $.

2.
For $n\ge 0$, $\mathbb{P}(X_{n+1}=i_{n+1}\mid X_0=i_0,\dots,X_n=i_n)=t_{i_n i_{n+1}}$.
That is, the next value of the chain depends only on the current value, not on any earlier values. This is often summed up in the pithy phrase, “Markov chains have no memory.”
As a special case of (2) we have that $\mathbb{P}(X_{n+1}=j\mid X_n=i)=t_{ij}$ whenever $i,j\in I$. The values $t_{ij}$ are therefore called the transition probabilities of the Markov chain.
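The definition above translates directly into a simulation: draw $X_0$ from $\lambda$, then repeatedly draw the next state from the row of $\mathbf{T}$ indexed by the current state. The following sketch (the function name, the two-state example matrix, and the state labels are illustrative choices, not part of the definition) shows this for a chain with states indexed $0,\dots,|I|-1$:

```python
import random

def simulate_markov_chain(T, lam, n_steps, seed=0):
    """Draw a sample path X_0, ..., X_{n_steps} of a Markov chain.

    T        -- transition matrix as a list of rows; row i holds t_{i0}, t_{i1}, ...
    lam      -- initial distribution, lam[i] = P(X_0 = i)
    n_steps  -- number of transitions to simulate
    """
    rng = random.Random(seed)
    states = range(len(T))
    x = rng.choices(states, weights=lam)[0]        # X_0 has distribution lambda
    path = [x]
    for _ in range(n_steps):
        # The next state depends only on the current state x,
        # via the transition probabilities in row x of T.
        x = rng.choices(states, weights=T[x])[0]
        path.append(x)
    return path

# Illustrative two-state chain: state 0 = "sunny", state 1 = "rainy".
T = [[0.9, 0.1],
     [0.5, 0.5]]
path = simulate_markov_chain(T, lam=[1.0, 0.0], n_steps=10)
```

Note that each row of `T` must sum to 1 (that is what makes $\mathbf{T}$ a stochastic matrix); `random.choices` normalizes the weights, so an invalid matrix would fail silently rather than raise an error.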
Discussion
Markov chains are arguably the simplest examples of random processes. They come in discrete-time and continuous-time versions; the discrete-time version is presented above.
Title  Markov chain 
Canonical name  MarkovChain 
Date of creation  20130322 12:37:32 
Last modified on  20130322 12:37:32 
Owner  Mathprof (13753) 
Last modified by  Mathprof (13753) 
Numerical id  5 
Author  Mathprof (13753) 
Entry type  Definition 
Classification  msc 60J10 
Related topic  HittingTime 
Related topic  MarkovChainsClassStructure 
Related topic  MemorylessRandomVariable 
Related topic  LeslieModel 
Defines  Markov chain 
Defines  transition matrix 
Defines  transition probability 