Markov chain
Definition
We begin with a probability space $(\Omega,\mathcal{F},\mathbb{P})$. Let $S$ be a countable set, $(X_n)_{n\geq 0}$ be a collection of random variables taking values in $S$, $P=(p_{ij})_{i,j\in S}$ be a stochastic matrix, and $\lambda$ be a distribution on $S$. We call $(X_n)_{n\geq 0}$ a Markov chain with initial distribution $\lambda$ and transition matrix $P$ if:

1. $X_0$ has distribution $\lambda$.

2. For $n\geq 0$, $\mathbb{P}(X_{n+1}=i_{n+1}\mid X_0=i_0,\ldots,X_n=i_n)=p_{i_n i_{n+1}}$ for all $i_0,\ldots,i_{n+1}\in S$ such that $\mathbb{P}(X_0=i_0,\ldots,X_n=i_n)>0$.
That is, the next value of the chain depends only on the current value, not on any previous values. This is often summed up in the pithy phrase, “Markov chains have no memory.”
As a special case of (2) we have that $\mathbb{P}(X_{n+1}=j\mid X_n=i)=p_{ij}$ whenever $\mathbb{P}(X_n=i)>0$. The values $p_{ij}$ are therefore called transition probabilities for the Markov chain.
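For example, take $S=\{1,2\}$ and

$$P=\begin{pmatrix}1-\alpha & \alpha\\ \beta & 1-\beta\end{pmatrix},\qquad \alpha,\beta\in[0,1].$$

Condition (2) then says that, wherever the chain has been previously, from state $1$ it moves to state $2$ with probability $\alpha$ (and stays put with probability $1-\alpha$), while from state $2$ it moves to state $1$ with probability $\beta$. Each row of $P$ sums to $1$, which is precisely the requirement that $P$ be stochastic.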
Discussion
Markov chains are arguably the simplest examples of random processes. They come in discrete-time and continuous-time versions; the discrete-time version is presented above.
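To make the discrete-time case concrete, the following is a minimal simulation sketch, assuming the states are labelled $0,\dots,k-1$ and using NumPy; the function name and the two-state matrix at the end are purely illustrative.

```python
import numpy as np

def simulate_chain(P, lam, n_steps, rng=None):
    """Sample a path X_0, ..., X_{n_steps} of a Markov chain.

    P        : k-by-k stochastic matrix; P[i, j] is the probability of
               moving from state i to state j in one step.
    lam      : length-k initial distribution of X_0.
    n_steps  : number of transitions to simulate.
    """
    rng = np.random.default_rng() if rng is None else rng
    P = np.asarray(P, dtype=float)
    k = P.shape[0]
    path = np.empty(n_steps + 1, dtype=int)
    path[0] = rng.choice(k, p=lam)          # X_0 has distribution lam
    for n in range(n_steps):
        # The next state is drawn from row path[n] of P: it depends only
        # on the current state, not on the earlier history of the chain.
        path[n + 1] = rng.choice(k, p=P[path[n]])
    return path

# Illustrative two-state chain: from state 0 stay with probability 0.9,
# from state 1 move to state 0 with probability 0.5.
P = [[0.9, 0.1],
     [0.5, 0.5]]
lam = [1.0, 0.0]                            # start in state 0
print(simulate_chain(P, lam, n_steps=10))
```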
Title | Markov chain |
Canonical name | MarkovChain |
Date of creation | 2013-03-22 12:37:32 |
Last modified on | 2013-03-22 12:37:32 |
Owner | Mathprof (13753) |
Last modified by | Mathprof (13753) |
Numerical id | 5 |
Author | Mathprof (13753) |
Entry type | Definition |
Classification | msc 60J10 |
Related topic | HittingTime |
Related topic | MarkovChainsClassStructure |
Related topic | MemorylessRandomVariable |
Related topic | LeslieModel |
Defines | Markov chain |
Defines | transition matrix |
Defines | transition probability |