Markov Chains
Discrete Stochastic Processes
A discrete stochastic process is a system with a discrete set of states in which transitions between states occur randomly according to some probability distribution.
- Memoryless: The process is memoryless if the probability of a transition does not depend on the history of the process:

  $$\Pr[X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \dots, X_0 = i_0] = \Pr[X_{n+1} = j \mid X_n = i]$$

- Homogeneous: The process is homogeneous if the transition probability does not depend on the time step $n$:

  $$\Pr[X_{n+1} = j \mid X_n = i] = p_{ij} \quad \text{for all } n$$
Finite Markov Chains
A finite Markov chain is a memoryless, homogeneous stochastic process with a finite number of states $\{0, 1, \dots, N-1\}$. Its transition probabilities are collected in an $N \times N$ transition matrix $P$ with entries $p_{ij} = \Pr[X_{n+1} = j \mid X_n = i]$, where every row sums to $1$.
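As a quick illustration, the following Python sketch simulates a finite Markov chain; the 3-state matrix $P$ here is a made-up example, not one fixed by these notes. Each step samples the next state from the row of $P$ indexed by the current state, which is exactly the memoryless, homogeneous behavior defined above.

```python
import numpy as np

# Hypothetical 3-state transition matrix; each row sums to 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

rng = np.random.default_rng(0)

def simulate(P, state, steps):
    """Sample a trajectory; each step depends only on the current state."""
    path = [state]
    for _ in range(steps):
        # Memoryless, homogeneous step: row P[state] never changes with time.
        state = rng.choice(len(P), p=P[state])
        path.append(state)
    return path

print(simulate(P, state=0, steps=10))
```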
State Probabilities
The probability distribution of states at time step $n$ is a row vector $\pi^{(n)}$, where $\pi_i^{(n)}$ is the probability of being in state $i$ at step $n$. By the law of total probability, $\pi^{(n+1)} = \pi^{(n)} P$. Therefore,

$$\pi^{(n)} = \pi^{(0)} P^n$$

which indicates that the distribution at each time step is well-defined as long as the initial distribution $\pi^{(0)}$ is known. We denote the probability of a transition from state $i$ to state $j$ in exactly $n$ steps by $p_{ij}^{(n)} = (P^n)_{ij}$.
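Continuing the made-up matrix from the sketch above, the evolution $\pi^{(n)} = \pi^{(0)} P^n$ can be checked directly:

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

pi = np.array([1.0, 0.0, 0.0])   # pi^(0): start in state 0 with certainty
for n in range(1, 6):
    pi = pi @ P                  # pi^(n) = pi^(n-1) P
    print(n, pi)

# Equivalent closed form via a matrix power: pi^(5) = pi^(0) P^5
print(np.array([1.0, 0.0, 0.0]) @ np.linalg.matrix_power(P, 5))
```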
Stationary Distribution
A stationary distribution is a state vector $\pi$ that is unchanged by a step of the chain:

$$\pi = \pi P$$

In other words, $\pi$ is a left eigenvector of the transition matrix $P$ with eigenvalue $1$.
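One common way to compute a stationary distribution numerically is to take the eigenvector of $P^\top$ for eigenvalue $1$ and normalize it; a minimal sketch, again using the hypothetical $P$ from above:

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# Left eigenvectors of P are right eigenvectors of P.T.
vals, vecs = np.linalg.eig(P.T)
k = np.argmin(np.abs(vals - 1.0))   # index of the eigenvalue closest to 1
pi = np.real(vecs[:, k])
pi = pi / pi.sum()                  # normalize into a distribution

print(pi)        # stationary distribution
print(pi @ P)    # equals pi up to floating-point error
```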
Irreducible Markov Chains
A Markov chain is irreducible if for all pairs of states $i, j$ there exists some $n$ such that $p_{ij}^{(n)} > 0$; that is, every state is accessible from every other state.

An irreducible Markov chain has a unique stationary distribution $\pi$,

$$\pi_i = \frac{1}{\mathbb{E}[T_i]}$$

where $T_i$ is the return time of state $i$: the number of steps the chain takes to first return to state $i$ after leaving it.
In the following state diagram, State 0 and State 2 are not accessible to each other. Therefore, the Markov chain is not irreducible, and any vector of the form $(\alpha, 0, 1-\alpha)$ with $0 \le \alpha \le 1$ is a stationary distribution, so the stationary distribution is not unique.
```mermaid
stateDiagram-v2
    direction LR
    state "0" as zero
    state "1" as one
    state "2" as two
    zero --> zero : 1
    one --> zero : q
    one --> two : p
    two --> two : 1
```
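A quick numeric check of this non-uniqueness, with arbitrary values chosen for $p$ and $q$:

```python
import numpy as np

p, q = 0.7, 0.3   # arbitrary, with p + q = 1
P = np.array([[1.0, 0.0, 0.0],
              [q,   0.0, p  ],
              [0.0, 0.0, 1.0]])

for alpha in (0.0, 0.25, 0.9):
    pi = np.array([alpha, 0.0, 1.0 - alpha])
    print(alpha, np.allclose(pi @ P, pi))   # True for every alpha
```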
Aperiodic Markov Chains
The periodicity of a state $i$ is

$$d(i) = \gcd\{\, n \ge 1 : p_{ii}^{(n)} > 0 \,\}$$

that is, the greatest common divisor of the lengths of all paths from state $i$ back to itself.

A Markov chain is aperiodic if all states have periodicity $d(i) = 1$.

For every irreducible, aperiodic (ergodic) Markov chain, independently of the initial distribution $\pi^{(0)}$, the state distribution converges to the unique stationary distribution:

$$\lim_{n \to \infty} \pi^{(n)} = \pi$$
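This convergence is easy to see numerically. The hypothetical $P$ used earlier is irreducible (all states communicate) and aperiodic (every state has a self-loop), and two very different initial distributions end up at the same vector:

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

for pi in (np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])):
    for _ in range(100):
        pi = pi @ P
    print(pi)   # both starting points yield the same stationary distribution
```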
Time-reversible Markov Chains
Consider an ergodic Markov chain that has reached its stationary distribution $\pi$, and imagine running it backwards in time. The reverse transition probabilities are

$$p^*_{ij} = \Pr[X_n = j \mid X_{n+1} = i]$$

Using Bayes' theorem,

$$p^*_{ij} = \frac{\Pr[X_{n+1} = i \mid X_n = j]\,\Pr[X_n = j]}{\Pr[X_{n+1} = i]} = \frac{p_{ji}\,\pi_j}{\pi_i}$$

The chain is time-reversible if $p^*_{ij} = p_{ij}$ for all $i, j$, which is equivalent to the detailed balance condition $\pi_i\, p_{ij} = \pi_j\, p_{ji}$.
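Birth-death chains are a standard family of time-reversible chains; the following sketch (with a made-up 3-state example) verifies both detailed balance and $p^*_{ij} = p_{ij}$:

```python
import numpy as np

# A 3-state birth-death chain and its stationary distribution.
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
pi = np.array([0.25, 0.50, 0.25])

print(np.allclose(pi @ P, pi))   # True: pi is stationary

# Detailed balance: the matrix with entries pi_i * p_ij is symmetric.
F = pi[:, None] * P
print(np.allclose(F, F.T))       # True

# Reverse transition probabilities p*_ij = p_ji * pi_j / pi_i equal p_ij.
P_rev = P.T * pi[None, :] / pi[:, None]
print(np.allclose(P_rev, P))     # True: the chain is time-reversible
```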
Continuous-State Markov Chains
For a Markov chain over a continuous state space, rather than a transition matrix we have a transition probability density $p(x' \mid x)$: the density of moving to state $x'$ given current state $x$.

A stationary distribution $\pi(x)$ satisfies

$$\pi(x') = \int \pi(x)\, p(x' \mid x)\, dx$$

The reverse transition probability density is defined as

$$p^*(x \mid x') = \frac{p(x' \mid x)\, \pi(x)}{\pi(x')}$$

and, as in the discrete case, the chain is time-reversible if $p^*(x \mid x') = p(x \mid x')$, i.e., $\pi(x)\, p(x' \mid x) = \pi(x')\, p(x \mid x')$.
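As a sketch of these definitions on a concrete kernel (a hypothetical example, not one fixed by these notes): the Gaussian autoregressive chain $x' = \rho x + \varepsilon$ with $\varepsilon \sim \mathcal{N}(0, \sigma^2)$ has stationary density $\mathcal{N}(0, \sigma^2/(1-\rho^2))$ and satisfies the continuous detailed balance condition:

```python
import numpy as np

rho, sigma = 0.8, 1.0
s2 = sigma**2 / (1 - rho**2)    # stationary variance of the AR(1) chain

def normal_pdf(x, mean, var):
    return np.exp(-(x - mean) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def kernel(x_next, x):
    """Transition density p(x' | x) for x' = rho * x + Normal(0, sigma^2)."""
    return normal_pdf(x_next, rho * x, sigma**2)

def stationary(x):
    return normal_pdf(x, 0.0, s2)

# Check pi(x) p(x'|x) == pi(x') p(x|x') at randomly chosen points.
rng = np.random.default_rng(0)
x, x_next = rng.normal(size=10), rng.normal(size=10)
print(np.allclose(stationary(x) * kernel(x_next, x),
                  stationary(x_next) * kernel(x, x_next)))   # True
```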