1. Definition
Let $\{X_n, n = 0, 1, 2, \ldots\}$ be a stochastic process taking on a finite or countable number of values. $\{X_n\}$ is a discrete-time Markov chain (DTMC) if it has the Markov property: given the present, the future is independent of the past:

$P\{X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \cdots, X_1 = i_1, X_0 = i_0\} = P\{X_{n+1} = j \mid X_n = i\}$
We define the one-step transition probability $p_{ij} = P\{X_{n+1} = j \mid X_n = i\}$. Since the chain has stationary transition probabilities, $p_{ij}$ does not depend on $n$. Each row of the transition matrix is a probability distribution, so $\sum_j p_{ij} = 1$.
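As a minimal sketch of the definition above, the transition probabilities can be stored as a matrix whose rows sum to 1. The 3-state matrix here is a made-up example, not one from the text:

```python
import numpy as np

# Hypothetical 3-state transition matrix: entry P[i, j] = p_ij,
# the probability of moving from state i to state j in one step.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

# Each row is a probability distribution over the next state,
# so every row must sum to 1 (sum_j p_ij = 1).
assert np.allclose(P.sum(axis=1), 1.0)
```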
2. n-Step Transition Probabilities

Define the n-step transition probability $p^n_{ij} = P\{X_{n+m} = j \mid X_m = i\}$. The Chapman-Kolmogorov equations state

$p^{n+m}_{ij} = \sum_k p^m_{kj} p^n_{ik}$

Proof: condition on the state reached after the first $n$ steps and apply the Markov property:

$p^{n+m}_{ij} = \sum_k P\{X_{n+m} = j \mid X_n = k, X_0 = i\} P\{X_n = k \mid X_0 = i\} = \sum_k p^m_{kj} p^n_{ik}$
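In matrix form the Chapman-Kolmogorov equations say $P^{(n+m)} = P^{(n)} P^{(m)}$, i.e. the n-step transition matrix is the n-th power of the one-step matrix. A quick numerical check, using an arbitrary 2-state matrix chosen for illustration:

```python
import numpy as np

# Arbitrary 2-state one-step transition matrix (illustrative only).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

n, m = 2, 3
# n-step transition probabilities are entries of the n-th matrix power.
Pn = np.linalg.matrix_power(P, n)
Pm = np.linalg.matrix_power(P, m)
Pnm = np.linalg.matrix_power(P, n + m)

# Chapman-Kolmogorov: p^{n+m}_{ij} = sum_k p^n_{ik} p^m_{kj},
# i.e. P^{n+m} = P^n @ P^m.
assert np.allclose(Pnm, Pn @ Pm)
```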
3. Example: Coin Flips
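The body of the original example is not present, so the following is only a guessed sketch: a two-state chain tracking the outcome of repeated fair coin flips (states 0 = heads, 1 = tails), where independence of flips makes both rows of the transition matrix identical:

```python
import numpy as np

rng = np.random.default_rng(0)

# States: 0 = heads, 1 = tails. For independent fair flips the next
# state does not depend on the current one, so both rows are equal.
P = np.array([[0.5, 0.5],
              [0.5, 0.5]])

def simulate(P, start, steps, rng):
    """Sample a path of the chain by drawing each next state
    from the row of P for the current state."""
    state = start
    path = [state]
    for _ in range(steps):
        state = rng.choice(len(P), p=P[state])
        path.append(state)
    return path

path = simulate(P, start=0, steps=10_000, rng=rng)
frac_heads = path.count(0) / len(path)
print(frac_heads)  # close to 0.5 for a fair coin
```

The `simulate` helper is generic: it works for any row-stochastic matrix, not just this degenerate i.i.d. case.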
4. Limiting Probabilities
Theorem: For an irreducible, ergodic Markov chain, $\pi_j = \lim_{n \to \infty} p^n_{ij}$ exists and is independent of the starting state $i$. Furthermore, $\pi = (\pi_0, \pi_1, \ldots)$ is the unique nonnegative solution of $\pi_j = \sum_i \pi_i p_{ij}$ and $\sum_j \pi_j = 1$.
Two interpretations for $\pi_i$:
- The probability of being in state i a long time into the future (large n)
- The long-run fraction of time in state i
Note:
- If the Markov chain is irreducible and ergodic, then interpretations 1 and 2 are equivalent.
- Otherwise, $\pi$ is still the solution to $\pi_j = \sum_i \pi_i p_{ij}$, $\sum_j \pi_j = 1$, but only interpretation 2 is valid.
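The theorem's system $\pi_j = \sum_i \pi_i p_{ij}$, $\sum_j \pi_j = 1$ can be solved directly, and interpretation 1 checked by raising $P$ to a large power. A sketch with an arbitrary 2-state chain (the matrix is an assumption, not from the text):

```python
import numpy as np

# Illustrative irreducible, ergodic 2-state chain.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Solve pi = pi P together with sum(pi) = 1: stack the transposed
# balance equations (P - I)^T pi = 0 with the normalization row.
A = np.vstack([(P - np.eye(2)).T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)  # stationary distribution, here [5/6, 1/6]

# Interpretation 1: every row of P^n converges to pi for large n,
# regardless of the starting state.
Pn = np.linalg.matrix_power(P, 100)
assert np.allclose(Pn, np.tile(pi, (2, 1)), atol=1e-8)
```

For this matrix the balance equation $0.1\,\pi_0 = 0.5\,\pi_1$ with $\pi_0 + \pi_1 = 1$ gives $\pi = (5/6, 1/6)$, which both computations reproduce.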