Lecture 32: Markov chains (cont.), irreducibility, recurrence, transience, reversibility, random walk on an undirected network

Stat 110, Prof. Joe Blitzstein, Harvard University


Examples of Markov Chains

Markov chains are memoryless, in a way: the past does not inform the future; only the present counts. Recall that the future is conditionally independent of the past, given the present.

Some key concepts

  • A chain is irreducible if it is possible to get from any state to any other state (for a computational check, see the sketch after this list).
  • A state is recurrent if, starting there, the chain returns to that state with probability 1. Note that if the return probability is 1, then by the Markov property the chain returns to that state infinitely many times with probability 1.
  • Otherwise (return probability strictly less than 1), the state is transient.
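
For a finite chain, these definitions can be checked mechanically. Below is a minimal Python sketch (not from the lecture; the block-diagonal matrix is invented, in the spirit of Example 2 below) that tests irreducibility by checking that every state can reach every other state through the nonzero entries of $Q$:

```python
import numpy as np

def is_irreducible(Q):
    """Check irreducibility of a finite chain: every state must be able
    to reach every other state along transitions of positive probability."""
    n = Q.shape[0]
    A = (np.asarray(Q) > 0).astype(int)  # adjacency of possible moves
    # (I + A)^(n-1) has a positive (i, j) entry iff j is reachable
    # from i by a path of length at most n - 1.
    reach = np.linalg.matrix_power(np.eye(n, dtype=int) + A, n - 1)
    return bool((reach > 0).all())

# Invented chain with two groups of three states that never mix
# (like 1-2-3 and 4-5-6 in Example 2 below), so it is reducible.
Q = np.array([[0.0, 0.5, 0.5, 0.0, 0.0, 0.0],
              [0.5, 0.0, 0.5, 0.0, 0.0, 0.0],
              [0.5, 0.5, 0.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, 0.0, 0.5, 0.5],
              [0.0, 0.0, 0.0, 0.5, 0.0, 0.5],
              [0.0, 0.0, 0.0, 0.5, 0.5, 0.0]])
print(is_irreducible(Q))  # False
```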

Example 1

  • This Markov chain is irreducible, as it is indeed possible to go from any state to any other.
  • All of the states in this Markov chain are recurrent.

Example 2

  • In this example, the chain is reducible; notice that there are really two separate chains (1-2-3 and 4-5-6), with no transitions between them.
  • However, note that all of the states are recurrent.

And if we connected states 3 and 6...

  • This example is still not irreducible: states 4, 5 and 6 cannot reach states 1, 2 and 3.
  • But states 1, 2 and 3 are now transient, since there is no way to return to any of those states once the edge from 3 to 6 is traversed.
  • The chain would become irreducible and all states recurrent if we added yet another edge from 4 to 1.

Example 3

  • The Markov chain in this example is reducible.
  • States 1 and 2 are transient.
  • States 0 and 3 are recurrent; moreover, once you reach state 0 or 3, you cannot leave. Such states are called absorbing states.
  • In case you didn't notice, the Markov chain in this example is the Gambler's Ruin, where a player either loses all her money (state 0) or wins all the money (state 3); a small simulation sketch follows below.
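
As a quick sanity check on this absorbing behavior, here is a minimal simulation sketch, assuming the fair game ($p = 1/2$) with the four states 0 through 3 as above (the function name and trial count are just illustrative):

```python
import random

def gamblers_ruin(start, total=3, p=0.5):
    """Run one Gambler's Ruin path until absorption at 0 or `total`.
    Each round, the fortune goes up by 1 with probability p, else down by 1."""
    x = start
    while 0 < x < total:
        x += 1 if random.random() < p else -1
    return x

# Estimate P(absorbed at 3 | start at 1). For the fair game the known
# answer is start/total = 1/3.
trials = 100_000
wins = sum(gamblers_ruin(1) == 3 for _ in range(trials))
print(wins / trials)  # roughly 0.333
```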

Example 4

  • This is a periodic Markov chain (see the sketch after this list for why periodicity matters).
  • It is irreducible.
  • All states are recurrent.
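
To see why periodicity matters for long-run behavior (point 4 of the theorem below), consider an invented two-state chain that flips state deterministically at every step. It has period 2, and the marginal distribution of $X_n$ oscillates forever instead of converging (unless the chain starts at the stationary $(1/2, 1/2)$):

```python
import numpy as np

# Deterministic flip chain: period 2.
Q = np.array([[0.0, 1.0],
              [1.0, 0.0]])

t = np.array([1.0, 0.0])  # start in state 1 with certainty
for n in range(1, 5):
    t = t @ Q
    print(n, t)  # alternates between [0. 1.] and [1. 0.] forever
```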

Stationary Distributions

Recall the definition of a stationary distribution from the last lecture.

$\vec{s}$, a probability row vector (PMF), is stationary for a Markov chain with transition matrix $Q$ if $\vec{s} \, Q = \vec{s}$.

Theorems of Stationary Distributions

For any irreducible Markov chain with finitely many states:

  1. A stationary distribution $\vec{s}$ exists.
  2. It is unique.
  3. $s_i = \frac{1}{r_i}$, where $r_i$ is the expected time to return to state $i$, starting from $i$.
  4. If we also assume the chain is aperiodic (equivalently, $Q^m$ is strictly positive for some $m$), then $P(X_n = i) \rightarrow s_i$ as $n \rightarrow \infty$.

Regarding 4: if we take any probability vector $\vec{t}$, then $\vec{t} \, Q^n \rightarrow \vec{s}$ as $n \rightarrow \infty$.
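
Here is a minimal numerical illustration of that convergence, using an invented $3 \times 3$ transition matrix with all entries positive (hence irreducible and aperiodic):

```python
import numpy as np

# Invented transition matrix; each row is a PMF (rows sum to 1).
Q = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.4, 0.5]])

t = np.array([1.0, 0.0, 0.0])  # any starting PMF works
for _ in range(50):
    t = t @ Q                  # t Q^n after n iterations

print(t)          # the limiting distribution s
print(t @ Q - t)  # ~ 0, so the limit is stationary
```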

So the above theorems about stationary distributions are worthy of study, since

  • they assure existence and uniqueness of the stationary distribution under certain assumptions,
  • they capture the long-run behavior of the chain, and
  • they relate the stationary probabilities to the average number of steps needed to return to a state.

But how would we compute the stationary distribution?
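
One direct numerical answer (a sketch, not the lecture's method) is to treat $\vec{s} \, Q = \vec{s}$ as a left-eigenvector problem: $\vec{s}$ is the eigenvector of $Q^T$ with eigenvalue 1, normalized to sum to 1. The reversibility idea below often avoids even this much linear algebra.

```python
import numpy as np

def stationary(Q):
    """Stationary distribution as the left eigenvector of Q for eigenvalue 1."""
    vals, vecs = np.linalg.eig(Q.T)    # right eigenvectors of Q^T = left of Q
    k = np.argmin(np.abs(vals - 1.0))  # pick the eigenvalue closest to 1
    s = np.real(vecs[:, k])
    return s / s.sum()                 # normalize to a PMF

# Same invented chain as in the convergence demo above.
Q = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.4, 0.5]])
print(stationary(Q))  # matches the limit of t Q^n computed earlier
```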

Reversible Markov Chains

Definition: A Markov chain with transition matrix $Q = \left[ q_{ij} \right]$ is reversible if there is a probability vector $\vec{s}$ such that $s_i \, q_{ij} = s_j \, q_{ji}$ for all states $i,j$.

Theorem: Reversible transition matrices and stationary distributions

If a transition matrix is reversible with respect to $\vec{s}$, then that $\vec{s}$ is stationary. The reversibility here is with respect to time, so such chains are also called time reversible.

For intuition, imagine a video of some particle changing states. If you ran that video backwards and showed it to someone, and that person could not tell whether the action was moving forwards or backwards, that would be an example of time reversibility.

Proof

Let $s_i \, q_{ij} = s_j \, q_{ji}$ for all $i,j$; show that $\vec{s} \, Q = \vec{s}$.

\begin{align} \sum_i s_i \, q_{ij} &= \sum_i s_j \, q_{ji} \\ &= s_j \sum_i q_{ji} \\ &= s_j, &\text{since each row of } Q \text{ sums to } 1 \end{align}

But $\sum_i s_i \, q_{ij}$ is just the $j$th entry of $\vec{s} \, Q$, by the definition of matrix multiplication, so $\vec{s} \, Q = \vec{s}$.
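
For a concrete instance of the theorem, here is a sketch with an invented birth-death chain (one that only steps between adjacent states, a classic reversible example). The detailed-balance equations can be solved one state at a time, and the resulting $\vec{s}$ is automatically stationary:

```python
import numpy as np

# Invented 3-state birth-death chain; rows sum to 1.
Q = np.array([[0.6, 0.4, 0.0],
              [0.3, 0.4, 0.3],
              [0.0, 0.5, 0.5]])

# Solve detailed balance s_i q_{i,i+1} = s_{i+1} q_{i+1,i} up the chain,
# then normalize to get a PMF.
s = np.ones(3)
for i in range(2):
    s[i + 1] = s[i] * Q[i, i + 1] / Q[i + 1, i]
s /= s.sum()

print(s)          # [0.319..., 0.425..., 0.255...]
print(s @ Q - s)  # ~ 0: s is stationary, as the theorem says
```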

Example of reversible Markov chain

A random walk on an undirected network is an example of a reversible Markov chain.

In the diagram above, the nodes 1 through 4 are joined in an undirected graph. The degree of each node $d_i$ is the number of edges emanating from said node, so $d_1=2, d_2=2, d_3=3, d_4=1$.

If $Q$ is the transition matrix of the random walk on the graph above (at each step, move to a uniformly random neighbor of the current node), then $d_i \, q_{ij} = d_j \, q_{ji}$ for all $i,j$.

Proof

Let $i \ne j$.

Then $q_{ij}, q_{ji}$ are either both 0 or both non-zero. The key is that we are talking about an undirected graph, and all edges are two-way streets.

If there is an edge joining $i$ and $j$, then $q_{ij} = \frac{1}{d_i}$ and $q_{ji} = \frac{1}{d_j}$, so $d_i \, q_{ij} = 1 = d_j \, q_{ji}$. If there is no edge, both sides are 0.

So in a graph with $M$ nodes $1, 2, \dots , M$, where node $i$ has degree $d_i$, the vector $\vec{s}$ with $s_i = \frac{d_i}{\sum_{j} d_j}$ (the degrees, normalized to sum to 1) is stationary.
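
As a numerical check of this formula, here is a sketch. The diagram's exact edges are not reproduced in these notes, so the edge set below is an assumption chosen to be consistent with the stated degrees $d_1 = 2, d_2 = 2, d_3 = 3, d_4 = 1$:

```python
import numpy as np

edges = [(1, 2), (1, 3), (2, 3), (3, 4)]  # assumed, matching the degrees
M = 4

A = np.zeros((M, M))
for i, j in edges:
    A[i - 1, j - 1] = A[j - 1, i - 1] = 1  # undirected: edges go both ways

d = A.sum(axis=1)   # degrees: [2. 2. 3. 1.]
Q = A / d[:, None]  # q_ij = 1/d_i for each edge at node i

s = d / d.sum()     # claimed stationary distribution, d_i / sum_j d_j
print(s)            # [0.25, 0.25, 0.375, 0.125]
print(s @ Q - s)    # ~ 0: s Q = s
```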