Markov Chain Formula

Markov chains model systems that transition between states with fixed probabilities.
Learn transition matrices and steady-state formulas.

The Formula

π(n) = π(0) × Pⁿ

A Markov chain is a mathematical system that transitions from one state to another according to fixed probabilistic rules. The key property (called the Markov property) is that the future state depends only on the current state, not on the history of how it got there.

Markov chains are named after Russian mathematician Andrey Markov, who first studied them in 1906. The system is fully described by a transition matrix P, where each entry Pᵢⱼ gives the probability of moving from state i to state j. Each row of the transition matrix sums to 1.

After n steps, the state distribution is found by multiplying the initial distribution π(0) by the transition matrix P raised to the n-th power. If the chain is irreducible and aperiodic (every state can eventually reach every other, and the chain does not cycle with a fixed period), it converges to a unique steady-state distribution π satisfying πP = π, meaning the probabilities stop changing over time. This steady state is independent of the starting distribution.
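
The formula π(n) = π(0) × Pⁿ can be sketched in plain Python by applying P one step at a time. The sketch below uses the two-state weather matrix from the examples that follow; the function names are my own.

```python
def step(pi, P):
    """One transition: row vector pi times matrix P."""
    n = len(P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

def distribution_after(pi0, P, steps):
    """pi(n) = pi(0) * P^n, computed by repeated multiplication."""
    pi = list(pi0)
    for _ in range(steps):
        pi = step(pi, P)
    return pi

# Two-state weather chain (sunny, rainy); each row sums to 1.
P = [[0.7, 0.3],
     [0.4, 0.6]]
print(distribution_after([1.0, 0.0], P, 2))  # ≈ [0.61, 0.39]
```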

Variables

  • π(n): State probability distribution after n steps
  • π(0): Initial state distribution
  • P: Transition matrix (rows sum to 1)
  • Pᵢⱼ: Probability of transitioning from state i to state j
  • n: Number of steps (transitions)

Example 1

Weather follows a Markov chain: if sunny today, 70% chance sunny tomorrow and 30% chance rainy. If rainy today, 40% chance sunny and 60% chance rainy. If today is sunny, what is the weather probability for 2 days from now?

Transition matrix P = [0.7, 0.3; 0.4, 0.6], where row 1 is today sunny and row 2 is today rainy. Initial state π(0) = [1, 0] (sunny).

After 1 day: π(1) = [1, 0] × P = [0.7, 0.3]

After 2 days: π(2) = [0.7, 0.3] × P = [0.7×0.7 + 0.3×0.4, 0.7×0.3 + 0.3×0.6]

π(2) = [0.49 + 0.12, 0.21 + 0.18]

π(2) = [0.61, 0.39] — 61% chance of sunny, 39% chance of rainy
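
The two-day arithmetic above can be checked with a few lines of Python:

```python
P = [[0.7, 0.3],   # row 1: today sunny
     [0.4, 0.6]]   # row 2: today rainy
pi1 = [0.7, 0.3]   # distribution after 1 day, starting sunny
pi2 = [pi1[0] * P[0][0] + pi1[1] * P[1][0],   # P(sunny in 2 days)
       pi1[0] * P[0][1] + pi1[1] * P[1][1]]   # P(rainy in 2 days)
print([round(p, 2) for p in pi2])  # [0.61, 0.39]
```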

Example 2

Using the same weather model, find the steady-state distribution.

At steady state: πP = π and π_S + π_R = 1

π_S = 0.7π_S + 0.4π_R and π_R = 0.3π_S + 0.6π_R

From the first equation: 0.3π_S = 0.4π_R, so π_S = (4/3)π_R

Substituting into π_S + π_R = 1: (4/3)π_R + π_R = 1, so (7/3)π_R = 1

π_R = 3/7 ≈ 0.429, π_S = 4/7 ≈ 0.571. Long-term: 57.1% sunny, 42.9% rainy
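
One way to confirm this result numerically is power iteration: repeatedly apply P until the distribution stops changing. A minimal sketch (the function name and tolerance are my own choices):

```python
P = [[0.7, 0.3],
     [0.4, 0.6]]

def steady_state(pi, P, tol=1e-12):
    """Iterate pi <- pi * P until successive distributions agree."""
    while True:
        n = len(P)
        nxt = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
        if max(abs(a - b) for a, b in zip(nxt, pi)) < tol:
            return nxt
        pi = nxt

print(steady_state([1.0, 0.0], P))  # ≈ [0.571, 0.429] = [4/7, 3/7]
print(steady_state([0.0, 1.0], P))  # same limit from a rainy start
```

Both starting distributions converge to the same limit, illustrating that the steady state does not depend on π(0).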

When to Use It

Markov chains model any process where future outcomes depend only on the current state.

  • Weather prediction models
  • Google's PageRank algorithm for web search ranking
  • Stock market modeling and financial risk analysis
  • Board game probability (e.g., expected moves in Monopoly)
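
As an illustration of the PageRank idea, here is a toy power-iteration sketch: the "random surfer" is modeled as a Markov chain over pages. The three-page link graph and the damping factor are hypothetical values made up for this example, not Google's actual data.

```python
# Hypothetical link graph: page -> pages it links to.
links = {0: [1, 2], 1: [2], 2: [0]}
N, d = 3, 0.85          # number of pages, damping factor
rank = [1.0 / N] * N    # start from the uniform distribution
for _ in range(100):    # power iteration on the surfer's Markov chain
    new = [(1 - d) / N] * N
    for page, outs in links.items():
        for out in outs:
            new[out] += d * rank[page] / len(outs)
    rank = new
print([round(r, 3) for r in rank])
```

The ranks form a probability distribution (they sum to 1), and pages with more incoming link weight end up with higher steady-state probability.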
