Markov Chains
o In this section, we study a mathematical model that combines probability and matrices to analyze what is called a stochastic process: a sequence of trials satisfying certain conditions.
o Such a sequence of trials is called a Markov chain, named after the Russian mathematician Andrei Markov (1856-1922).
Andrei Markov (1856-1922)
Markov is particularly remembered for his study of Markov chains: sequences of random variables in which the future variable is determined by the present variable but is independent of the way in which the present state arose from its predecessors. This work launched the theory of stochastic processes.
Example
An employee tries not to be late for the office too often. If he is late one day, there is an 80% chance he will be on time the next day. If he is on time, there is a 20% chance he will be late the next day. In the long run, how often is he late for the office?
Stochastic process: the change in the state of a factor over time.
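The long-run answer can be found by iterating the chain until the state distribution settles. Below is a minimal sketch: the transition matrix is built directly from the percentages stated above (rows are the current state, in the order late, on time).

```python
# Transition matrix for the late-employee chain, from the stated percentages.
# P[i][j] = probability of moving from state i to state j.
# State 0 = late, state 1 = on time.
P = [[0.20, 0.80],   # late today:    20% late tomorrow, 80% on time
     [0.20, 0.80]]   # on time today: 20% late tomorrow, 80% on time

def step(s, P):
    """One transition of a row vector s: returns s @ P."""
    return [sum(s[i] * P[i][j] for i in range(len(s))) for j in range(len(P[0]))]

s = [1.0, 0.0]        # start in the "late" state
for _ in range(50):   # iterate until the distribution stops changing
    s = step(s, P)

print(s)   # long run: late about 20% of the time, on time about 80%
```

The same limit is reached from any starting distribution, which is why the long-run answer does not depend on how the employee starts out.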
Stochastic Processes
Transition Probabilities
Notation
Markov Property
Transition Probability Matrix (TPM)
Transition probability matrix: rows indicate the current state and columns indicate the next state. For example, given the current state A, the probability of staying in state A is s. Given the current state A', the probability of going from this state to A is r. Notice that each row sums to 1. We will call this matrix P.
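The row convention above can be sketched as follows; the values of s and r here are placeholder assumptions, not from the source.

```python
# Hypothetical 2x2 TPM over states (A, A'); s and r are illustrative values.
s, r = 0.7, 0.4
P = [[s, 1 - s],     # row A : P(A -> A) = s,  P(A -> A') = 1 - s
     [r, 1 - r]]     # row A': P(A' -> A) = r, P(A' -> A') = 1 - r

# Each row is a probability distribution over the next state, so it sums to 1.
for row in P:
    assert abs(sum(row) - 1.0) < 1e-12
```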
Notation
Notation
Initial state distribution matrix:
         A     A'
S0 = [   t   1 - t ]
First and second state matrices:
If we multiply the initial state matrix by the transition matrix, we obtain the first state matrix:
S1 = S0 P
If the first state matrix is multiplied by the transition matrix, we obtain the second state matrix:
S2 = S1 P = (S0 P) P = S0 P^2
Kth State Matrix
If this process is repeated, we obtain the following expression:
Sk = Sk-1 P = S0 P^k
The entry in the ith row and jth column of P^k indicates the probability of the system moving from the ith state to the jth state in k observations or trials.
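The two forms of the formula agree: stepping k times with P gives the same result as multiplying S0 by the matrix power P^k. A minimal sketch (the matrix and initial distribution here are illustrative assumptions):

```python
# Illustrative 2-state chain: Sk = S_{k-1} P  computed by repeated stepping
# must equal  S0 P^k  computed via the matrix power.
P  = [[0.9, 0.1],
      [0.5, 0.5]]
S0 = [0.6, 0.4]
k  = 3

def vec_mat(s, P):
    """Row vector times matrix: s @ P."""
    return [sum(s[i] * P[i][j] for i in range(len(s))) for j in range(len(P[0]))]

def mat_mat(A, B):
    """Matrix product A @ B."""
    return [[sum(A[i][m] * B[m][j] for m in range(len(B))) for j in range(len(B[0]))]
            for i in range(len(A))]

# Repeated stepping: S1 = S0 P, S2 = S1 P, ...
Sk_iter = S0
for _ in range(k):
    Sk_iter = vec_mat(Sk_iter, P)

# Direct power: P^k, then S0 P^k
Pk = P
for _ in range(k - 1):
    Pk = mat_mat(Pk, P)
Sk_power = vec_mat(S0, Pk)

assert all(abs(a - b) < 1e-9 for a, b in zip(Sk_iter, Sk_power))
```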
Example
Solution:
Problem 2
The TPM of the Markov chain with three states 0, 1, and 2 is given, and the initial probability distribution is given. Calculate:
i) P(X3=2, X2=1, X1=0, X0=2)
ii) P(X3=2, X1=0, X0=2)
iii) P(X2=2)
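The specific TPM and initial distribution for this problem did not survive extraction, so the sketch below uses a hypothetical 3-state matrix purely to show how the Markov property factorizes each requested probability.

```python
# Hypothetical TPM and initial distribution (NOT the problem's actual data).
P  = [[0.1, 0.5, 0.4],   # rows/columns indexed by states 0, 1, 2
      [0.6, 0.2, 0.2],
      [0.3, 0.4, 0.3]]
p0 = [0.7, 0.2, 0.1]     # p0[i] = P(X0 = i)

# i) A fully specified path factorizes into one-step transitions:
#    P(X3=2, X2=1, X1=0, X0=2) = p0(2) * P(2->0) * P(0->1) * P(1->2)
p_i = p0[2] * P[2][0] * P[0][1] * P[1][2]

# ii) With X2 unobserved, sum over its possible values:
#     P(X3=2, X1=0, X0=2) = p0(2) * P(2->0) * sum_k P(0->k) * P(k->2)
p_ii = p0[2] * P[2][0] * sum(P[0][m] * P[m][2] for m in range(3))

# iii) P(X2=2): propagate p0 two steps forward and read off state 2.
s = p0
for _ in range(2):
    s = [sum(s[i] * P[i][j] for i in range(3)) for j in range(3)]
p_iii = s[2]
```

Note that p_ii must be at least p_i, since the path in i) is one of the terms in the sum for ii).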
An example: An insurance company classifies drivers as low-risk if they are accident-free for one year. Past records indicate that 98% of the drivers in the low-risk category (L) will remain in that category the next year, and 78% of the drivers who are not in the low-risk category (L') one year will be in the low-risk category the next year.
1. Find the transition matrix, P

            L      L'
P =  L  [ 0.98   0.02 ]
     L' [ 0.78   0.22 ]
2. If 90% of the drivers in the community are in the low-risk category this year, what is the probability that a driver chosen at random from the community will be in the low-risk category the next year? The year after next? (Answer: 0.96 and 0.972, from the state matrices.)

                          L      L'
S0              =  [  0.90   0.10  ]
S1 = S0 P       =  [  0.96   0.04  ]
S2 = S1 P = S0 P^2 = [ 0.972  0.028 ]
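The state matrices above can be checked directly, multiplying the initial distribution through the transition matrix one year at a time:

```python
# Insurance example: S1 = S0 P and S2 = S1 P, states ordered (L, L').
P  = [[0.98, 0.02],    # row L : stays low-risk / leaves low-risk
      [0.78, 0.22]]    # row L': enters low-risk / stays outside
S0 = [0.90, 0.10]

def step(s, P):
    """One year of transitions: s @ P."""
    return [sum(s[i] * P[i][j] for i in range(len(s))) for j in range(len(P[0]))]

S1 = step(S0, P)   # next year:           approximately [0.96, 0.04]
S2 = step(S1, P)   # the year after next: approximately [0.972, 0.028]
print(S1, S2)
```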
Finding the kth State matrix
Use the formula Sk = S0 P^k to find the 4th state matrix for the previous problem.

S4 = S0 P^4
                      [ 0.98   0.02 ]^4
   = [ 0.90   0.10 ]  [ 0.78   0.22 ]
   = [ 0.97488   0.02512 ]

After four states, the percentage of low-risk drivers has increased to 0.97488.
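The fourth state matrix can be verified by applying the transition matrix four times in succession, which is equivalent to forming P^4:

```python
# 4th state matrix for the insurance example: S4 = S0 P^4.
P  = [[0.98, 0.02],
      [0.78, 0.22]]
S0 = [0.90, 0.10]

s = S0
for _ in range(4):   # four applications of P
    s = [sum(s[i] * P[i][j] for i in range(2)) for j in range(2)]

print(s)   # approximately [0.97488, 0.02512]
```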