Markov Chain Introduction
STAT 410
RANDOM PROCESS
❑ We can define a random process (also called a stochastic process) as a collection of random variables indexed by a set T whose elements often represent different instants of time (we will assume this in what follows).
❑ The two most common cases are: either T is the set of natural numbers (a discrete-time random process) or T is the set of real numbers (a continuous-time random process).
❑ For example, flipping a coin every day defines a discrete-time random process, whereas the price of a stock market option varying continuously defines a continuous-time random process (see the simulation sketch after this list).
❑ The random variables at different instants of time can be independent of each other (the coin-flipping example) or dependent in some way (the stock price example), and they can have a continuous or discrete state space (the space of possible outcomes at each instant of time).
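As a minimal illustration of the coin-flipping example (a discrete-time process with a discrete state space), here is a short Python sketch; the number of days and the random seed are arbitrary choices, not taken from the slides.

import numpy as np

# Discrete-time random process: one fair coin flip per day.
# Index set T = {0, 1, 2, ...}; state space {0, 1} (tails / heads).
# The flips are independent of each other, so past values carry
# no information about future ones.
rng = np.random.default_rng(seed=42)      # seed chosen arbitrarily
n_days = 10                               # arbitrary horizon
flips = rng.integers(0, 2, size=n_days)   # X_0, ..., X_9, i.i.d. Bernoulli(1/2)
print(flips)                              # the realized path of the process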
RANDOM PROCESS - EXAMPLES
There exist some well-known families of random processes:
❑ Gaussian processes,
❑ Poisson processes,
❑ Autoregressive models,
❑ Moving-average models,
❑ Markov chains and others.
Each of these particular cases has specific properties that allow us to better study and understand them.
One property that makes the study of a random process much easier is the
“Markov property”.
In a very informal way, the Markov property says, for a random process, that if
we know the value taken by the process at a given time, we won’t get any
additional information about the future behavior of the process by gathering
more knowledge about the past.
Stated in slightly more mathematical terms, for any given time, the conditional
distribution of future states of the process given present and past states
depends only on the present state and not at all on the past states
(memoryless property).
A random process with the Markov property is called a Markov process.
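In slightly more formal notation, for a discrete-time process this is the standard formulation (consistent with the description above, not copied from the slides):

P(X_{n+1} = x_{n+1} | X_n = x_n, X_{n-1} = x_{n-1}, ..., X_0 = x_0) = P(X_{n+1} = x_{n+1} | X_n = x_n)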
State Transition Matrix
[Figure 1: state transition matrix]
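The matrix in Figure 1 is not reproduced in this text; as a reminder, for a chain with state space {1, ..., r}, the state transition matrix collects the one-step transition probabilities p_{ij} used below:

P = [ p_{11}  p_{12}  ...  p_{1r}
      p_{21}  p_{22}  ...  p_{2r}
       ...
      p_{r1}  p_{r2}  ...  p_{rr} ],

where p_{ij} = P(X_{n+1} = j | X_n = i) and each row sums to 1.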
State Transition Diagram
A Markov chain is usually shown by a state transition
diagram.
Consider a Markov chain with three possible states 1, 2, and 3 and the transition probabilities shown in Figure 2.
Figure 2 shows the state transition diagram for the above Markov chain. In this diagram, there are three possible states 1, 2, and 3, and the arrows from each state to the other states show the transition probabilities p_{ij}. When there is no arrow from state i to state j, it means that p_{ij} = 0.
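The specific transition probabilities from the slide are not reproduced in this extracted text, so the matrix below is a hypothetical stand-in; the Python sketch only shows how a three-state chain like this one can be encoded and simulated.

import numpy as np

# Hypothetical 3-state transition matrix (NOT the slide's actual values);
# entry P[i, j] is the probability of moving from state i+1 to state j+1.
P = np.array([
    [0.25, 0.50, 0.25],
    [0.40, 0.60, 0.00],   # a zero entry means no arrow in the diagram
    [0.50, 0.00, 0.50],
])
assert np.allclose(P.sum(axis=1), 1.0), "each row must sum to 1"

rng = np.random.default_rng(seed=0)

def simulate(P, start, n_steps):
    """Simulate a path of the chain; states are numbered 1, 2, 3."""
    path = [start]
    state = start - 1                           # 0-based index into P
    for _ in range(n_steps):
        state = rng.choice(len(P), p=P[state])  # draw next state from row
        path.append(state + 1)
    return path

print(simulate(P, start=1, n_steps=10))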
Example 1
Consider the given Markov chain (see Figures 1 and 2).
Example 1 (Solution)
Example 2
Example 2 (Solution)
End of slides.