01 Hidden Markov Models

Hidden Markov Models (HMMs) are statistical models of systems whose unobserved (hidden) states generate observable events; their key components are hidden states, observations, transition probabilities, observation likelihoods, and initial state probabilities. HMMs are applied in speech recognition, bioinformatics, and natural language processing, among other areas. The key algorithms associated with HMMs are the forward algorithm, the backward algorithm, and the Viterbi algorithm, used to compute the probability of an observation sequence and to find the most probable sequence of hidden states.


Hidden Markov Models

1. Introduction to Hidden Markov Models (HMMs)


Definition and Components
A Hidden Markov Model (HMM) is a statistical model in which the system being modeled
is assumed to follow a Markov process with unobserved (hidden) states. HMMs are
particularly useful in areas where the goal is to learn about sequences and time series
data. The main components of an HMM are:
1. Hidden States (Q): A set of states in the model. These states are not directly
visible, but they influence the visible events (observations).
2. Observations (O): A set of observations or outputs. Each observation is a
result of some hidden state.
3. State Transition Probabilities (A): The probabilities of transitioning from one
state to another.
4. Observation Likelihoods (B): The probability of an observation given a state.
5. Initial State Probabilities (π): The probabilities of the system starting in each
state.
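The five components above can be made concrete with a short array-based sketch. The two-state weather model below is a hypothetical example chosen only for illustration; its state names and probability values are not from the text.

```python
import numpy as np

# Hypothetical toy HMM (illustrative values only):
# hidden states Q = (Rainy, Sunny), observation symbols V = (walk, shop, clean).
Q = ["Rainy", "Sunny"]
V = ["walk", "shop", "clean"]

A = np.array([[0.7, 0.3],        # A[i, j] = P(state j at t+1 | state i at t)
              [0.4, 0.6]])
B = np.array([[0.1, 0.4, 0.5],   # B[i, k] = P(observing symbol k | state i)
              [0.6, 0.3, 0.1]])
pi = np.array([0.6, 0.4])        # pi[i] = P(starting in state i)

# Sanity checks: each row of A and B, and pi itself, is a distribution.
assert np.allclose(A.sum(axis=1), 1.0)
assert np.allclose(B.sum(axis=1), 1.0)
assert np.isclose(pi.sum(), 1.0)
```

Storing A, B, and pi as arrays in exactly this layout is what makes the forward, backward, and Viterbi recursions reduce to a few lines of matrix arithmetic.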
Types of HMMs
•	Discrete HMM: Observations are discrete symbols.
•	Continuous HMM: Observations are continuous values.
•	Mixture HMM: Observations are modeled using a mixture of probability distributions.
Applications
HMMs are used in various applications, including:
•	Speech recognition
•	Bioinformatics (e.g., gene prediction)
•	Financial modeling
•	Natural language processing
•	Activity recognition

2. Mathematical Representation of HMMs


States and Observations
•	States (Q): Q = {q1, q2, …, qN}, the set of N hidden states; qt denotes the state occupied at time t.
•	Observations (O): O = o1 o2 … oT, a sequence of T observations, each drawn from a vocabulary of M symbols V = {v1, …, vM}.
3. Algorithms in HMMs
Forward Algorithm
The forward algorithm calculates the probability of observing a sequence of
events up to a certain point in time. It is used to evaluate the likelihood of a
given observation sequence.
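The forward recursion can be sketched in a few lines of NumPy. The toy two-state, three-symbol model below is hypothetical (illustrative numbers only), but the recursion itself is the standard one: initialize with pi and the first observation's likelihoods, then repeatedly propagate through A and rescore with B.

```python
import numpy as np

# Hypothetical toy HMM (illustrative values only):
# states = (Rainy, Sunny), observation symbols = (walk, shop, clean).
A = np.array([[0.7, 0.3],        # state transition probabilities
              [0.4, 0.6]])
B = np.array([[0.1, 0.4, 0.5],   # observation likelihoods per state
              [0.6, 0.3, 0.1]])
pi = np.array([0.6, 0.4])        # initial state probabilities

def forward(obs, A, B, pi):
    """alpha[t, i] = P(o_1 .. o_t, state at time t = i)."""
    T, N = len(obs), A.shape[0]
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]                      # initialization
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]  # induction step
    return alpha

obs = [0, 1, 2]                  # walk, shop, clean
alpha = forward(obs, A, B, pi)
likelihood = alpha[-1].sum()     # P(O | model), about 0.0336 here
```

Summing the last row of alpha gives the likelihood of the whole observation sequence, which is exactly the evaluation problem this section describes.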

Backward Algorithm
The backward algorithm calculates the probability of the ending portion of the
observation sequence, given the hidden state at a particular time. It is used in
conjunction with the forward algorithm for efficient computation.
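A matching NumPy sketch of the backward recursion, on the same kind of hypothetical two-state, three-symbol toy model (values are illustrative only). It runs the recursion from the end of the sequence toward the start; combining beta at time 0 with pi and the first observation recovers the same sequence likelihood the forward algorithm computes.

```python
import numpy as np

# Hypothetical toy HMM (illustrative values only).
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.1, 0.4, 0.5],
              [0.6, 0.3, 0.1]])
pi = np.array([0.6, 0.4])

def backward(obs, A, B):
    """beta[t, i] = P(o_{t+1} .. o_T | state at time t = i)."""
    T, N = len(obs), A.shape[0]
    beta = np.ones((T, N))                  # beta[T-1, i] = 1 by definition
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    return beta

obs = [0, 1, 2]
beta = backward(obs, A, B)
# Same sequence likelihood the forward pass yields for this model:
likelihood = (pi * B[:, obs[0]] * beta[0]).sum()
```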
Viterbi Algorithm
The Viterbi algorithm finds the most probable sequence of hidden states that
results in a sequence of observed events. It is used for decoding and is based on
dynamic programming.
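Structurally, the Viterbi recursion is the forward recursion with the sum replaced by a max, plus backpointers for recovering the best path. A sketch on a hypothetical toy model (two states, three observation symbols; all values illustrative only):

```python
import numpy as np

# Hypothetical toy HMM (illustrative values only).
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.1, 0.4, 0.5],
              [0.6, 0.3, 0.1]])
pi = np.array([0.6, 0.4])

def viterbi(obs, A, B, pi):
    """Return the most probable hidden-state sequence for obs."""
    T, N = len(obs), A.shape[0]
    delta = np.zeros((T, N))           # best path score ending in state i at t
    psi = np.zeros((T, N), dtype=int)  # backpointers
    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] * A   # scores[i, j]: i at t-1 -> j at t
        psi[t] = scores.argmax(axis=0)       # best predecessor of each state
        delta[t] = scores.max(axis=0) * B[:, obs[t]]
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):            # follow backpointers to time 0
        path.append(int(psi[t, path[-1]]))
    return path[::-1]

path = viterbi([0, 1, 2], A, B, pi)    # a list of state indices, length 3
```

Because max and argmax replace the sum, delta tracks only the single best path into each state, which is what makes the decoded sequence jointly (not just pointwise) most probable.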
