Conditional Probability vs Bayes Theorem


Conditional probability and Bayes' Theorem are two important concepts in probability, where Bayes' Theorem is a generalized version of conditional probability. Conditional probability is the probability of an event occurring given that another event has already occurred. Bayes' Theorem, named after the 18th-century mathematician Thomas Bayes, extends the idea of conditional probability: it provides a way to update the probability of a hypothesis based on new evidence.

In this article, we will discuss both concepts, understand the differences between them, and learn where to use which.

What is Conditional Probability?

Conditional probability is the probability of an event occurring given that another event has already occurred. It is a measure that quantifies how the likelihood of an event changes when we have additional information about another related event.

For example, if you draw a card from a standard deck of 52 cards, the probability of drawing a king is 4/52. However, if you know the card is a face card (king, queen, or jack), the conditional probability that the card is a king given that it is a face card is 4/12 = 1/3, because there are 12 face cards in total.

Formula for Conditional Probability

For two events A and B, with P(B) > 0, the conditional probability of A given B, denoted as P(A∣B), is defined as:

P(A|B) = \frac{P(A \cap B)}{P(B)}

This formula states that the probability of event A occurring given that event B has occurred is equal to the probability of both events A and B occurring divided by the probability of event B occurring.
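As a quick illustration, here is a minimal Python sketch that verifies the card example above by counting outcomes (the helper `probability` and the event names are our own, introduced just for this example):

```python
# Conditional probability by counting: P(king | face card) in a 52-card deck.

ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = [(rank, suit) for rank in ranks for suit in suits]

def probability(event, sample_space):
    """P(event) = favorable outcomes / total outcomes."""
    return sum(1 for outcome in sample_space if event(outcome)) / len(sample_space)

is_king = lambda card: card[0] == "K"
is_face = lambda card: card[0] in ("J", "Q", "K")

# P(A|B) = P(A and B) / P(B)
p_king_and_face = probability(lambda c: is_king(c) and is_face(c), deck)
p_face = probability(is_face, deck)

print(p_king_and_face / p_face)  # 0.333... = 4/12
```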

What is Bayes' Theorem?

Bayes' Theorem, named after the 18th-century mathematician Thomas Bayes, is a fundamental result in probability theory that describes how to update the probabilities of hypotheses when given evidence. It provides a way to revise existing predictions or theories (updating probabilities) given new or additional evidence.

An example where Bayes' Theorem can be used is:

A patient takes a test for a disease that affects 1% of the population. The test is 99% accurate, meaning it correctly identifies 99% of diseased individuals and has a 1% false positive rate.

If we want to find the probability that the patient has the disease given a positive test, we can use Bayes theorem here.
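Using the formula stated below and writing D for "has the disease" and + for "tests positive", the numbers from this example give:

P(D|+) = \frac{P(+|D) \cdot P(D)}{P(+)} = \frac{0.99 \times 0.01}{0.99 \times 0.01 + 0.01 \times 0.99} = \frac{0.0099}{0.0198} = 0.5

So even with a 99% accurate test, a positive result only means a 50% chance of having the disease, because the disease is rare in the population.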

Another example is an email spam filter: a spam filter can use Bayes' Theorem to classify emails. Suppose an email contains the word "offer," and we want to determine the probability that the email is spam.

Statement of Bayes' Theorem

Bayes' Theorem relates the conditional and marginal probabilities of random events. The formula is expressed as:

P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B)}

Where:

  • P(A∣B) is the posterior probability: the probability of event A occurring given that B is true.
  • P(B∣A) is the likelihood: the probability of event B occurring given that A is true.
  • P(A) is the prior probability: the initial probability of event A.
  • P(B) is the marginal likelihood: the total probability of event B occurring under all possible conditions.
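A minimal Python sketch of this formula, applied to the spam-filter example above; note that all the probabilities below are assumed numbers chosen purely for illustration, not values from any real dataset:

```python
def bayes(prior, likelihood, marginal):
    """Posterior P(A|B) = P(B|A) * P(A) / P(B)."""
    return likelihood * prior / marginal

# Spam-filter example with illustrative (assumed) numbers:
p_spam = 0.30              # prior: P(spam)
p_offer_given_spam = 0.60  # likelihood: P("offer" | spam)
p_offer_given_ham = 0.05   # P("offer" | not spam)

# Marginal P("offer") via the law of total probability.
p_offer = p_offer_given_spam * p_spam + p_offer_given_ham * (1 - p_spam)

print(bayes(p_spam, p_offer_given_spam, p_offer))  # ~0.837
```

With these assumed numbers, seeing the word "offer" raises the probability that the email is spam from the prior of 30% to roughly 84%.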


Difference between Conditional Probability and Bayes Theorem

The key differences between conditional probability and Bayes' Theorem are listed in the following table:

| Aspect | Conditional Probability | Bayes' Theorem |
|---|---|---|
| Definition | The probability of an event occurring given that another event has already occurred. | A formula that describes how to update the probabilities of hypotheses based on new evidence. |
| Formula | P(A \mid B) = \frac{P(A \cap B)}{P(B)} | P(A \mid B) = \frac{P(B \mid A) \cdot P(A)}{P(B)} |
| Purpose | To measure the likelihood of an event given that another event has occurred. | To update the probability of a hypothesis when given new evidence. |
| Components | Requires the joint probability of both events and the probability of the given event. | Requires the prior probability, the likelihood, and the marginal likelihood. |
| Usage | Used in situations where the outcome of an event depends on another event. | Used in inferential statistics to revise probabilities and make decisions based on new data. |
| Application Fields | General probability problems, risk assessment, game theory. | Machine learning, medical diagnosis, finance, Bayesian statistics. |
