Conditional Probability Density Function (Conditional PDF) describes the probability distribution of a random variable given that another variable is known to have a specific value. In other words, it provides the likelihood of outcomes for one variable, conditional on the value of another.
Mathematically, for two continuous random variables X and Y, the conditional PDF of X given that Y = y is defined as:
f_{X|Y}(x|y) = \frac{f_{X,Y}(x,y)}{f_Y(y)}
Where:
- f_{X|Y}(x|y) is the conditional probability density function of X given Y = y.
- f_{X,Y}(x, y) is the joint probability density function of X and Y.
- f_Y(y) is the marginal probability density function of Y, which is the probability distribution of Y alone.
Here,
- Marginal PDF: f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y) dx, which represents the probability distribution of Y regardless of X.
- Conditional PDF: f_{X|Y}(x|y) tells us how X is distributed when we know that Y = y. It is defined only for values of y where f_Y(y) > 0.
How to Calculate Conditional PDF?
To calculate the Conditional Probability Density Function (Conditional PDF), we use the relationship between the joint PDF and the marginal PDF, following these steps:
- Step 1: Find the joint PDF f_{X,Y}(x, y). This represents the likelihood of both X and Y occurring simultaneously.
- Step 2: Find the marginal PDF f_Y(y) by integrating the joint PDF over x: f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y) \, dx
- Step 3: Calculate the conditional PDF using the formula f_{X|Y}(x|y) = \frac{f_{X,Y}(x, y)}{f_Y(y)}.
This gives the probability distribution of X given the value of Y=y.
Let’s assume that X and Y have the following joint PDF:
f_{X,Y}(x, y) = 4xy \quad \text{for} \quad 0 < x < 1 \text{ and } 0 < y < 1
Step 1: Find the marginal PDF of Y:
f_Y(y) = \int_0^1 4xy \, dx = 4y \int_0^1 x \, dx = 4y \left[\frac{x^2}{2}\right]_0^1 = 2y \quad \text{for} \quad 0 < y < 1
Step 2: Calculate the conditional PDF of X given Y=y:
f_{X|Y}(x|y) = \frac{f_{X,Y}(x, y)}{f_Y(y)} = \frac{4xy}{2y} = 2x \quad \text{for} \quad 0 < x < 1
Thus, the conditional PDF of X given Y = y is:
f_{X|Y}(x|y) = 2x, \quad 0 < x < 1.
This is how you calculate the conditional PDF. Notice that here the result does not depend on y: because f_{X,Y}(x, y) = 4xy factors as (2x)(2y), X and Y are in fact independent in this example.
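The calculation can also be checked symbolically. Below is a minimal sketch using SymPy (the library choice is an assumption, not part of the original example) that reproduces the marginal and conditional PDFs:

```python
# Sketch: verify the worked example with SymPy (assumed library choice).
from sympy import symbols, integrate, simplify

x, y = symbols('x y', positive=True)

joint_pdf = 4 * x * y                           # f_{X,Y}(x, y) on 0 < x, y < 1

# Step 1: marginal PDF of Y, obtained by integrating the joint PDF over x
marginal_y = integrate(joint_pdf, (x, 0, 1))    # 2*y

# Step 2: conditional PDF of X given Y = y
conditional = simplify(joint_pdf / marginal_y)  # 2*x

# Sanity checks: the joint and conditional PDFs each integrate to 1
assert integrate(joint_pdf, (x, 0, 1), (y, 0, 1)) == 1
assert integrate(conditional, (x, 0, 1)) == 1

print(marginal_y)   # 2*y
print(conditional)  # 2*x
```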
Properties of Conditional PDF
Conditional Probability Density Function (Conditional PDF) has several important properties, which are useful in understanding how conditional distributions behave in probability theory and statistics. Here are the key properties:
Non-Negativity
The conditional PDF must always be non-negative:
f_{X|Y}(x|y) \geq 0 \quad \text{for all} \quad x, y.
This follows from the fact that probability density functions cannot be negative.
Normalization
The conditional PDF must integrate to 1 with respect to x, given a specific value of y. In other words:
\int_{-\infty}^{\infty} f_{X|Y}(x|y) \, dx = 1 \quad \text{for each fixed } y
This ensures that the conditional probability of X given Y = y is a valid probability distribution.
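For the worked example above, where f_{X|Y}(x|y) = 2x, this normalization is easy to verify:
\int_0^1 f_{X|Y}(x|y) \, dx = \int_0^1 2x \, dx = \left[x^2\right]_0^1 = 1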
Conditional Expectation
The conditional expectation of X given Y = y can be computed as:
\mathbb{E}[X | Y = y] = \int_{-\infty}^{\infty} x f_{X|Y}(x|y) \, dx
This is the expected value of X when Y is known to be y.
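For instance, using the conditional PDF f_{X|Y}(x|y) = 2x from the worked example:
\mathbb{E}[X | Y = y] = \int_0^1 x \cdot 2x \, dx = \left[\frac{2x^3}{3}\right]_0^1 = \frac{2}{3}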
Conditional Independence
Two random variables X and Y are conditionally independent given a third random variable Z if:
f_{X,Y|Z}(x, y | z) = f_{X|Z}(x|z) f_{Y|Z}(y|z)
In other words, knowing Z makes X and Y independent. This property is fundamental in areas like graphical models and Bayesian networks.
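As a simple illustration (separate from the worked example above), suppose that given Z = z > 0, X and Y are independent exponential random variables with rate z. Then
f_{X,Y|Z}(x, y | z) = \left(z e^{-zx}\right)\left(z e^{-zy}\right) = f_{X|Z}(x|z) \, f_{Y|Z}(y|z), \quad x, y > 0,
so X and Y are conditionally independent given Z, even though they need not be independent once Z is integrated out.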
Marginalization of Conditional PDF
To obtain the marginal PDF of X, you can integrate the conditional PDF, weighted by the marginal PDF of Y, over all values of y:
f_X(x) = \int_{-\infty}^{\infty} f_{X|Y}(x|y) f_Y(y) \, dy
This shows how the marginal PDF of X can be recovered from the conditional PDF and the marginal PDF of Y.
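Applying this to the worked example, where f_{X|Y}(x|y) = 2x and f_Y(y) = 2y:
f_X(x) = \int_0^1 2x \cdot 2y \, dy = 2x \left[y^2\right]_0^1 = 2x, \quad 0 < x < 1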
Conditional CDF
The conditional cumulative distribution function (CDF) of X given Y = y is related to the conditional PDF by:
F_{X|Y}(x|y) = \int_{-\infty}^{x} f_{X|Y}(t|y) \, dt.
This gives the probability that X is less than or equal to x, given that Y = y.
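For the worked example, the conditional CDF of X given Y = y is:
F_{X|Y}(x|y) = \int_0^x 2t \, dt = x^2, \quad 0 < x < 1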