Unit IV

The document discusses probabilistic reasoning in artificial intelligence, emphasizing the importance of handling uncertainty in knowledge representation. It covers concepts such as probability, conditional probability, Bayes' theorem, and Bayesian networks, which are crucial for making predictions and decisions under uncertainty. Additionally, it highlights the applications of these concepts in various fields, including AI and weather forecasting.


Chandigarh Group of Colleges, Landran

Unit 3: Probabilistic Reasoning in Artificial Intelligence

Uncertainty
• Suppose A and B are two statements. In classical logic, the rule A→B asserts that whenever A is true, B is also true; it leaves no room for partial belief.
• But consider a situation where we are not sure whether A is true or not. A plain if-then rule cannot express this statement; this situation is called uncertainty.
• So to represent uncertain knowledge, where we are not sure about the truth of the predicates, we need uncertain reasoning, also called probabilistic reasoning.

Causes of uncertainty
1. Information obtained from unreliable sources
2. Experimental errors
3. Equipment faults
4. Temperature variation
5. Climate change

Probabilistic reasoning
• Probabilistic reasoning is a way of knowledge representation in which we apply the concept of probability to indicate the uncertainty in knowledge. In probabilistic reasoning, we combine probability theory with logic to handle uncertainty.
• In the real world there are many scenarios where the certainty of something is not confirmed, such as "It will rain today," "the behavior of someone in some situation," or "a match between two teams or two players." These are probable sentences: we can assume they will happen but cannot be sure, so here we use probabilistic reasoning.

Need of probabilistic reasoning in AI
• When there are unpredictable outcomes.
• When the specifications or possibilities of predicates become too large to handle.
• When an unknown error occurs during an experiment.

In probabilistic reasoning, there are two ways to solve problems with uncertain knowledge:
• Bayes' rule
• Bayesian statistics

Probability
• Probability can be defined as the chance that an uncertain event will occur. It is the numerical measure of the likelihood that an event will occur. The value of a probability always lies between 0 and 1:
1. 0 ≤ P(A) ≤ 1, where P(A) is the probability of an event A.
2. P(A) = 0 indicates total certainty that event A will not occur (the event is impossible).
3. P(A) = 1 indicates total certainty that event A will occur.

Probability cont.
We can find the probability of an uncertain event by using the formula:

P(A) = Number of favourable outcomes / Total number of outcomes

• P(¬A) = probability of event A not happening.
• P(A) + P(¬A) = 1.

Event: Each possible outcome of a variable is called an event.
Sample space: The collection of all possible events is called the sample space.
Random variables: Random variables are used to represent the events and objects in the real world.
Prior probability: The prior probability of an event is the probability computed before observing new information.
Posterior probability: The probability that is calculated after all evidence or information has been taken into account. It is a combination of the prior probability and new information.
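The complement rule can be checked with a minimal Python sketch; the event and its probability below are illustrative assumptions, not from the slides:

```python
p_a = 0.3            # illustrative: P(A), e.g. the probability of rain today
p_not_a = 1 - p_a    # P(not A), the complement

print(round(p_not_a, 2))        # 0.7
print(round(p_a + p_not_a, 2))  # 1.0 -- P(A) + P(not A) always equals 1
```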

Conditional probability
• Conditional probability is the probability of an event occurring given that another event has already happened.
• Suppose we want to calculate the probability of event A when event B has already occurred, "the probability of A under the condition B". It can be written as:

P(A|B) = P(A ⋀ B) / P(B)

Where P(A ⋀ B) = joint probability of A and B, and P(B) = marginal probability of B.
• If the probability of A is given and we need to find the probability of B, then it will be given as:

P(B|A) = P(A ⋀ B) / P(A)
Example:
If
• P(A) = 0.5 (probability of A)
• P(B) = 0.4 (probability of B)
• and both can occur together with P(A ⋀ B) = 0.2
Then:
• P(A ⋀ B) = 0.2 → meaning there is a 20% chance that both A and B happen.
Relation formula:
• If A and B are independent:
P(A ⋀ B) = P(A) × P(B)
• If they are not independent, then we use conditional probability:
P(A ⋀ B) = P(A) × P(B|A)
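The formulas above can be checked numerically with the example numbers, as a short Python sketch:

```python
# Example numbers from the slide: P(A) = 0.5, P(B) = 0.4, P(A and B) = 0.2
p_a, p_b, p_a_and_b = 0.5, 0.4, 0.2

p_a_given_b = p_a_and_b / p_b  # P(A|B) = 0.2 / 0.4 = 0.5
p_b_given_a = p_a_and_b / p_a  # P(B|A) = 0.2 / 0.5 = 0.4

# Here P(A and B) happens to equal P(A) * P(B), so A and B are
# independent, and both relation formulas give the same joint probability.
print(p_a_given_b, p_b_given_a)  # 0.5 0.4
```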

Venn Diagram
(Figure: events A and B shown as overlapping circles; the overlap region is A ⋀ B.)

Venn Diagram Example


• Example:
• In a class, 70% of the students like English and 40% of the students like both English and Mathematics. What percent of the students who like English also like Mathematics?
• Solution:
• Let A be the event that a student likes Mathematics, and B be the event that a student likes English.

P(A|B) = P(A ⋀ B) / P(B) = 0.4 / 0.7 ≈ 0.57

• Hence, 57% of the students who like English also like Mathematics.
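The same calculation in Python:

```python
p_b = 0.70        # P(student likes English)
p_a_and_b = 0.40  # P(student likes both English and Mathematics)

# P(likes Mathematics | likes English) = P(A and B) / P(B)
p_a_given_b = p_a_and_b / p_b
print(round(p_a_given_b * 100))  # 57 (percent)
```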

Bayes' theorem
• Bayes' theorem is also known as Bayes' rule, Bayes' law, or Bayesian reasoning,
which determines the probability of an event with uncertain knowledge.
• In probability theory, it relates the conditional probability and marginal probabilities
of two random events.
• Bayes' theorem was named after the British mathematician Thomas Bayes.
The Bayesian inference is an application of Bayes' theorem, which is
fundamental to Bayesian statistics.
• It is a way to calculate the value of P(B|A) with the knowledge of P(A|B).
• Bayes' theorem allows updating the probability prediction of an event by
observing new information of the real world.

Example
• If cancer corresponds to one's age, then by using Bayes' theorem we can determine the probability of cancer more accurately with the help of age.
• Bayes' theorem can be derived using the product rule and the conditional probability of event A with known event B:
• From the product rule we can write:
P(A ⋀ B) = P(A|B) P(B)
• Similarly, the probability of event B with known event A:
P(A ⋀ B) = P(B|A) P(A)
• Equating the right-hand sides of both equations, we get:

P(A|B) = P(B|A) P(A) / P(B)    ...(a)

Example cont.
• The above equation (a) is called Bayes' rule or Bayes' theorem. This equation is the basis of most modern AI systems for probabilistic inference.
• It shows the simple relationship between joint and conditional probabilities. Here,
• P(A|B) is known as the posterior, which we need to calculate; it is read as the probability of hypothesis A given that we have observed evidence B.
• P(B|A) is called the likelihood: assuming the hypothesis is true, we calculate the probability of the evidence.
• P(A) is called the prior probability: the probability of the hypothesis before considering the evidence.
• P(B) is called the marginal probability: the pure probability of the evidence.
• In equation (a), in general, we can write P(B) = Σi P(Ai) P(B|Ai), hence Bayes' rule can be written as:

P(Ai|B) = P(B|Ai) P(Ai) / Σk P(Ak) P(B|Ak)

• where A1, A2, A3, ..., An is a set of mutually exclusive and exhaustive events.
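The expansion of P(B) by the law of total probability can be sketched in Python; the priors and likelihoods below are illustrative assumptions, not from the slides:

```python
# Law of total probability: P(B) = sum_i P(Ai) * P(B|Ai)
# over mutually exclusive, exhaustive events A1..A3 (illustrative numbers).
priors = [0.5, 0.3, 0.2]       # P(A1), P(A2), P(A3); must sum to 1
likelihoods = [0.9, 0.5, 0.1]  # P(B|A1), P(B|A2), P(B|A3)

p_b = sum(p * l for p, l in zip(priors, likelihoods))  # 0.45 + 0.15 + 0.02

# Generalized Bayes' rule: P(Ai|B) = P(B|Ai) * P(Ai) / P(B)
posteriors = [p * l / p_b for p, l in zip(priors, likelihoods)]

print(round(p_b, 2))              # 0.62
print(round(sum(posteriors), 2))  # 1.0 -- the posteriors sum to 1
```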

Applying Bayes' rule


• Bayes' rule allows us to compute the single term P(B|A) in terms of P(A|B), P(B), and P(A). This is very useful in cases where we have good estimates of these three terms and want to determine the fourth one. Suppose we want to perceive the effect of some unknown cause and want to compute that cause; then Bayes' rule becomes:

P(cause|effect) = P(effect|cause) P(cause) / P(effect)

Example-1:
Question: What is the probability that a patient has meningitis, given that the patient has a stiff neck?
Given data:
– A doctor is aware that the disease meningitis causes a patient to have a stiff neck 80% of the time. He is also aware of some more facts, which are given as follows:
– The known probability that a patient has meningitis is 1/30,000.
– The known probability that a patient has a stiff neck is 2%.
• Let a be the proposition that the patient has a stiff neck and b the proposition that the patient has meningitis. Then:
– P(a|b) = 0.8
– P(b) = 1/30000
– P(a) = 0.02

P(b|a) = P(a|b) P(b) / P(a) = (0.8 × 1/30000) / 0.02 = 1/750 ≈ 0.0013

• Hence, we can assume that about 1 in 750 patients with a stiff neck has meningitis.
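The meningitis calculation, as a short Python check using the numbers from the slide:

```python
# Bayes' rule: P(b|a) = P(a|b) * P(b) / P(a)
p_a_given_b = 0.8  # P(stiff neck | meningitis), the likelihood
p_b = 1 / 30000    # P(meningitis), the prior
p_a = 0.02         # P(stiff neck), the marginal

p_b_given_a = p_a_given_b * p_b / p_a  # P(meningitis | stiff neck)
print(round(1 / p_b_given_a))  # 750 -- about 1 in 750 stiff-neck patients
```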

Example-2:
Question: From a standard deck of playing cards, a single card is drawn. The probability that the card is a King is 4/52. Calculate the posterior probability P(King|Face), i.e., the probability that a drawn face card is a King.
Solution:
Given:
Total cards in a deck = 52
Kings in the deck = 4
Face cards = Jack, Queen, and King of each suit
→ So total face cards = 3 × 4 = 12
Let
A = card is a King
B = card is a Face card

P(A|B) = P(A ⋀ B) / P(B)    ...(i)

• P(A): probability that the card is a King = 4/52 = 1/13
• P(B): probability that the card is a face card = 12/52 = 3/13
• P(B|A): probability that the card is a face card given that it is a King = 1, so P(A ⋀ B) = P(A) × P(B|A) = 1/13
• Putting all values in equation (i), we get:

P(King|Face) = (1/13) / (3/13) = 1/3
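The card example can also be verified by brute force, enumerating a full 52-card deck in Python:

```python
from itertools import product

ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["spades", "hearts", "diamonds", "clubs"]
deck = list(product(ranks, suits))  # 52 (rank, suit) pairs

face_cards = [c for c in deck if c[0] in ("J", "Q", "K")]  # 12 cards
kings = [c for c in deck if c[0] == "K"]                   # 4 cards

# Every King is a face card, so counting Kings counts the King-and-Face cases.
p_king_given_face = len(kings) / len(face_cards)
print(p_king_given_face)  # 0.3333333333333333 = 1/3
```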

Application of Bayes' theorem in Artificial Intelligence
Following are some applications of Bayes' theorem:
• It is used to calculate the next step of the robot when the already executed step
is given.
• Bayes' theorem is helpful in weather forecasting.
• It can solve the Monty Hall problem.

Bayesian Belief Network in artificial intelligence


• A Bayesian belief network is a key computer technology for dealing with probabilistic events and for solving problems that involve uncertainty. We can define a Bayesian network as:
• "A Bayesian network is a probabilistic graphical model which represents a set of variables and their conditional dependencies using a directed acyclic graph."
• It is also called a Bayes network, belief network, decision network, or Bayesian model.
• Bayesian networks are probabilistic because these networks are built from a probability distribution and also use probability theory for prediction and anomaly detection.
• They can be used in various tasks including prediction, anomaly detection, diagnostics, automated insight, reasoning, time-series prediction, and decision making under uncertainty.

Bayesian Network cont.


• A Bayesian network can be used for building models from data and experts' opinions, and it consists of two parts:
• A directed acyclic graph
• A table of conditional probabilities
• The generalized form of a Bayesian network that represents and solves decision problems under uncertain knowledge is known as an influence diagram.
• A Bayesian network graph is made up of nodes and arcs (directed links), where:

Bayesian Network cont.


• Each node corresponds to a random variable, and a variable can be continuous or discrete.
• Arcs or directed arrows represent the causal relationships or conditional probabilities between random variables. These directed links or arrows connect pairs of nodes in the graph.
• These links represent that one node directly influences the other node; if there is no directed link, the nodes are independent of each other.

Bayesian Network Components


• A Bayesian network has mainly two components:
• Causal component
• Actual numbers
• Each node in the Bayesian network has a conditional probability distribution P(Xi | Parent(Xi)), which determines the effect of the parents on that node.
• A Bayesian network is based on the joint probability distribution and conditional probability. So let's first understand the joint probability distribution:
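The conditional probability distributions P(Xi | Parent(Xi)) can be made concrete with a tiny hand-coded network; the Rain → Sprinkler → GrassWet structure and all numbers below are illustrative assumptions, not from the slides:

```python
# Tiny Bayesian network: Rain -> Sprinkler, and both -> GrassWet.
# Each node stores P(X | parents(X)); the joint distribution factorizes as
# P(R, S, G) = P(R) * P(S|R) * P(G|R,S)

p_rain = {True: 0.2, False: 0.8}
p_sprinkler = {          # P(Sprinkler | Rain)
    True:  {True: 0.01, False: 0.99},
    False: {True: 0.40, False: 0.60},
}
p_grass = {              # P(GrassWet | Rain, Sprinkler)
    (True, True):   {True: 0.99, False: 0.01},
    (True, False):  {True: 0.80, False: 0.20},
    (False, True):  {True: 0.90, False: 0.10},
    (False, False): {True: 0.00, False: 1.00},
}

def joint(r, s, g):
    """Joint probability of one assignment, via the chain rule."""
    return p_rain[r] * p_sprinkler[r][s] * p_grass[(r, s)][g]

# Sanity check: the full joint distribution sums to 1 over all 8 assignments.
total = sum(joint(r, s, g) for r in (True, False)
                           for s in (True, False)
                           for g in (True, False))
print(round(total, 9))  # 1.0
```

Each factor in `joint` is exactly one node's conditional probability table, which is why two small tables per node suffice to define the full eight-entry joint distribution.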
