CH 9 Inference in FOL

Chapter 9 discusses inference in First Order Logic (FOL), focusing on the application of inference rules for quantifiers, including Universal and Existential Instantiation. It covers methods like forward and backward chaining, resolution, and the significance of unification in FOL inference. The chapter also highlights the importance of definite clauses and the efficiency of algorithms in knowledge bases for logical reasoning.


Chapter 9

Inference in First Order Logic


3.11 Propositional vs First-Order Inference
Inference is the process of applying logical rules to the knowledge base in order to evaluate and derive new information.

Inference rules for quantifiers

Quantifiers are used to express the quantity of something. There are two standard quantifiers:

1. Universal quantifiers, and 2. Existential quantifiers.

There are two basic ideas for inference in FOL:

1. Grounding (propositionalization), and 2. Lifted inference.

Universal Instantiation

 can be applied several times to add new sentences

 the new KB is logically equivalent to the old

Existential Instantiation

 can be applied once to replace the existential sentence

 the new KB is not logically equivalent to the old, but it is satisfiable exactly when the old KB is satisfiable

Universal instantiation (UI)

The rule of Universal Instantiation (UI for short) says that we can infer any sentence obtained
by substituting a ground term g (a term without variables) for the universally quantified variable v.

Suppose our knowledge base contains the standard axiom stating that all greedy kings are evil:

∀ x King(x) ∧ Greedy(x) ⇒ Evil(x)

Then it seems quite permissible to infer any of the following sentences:

King(John) ∧ Greedy(John) ⇒ Evil(John)

King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard)

King(Father (John)) ∧ Greedy(Father (John)) ⇒ Evil(Father (John))


To write out the inference rule formally, we use the notion of substitutions.

Let SUBST(θ,α) denote the result of applying the substitution θ to the sentence α.

Notation: Subst({v/g}, α) means the result of substituting g for v in


sentence α

Every instantiation of a universally quantified sentence is entailed by it: for any variable v and ground term g,

∀v α ⊨ SUBST({v/g}, α)

For example, the three sentences given earlier are obtained with the substitutions {x/John},
{x/Richard}, and {x/Father(John)}.
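The SUBST operation can be sketched in Python. This is a minimal illustration, assuming sentences are represented as nested tuples such as ('King', 'x') and that variables are lowercase strings; the representation and function names are illustrative, not from the chapter.

```python
def is_variable(t):
    # Convention for this sketch: variables are lowercase strings.
    return isinstance(t, str) and t[0].islower()

def subst(theta, sentence):
    """Apply substitution theta (a dict {var: ground_term}) to a sentence."""
    if is_variable(sentence):
        return theta.get(sentence, sentence)
    if isinstance(sentence, tuple):
        return tuple(subst(theta, part) for part in sentence)
    return sentence  # predicate names and constants pass through unchanged

# UI: from ∀x King(x) ∧ Greedy(x) ⇒ Evil(x), derive the {x/John} instance.
rule = ('Implies', ('And', ('King', 'x'), ('Greedy', 'x')), ('Evil', 'x'))
print(subst({'x': 'John'}, rule))
# ('Implies', ('And', ('King', 'John'), ('Greedy', 'John')), ('Evil', 'John'))
```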
Existential Instantiation (EI)
In the rule for Existential Instantiation, the variable is replaced by a single new constant symbol.

The formal statement is as follows: for any sentence α, variable v, and constant symbol k that does

not appear elsewhere in the knowledge base, from ∃v α we can infer SUBST({v/k}, α).

For example, from the sentence ∃ x Crown(x) ∧ OnHead(x, John)

we can infer the sentence Crown(C1) ∧ OnHead(C1, John)

as long as C1 does not appear elsewhere in the knowledge base.


Basically, the existential sentence says there is some object satisfying a condition, and applying the
existential instantiation rule just gives a name to that object (the name must not already belong to
another object). Such a new name is called a Skolem constant.
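Picking a fresh constant for Existential Instantiation can be sketched as follows; the naming scheme C1, C2, ... and the function name are illustrative assumptions.

```python
import itertools

def fresh_constant(known_symbols):
    """Return a Skolem constant C1, C2, ... that does not already
    appear in the knowledge base's vocabulary."""
    for i in itertools.count(1):
        name = f'C{i}'
        if name not in known_symbols:
            return name

kb_symbols = {'John', 'C1'}  # hypothetical KB vocabulary
print(fresh_constant(kb_symbols))  # C2
```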
Reduction to propositional inference
Once we have rules for inferring non-quantified sentences from quantified sentences, it becomes

possible to reduce first-order inference to propositional inference.

An existentially quantified sentence can be replaced by one instantiation, while a universally

quantified sentence can be replaced by the set of all possible instantiations.

For example, suppose our knowledge base contains just the sentences

∀ x King(x) ∧ Greedy(x) ⇒ Evil(x)

King(John)

Greedy(John)

Brother (Richard, John) .


Then we apply UI to the first sentence using all possible ground-term substitutions
from the vocabulary of the knowledge base; in this case, {x/John} and {x/Richard}.
We obtain

King(John) ∧ Greedy(John) ⇒ Evil(John)

King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard) ,

and we discard the universally quantified sentence.

Now, the knowledge base is essentially propositional if we view the ground atomic sentences—
King (John), Greedy(John), and so on—as proposition symbols.

By applying any complete propositional algorithm we can obtain conclusions such as
Evil(John).
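The grounding step above can be sketched by enumerating every substitution of KB constants for a rule's variables; the tuple/dict representation and helper name are illustrative assumptions.

```python
from itertools import product

def ground_instances(rule_vars, constants):
    """Yield every ground substitution for a rule's variables over the
    KB's vocabulary of constants (the UI step of propositionalization)."""
    for combo in product(constants, repeat=len(rule_vars)):
        yield dict(zip(rule_vars, combo))

# For ∀x King(x) ∧ Greedy(x) ⇒ Evil(x) with constants John and Richard:
print(list(ground_instances(['x'], ['John', 'Richard'])))
# [{'x': 'John'}, {'x': 'Richard'}]
```

Note that with function symbols the set of ground terms is infinite, which is why grounding alone does not terminate in general.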
3.12 Unification and lifting
Unification is the process of finding substitutions that make different logical expressions look
identical.

It is a key component of all first-order inference algorithms.

The UNIFY algorithm takes two sentences and returns a unifier for them if one exists:
UNIFY(α, β) = θ, where SUBST(θ, α) = SUBST(θ, β)
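As a concrete sketch of UNIFY, the following Python follows the standard recursive algorithm, assuming sentences are nested tuples (e.g. ('Knows', 'John', 'x')) and variables are lowercase strings; the representation is an illustrative assumption.

```python
def is_variable(t):
    # Convention for this sketch: variables are lowercase strings.
    return isinstance(t, str) and t[0].islower()

def unify(x, y, theta):
    """Return a most general unifier extending theta, or None on failure."""
    if theta is None:
        return None
    if x == y:
        return theta
    if is_variable(x):
        return unify_var(x, y, theta)
    if is_variable(y):
        return unify_var(y, x, theta)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        return unify(x[1:], y[1:], unify(x[0], y[0], theta))
    return None

def unify_var(var, x, theta):
    if var in theta:
        return unify(theta[var], x, theta)
    if is_variable(x) and x in theta:
        return unify(var, theta[x], theta)
    if occurs(var, x, theta):          # occur check: reject x = f(x)
        return None
    new_theta = dict(theta)
    new_theta[var] = x
    return new_theta

def occurs(var, x, theta):
    if var == x:
        return True
    if is_variable(x) and x in theta:
        return occurs(var, theta[x], theta)
    if isinstance(x, tuple):
        return any(occurs(var, xi, theta) for xi in x)
    return False

print(unify(('Knows', 'John', 'x'), ('Knows', 'John', 'Jane'), {}))  # {'x': 'Jane'}
print(unify(('Knows', 'John', 'x'), ('Brother', 'Richard', 'John'), {}))  # None
```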
Generalized Modus Ponens

Generalized Modus Ponens is used with a KB of definite clauses (clauses with exactly one positive literal)

All variables are assumed universally quantified

Rule: King(x) ∧ Greedy(x) ⇒ Evil(x)

Premises of the rule: p1 is King(x), p2 is Greedy(x)

Conclusion: q is Evil(x)

Facts: p1′ is King(John), p2′ is Greedy(y)

Substitution: θ is {x/John, y/John} ⇒ Result of Generalized Modus Ponens: SUBST(θ, q) is Evil(John)
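The substitution check behind this GMP step can be verified mechanically. A minimal sketch, assuming atoms are flat tuples and variables are lowercase strings (compound arguments such as Father(John) are not handled here):

```python
def apply(theta, atom):
    """Apply a substitution to a flat atom like ('King', 'x')."""
    return tuple(theta.get(t, t) for t in atom)

theta = {'x': 'John', 'y': 'John'}
premises = [('King', 'x'), ('Greedy', 'x')]
facts = [('King', 'John'), ('Greedy', 'y')]
conclusion = ('Evil', 'x')

# GMP requires SUBST(θ, pi') = SUBST(θ, pi) for every premise/fact pair.
assert all(apply(theta, p) == apply(theta, f) for p, f in zip(premises, facts))
print(apply(theta, conclusion))  # ('Evil', 'John')
```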


Storage and retrieval

STORE(s) stores a sentence s into the knowledge base

FETCH(q) returns all unifiers such that the query q unifies with some sentence in the knowledge
base.

The simplest way to implement STORE and FETCH is to keep all the facts in one long list and unify
each query against every element of the list. This is inefficient: for example, there is no point in
trying to unify Knows(John, x) with Brother(Richard, John).

We can avoid such pointless unifications by indexing the facts in the knowledge base.

A simple scheme called predicate indexing puts all the “Knows” facts in one bucket and all the
“Brother” facts in another. The buckets can be stored in a hash table for efficient access.
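Predicate indexing as described above can be sketched with a hash table keyed on the predicate symbol; the class and method names are illustrative assumptions.

```python
from collections import defaultdict

class IndexedKB:
    """One bucket of facts per predicate symbol (predicate indexing)."""
    def __init__(self):
        self.buckets = defaultdict(list)

    def store(self, fact):
        # A fact is a tuple whose first element is the predicate symbol.
        self.buckets[fact[0]].append(fact)

    def fetch_candidates(self, query):
        # Only facts with the query's predicate are ever tried for unification.
        return self.buckets[query[0]]

kb = IndexedKB()
kb.store(('Knows', 'John', 'Jane'))
kb.store(('Brother', 'Richard', 'John'))
print(kb.fetch_candidates(('Knows', 'John', 'x')))  # [('Knows', 'John', 'Jane')]
```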
 For example, suppose that the tax authorities want to keep track of who employs whom, using
predicate Employs(x, y).
 For the fact Employs(IBM , Richard), the queries are
 Employs(IBM , Richard) Does IBM employ Richard?
 Employs(x, Richard) Who employs Richard?
 Employs(IBM , y) Whom does IBM employ?
 Employs(x, y) Who employs whom?
 These queries form a subsumption lattice.
Horn clauses and definite clauses are restricted sentence forms that enable a knowledge base to use
a more restricted and efficient inference algorithm.

Logical inference algorithms use forward and backward chaining approaches, which require the KB
to be in the form of first-order definite clauses.

Logical inference algorithms use forward and backward chaining approaches, which require KB
in the form of the first-order definite clause.

Definite clause: A clause which is a disjunction of literals with exactly one positive literal is
known as a definite clause (or strict Horn clause).

Horn clause: A clause which is a disjunction of literals with at most one positive literal is
known as a Horn clause. Hence all definite clauses are Horn clauses.

Example: (¬p V ¬q V k) has exactly one positive literal, k, so it is a definite clause. It is equivalent to p ∧ q → k.
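Since the example above is a small propositional equivalence, it can be checked exhaustively; this snippet verifies (¬p ∨ ¬q ∨ k) ≡ (p ∧ q → k) over all eight truth assignments, encoding A → B as (not A) or B.

```python
from itertools import product

# Verify the claimed equivalence (¬p ∨ ¬q ∨ k) ≡ ((p ∧ q) → k)
# over every truth assignment.
for p, q, k in product([False, True], repeat=3):
    assert ((not p) or (not q) or k) == ((not (p and q)) or k)
print("equivalent for all 8 assignments")
```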


3.13 Forward Chaining
Starting from the known facts, it triggers all the rules whose premises are satisfied, adding their
conclusions to the known facts.

The process repeats until the query is answered (assuming that just one answer is required) or no new facts
are added.

A fact is not “new” if it is just a renaming of a known fact.

 One sentence is a renaming of another if they are identical except for the names of the variables.

FOL-FC-ASK is easy to analyze.

First, it is sound, because every inference is just an application of Generalized Modus Ponens, which is
sound.
Second, it is complete for definite clause knowledge bases; that is, it answers every query whose
answers are entailed by any knowledge base of definite clauses.
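The fire-rules-to-fixpoint loop described above can be sketched for the crime example that follows. This is only an illustration, assuming atoms are tuples, variables are lowercase strings, and rules are (premises, conclusion) pairs; the chapter's FOL-FC-ASK is more sophisticated (full unification, incremental matching).

```python
def match(pattern, fact, theta):
    """Extend theta so that pattern equals fact, or return None."""
    if len(pattern) != len(fact) or pattern[0] != fact[0]:
        return None
    theta = dict(theta)
    for p, f in zip(pattern[1:], fact[1:]):
        if p[0].islower():                 # variable position
            if theta.get(p, f) != f:
                return None
            theta[p] = f
        elif p != f:                       # constant mismatch
            return None
    return theta

def forward_chain(facts, rules):
    """Fire every rule whose premises match known facts; repeat to fixpoint."""
    facts = set(facts)
    while True:
        new = set()
        for premises, conclusion in rules:
            thetas = [{}]
            for prem in premises:          # AND: satisfy every premise
                thetas = [t2 for t in thetas for f in facts
                          if (t2 := match(prem, f, t)) is not None]
            for theta in thetas:
                derived = tuple(theta.get(t, t) for t in conclusion)
                if derived not in facts:
                    new.add(derived)
        if not new:
            return facts
        facts |= new

rules = [
    ([('American', 'x'), ('Weapon', 'y'), ('Sells', 'x', 'y', 'z'),
      ('Hostile', 'z')], ('Criminal', 'x')),
    ([('Missile', 'x'), ('Owns', 'Nono', 'x')], ('Sells', 'West', 'x', 'Nono')),
    ([('Missile', 'x')], ('Weapon', 'x')),
    ([('Enemy', 'x', 'America')], ('Hostile', 'x')),
]
facts = {('Owns', 'Nono', 'M1'), ('Missile', 'M1'),
         ('American', 'West'), ('Enemy', 'Nono', 'America')}
result = forward_chain(facts, rules)
print(('Criminal', 'West') in result)  # True
```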
Forward chaining algorithm
Example
The law says that it is a crime for an American to sell weapons to hostile(military enemy)
nations.

The country Nono, an enemy of America, has some missiles, and all of its missiles were
sold to it by Colonel West, who is American.

Prove that Col. West is a criminal.

Knowledge base Construction


. . . it is a crime for an American to sell weapons to hostile nations:
American(x) ∧ Weapon(y) ∧ Sells(x, y, z) ∧ Hostile(z) ⇒ Criminal(x)
Nono . . . has some missiles, i.e.,
∃ x Owns(Nono, x) ∧ Missile(x): Owns(Nono, M1) and Missile(M1)
. . . all of its missiles were sold to it by Colonel West
∀x Missile(x) ∧ Owns(Nono, x) ⇒ Sells(West, x, Nono)
Missiles are weapons:
Missile(x) ⇒ Weapon(x)
An enemy of America counts as “hostile”:
Enemy(x, America) ⇒ Hostile(x)
West, who is American . . .
American(West)
The country Nono, an enemy of America . . .
Enemy(Nono, America)
Forward chaining proof (proof-diagram slides omitted: successive iterations add Sells(West, M1, Nono), Weapon(M1), and Hostile(Nono), and finally Criminal(West))
Properties of Forward chaining

It is a bottom-up approach, as it moves from known facts up to conclusions.

Sound and complete for first-order definite clauses.

For Datalog knowledge bases, which contain no function symbols, the proof of completeness is fairly
easy.
 Datalog = first-order definite clauses + no functions

FC terminates for Datalog in finite number of iterations

May not terminate in general if α is not entailed.

Forward chaining is known as a data-driven inference technique, as we reach the goal using the
available data.

The forward-chaining approach is commonly used in expert systems (such as CLIPS), business rule
systems, and production rule systems.
3.14 Backward Chaining

This algorithm works backward from the goal, chaining through rules to find known facts that
support the proof.

In backward chaining, the goal is broken into sub-goals, which are proved in turn to establish the facts.

FOL-BC-ASK(KB, goal) will be proved if the knowledge base contains a clause of the form lhs
⇒ goal, where lhs (left-hand side) is a list of conjuncts.

 Backward chaining (unlike forward chaining) suffers from problems with repeated states and
incompleteness.

Backward chaining is a kind of AND/OR search—the OR part because the goal query can be
proved by any rule in the knowledge base, and the AND part because all the conjuncts in the lhs
of a clause must be proved.
Backward chaining algorithm

SUBST(COMPOSE(θ1, θ2), p) = SUBST(θ2, SUBST(θ1, p))
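The COMPOSE identity above can be checked on a small case. A sketch assuming flat atoms and simple variable-to-term substitutions; the helper names are illustrative.

```python
def apply(theta, atom):
    """Apply a substitution to a flat atom."""
    return tuple(theta.get(t, t) for t in atom)

def compose(theta1, theta2):
    """COMPOSE(θ1, θ2): applying the result equals applying θ1, then θ2."""
    composed = {v: theta2.get(t, t) for v, t in theta1.items()}
    for v, t in theta2.items():
        composed.setdefault(v, t)
    return composed

theta1, theta2 = {'x': 'y'}, {'y': 'John'}
p = ('Knows', 'x', 'y')
assert apply(compose(theta1, theta2), p) == apply(theta2, apply(theta1, p))
print(apply(compose(theta1, theta2), p))  # ('Knows', 'John', 'John')
```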


Example
The law says that it is a crime for an American to sell weapons to hostile(military enemy)
nations.

The country Nono, an enemy of America, has some missiles, and all of its missiles were
sold to it by Colonel West, who is American.

Prove that Col. West is a criminal.

Knowledge base construction: the same definite-clause knowledge base as in the forward-chaining example above.
Backward chaining example (proof-tree diagram slides omitted: the goal Criminal(West) expands into the sub-goals American(West), Weapon(y), Sells(West, y, z), and Hostile(z), each of which is proved in turn)
Properties of backward chaining
It is known as a top-down approach.

Backward chaining is based on the Modus Ponens inference rule.

It is called a goal-driven approach, as a list of goals decides which rules are selected and used.

Incomplete due to infinite loops

 fix by checking current goal against every goal on stack.

Inefficient due to repeated sub-goals (both success and failure)

 fix using caching of previous results (extra space).


3.15 Resolution
Resolution is a theorem-proving technique that proceeds by building refutations (proofs by
contradiction).

Resolution is a single inference rule which can efficiently operate on the conjunctive
normal form or clausal form.

Clause: A disjunction of literals is called a clause. A clause containing exactly one literal is known
as a unit clause.

Conjunctive Normal Form: A sentence represented as a conjunction of clauses is said to


be conjunctive normal form or CNF.
The resolution inference rule

The resolution rule for first-order logic is simply a lifted version of the propositional rule.

Resolution can resolve two clauses if they contain complementary literals, which are assumed
to be standardized apart so that they share no variables:

l1 ∨ · · · ∨ lk,    m1 ∨ · · · ∨ mn
SUBST(θ, l1 ∨ · · · ∨ li−1 ∨ li+1 ∨ · · · ∨ lk ∨ m1 ∨ · · · ∨ mj−1 ∨ mj+1 ∨ · · · ∨ mn)

where UNIFY(li, ¬mj) = θ, i.e., li and mj are complementary literals.

This rule is also called the binary resolution rule because it resolves exactly two literals.
Steps for Resolution

1. Conversion of facts into first-order logic.

2. Convert FOL statements into CNF ( Conjunctive Normal Form)

3. Negate the statement which needs to prove (proof by contradiction)

4. Draw resolution graph (unification).

5. If the empty clause is produced, stop and report that the original theorem is true.
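The refutation loop (steps 3-5) can be sketched at the ground/propositional level, where literals are plain strings and '~' marks negation; first-order resolution would additionally unify complementary literals before resolving. A minimal sketch under those assumptions:

```python
def resolve(c1, c2):
    """All binary resolvents of two ground clauses (sets of string literals)."""
    resolvents = []
    for lit in c1:
        comp = lit[1:] if lit.startswith('~') else '~' + lit
        if comp in c2:
            resolvents.append((c1 - {lit}) | (c2 - {comp}))
    return resolvents

def refutes(clauses):
    """Saturate the clause set under resolution; True iff the empty
    clause (a contradiction) is eventually derived."""
    clauses = {frozenset(c) for c in clauses}
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a != b:
                    for r in resolve(a, b):
                        if not r:          # empty clause: contradiction found
                            return True
                        new.add(frozenset(r))
        if new <= clauses:                 # no progress: not refutable
            return False
        clauses |= new

# Tiny example: KB = {P → Q, P} plus the negated goal ¬Q.
print(refutes([{'~P', 'Q'}, {'P'}, {'~Q'}]))  # True
```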
Example

John likes all kinds of food.

Apples and vegetables are food.

Anything anyone eats without being killed is food.

Anil eats peanuts and is still alive.

Harry eats everything that Anil eats.

Prove by resolution that John likes peanuts.


Step-1: Conversion of Facts into FOL
In the first step we will convert all the given statements into its first order logic.
Step-2: Conversion of FOL into CNF
In First order logic resolution, it is required to convert the FOL into CNF as CNF form makes easier for
resolution proofs.

i)Eliminate all implication (→) and rewrite


 ∀x ¬ food(x) V likes(John, x)

 food(Apple) Λ food(vegetables)

 ∀x ∀y ¬ [eats(x, y) Λ ¬ killed(x)] V food(y)

 eats (Anil, Peanuts) Λ alive(Anil)

 ∀x ¬ eats(Anil, x) V eats(Harry, x)

 ∀x¬ [¬ killed(x) ] V alive(x)

 ∀x ¬ alive(x) V ¬ killed(x)

 likes(John, Peanuts).
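Step i (implication elimination) can be sketched as a recursive rewrite on tuple-tree sentences; the '->', 'or', and 'not' tags are an illustrative encoding, not the chapter's notation.

```python
def eliminate_implications(s):
    """Rewrite every (α → β) as (¬α ∨ β), recursively."""
    if not isinstance(s, tuple):
        return s
    if s[0] == '->':
        a, b = (eliminate_implications(x) for x in s[1:])
        return ('or', ('not', a), b)
    return (s[0],) + tuple(eliminate_implications(x) for x in s[1:])

# food(x) → likes(John, x)   becomes   ¬food(x) ∨ likes(John, x)
print(eliminate_implications(('->', ('food', 'x'), ('likes', 'John', 'x'))))
# ('or', ('not', ('food', 'x')), ('likes', 'John', 'x'))
```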
ii) Move negation (¬) inwards and rewrite

 ∀x ¬ food(x) V likes(John, x)
 food(Apple) Λ food(vegetables)
 ∀x ∀y ¬ eats(x, y) V killed(x) V food(y)
 eats (Anil, Peanuts) Λ alive(Anil)
 ∀x ¬ eats(Anil, x) V eats(Harry, x)
 ∀x killed(x) V alive(x)
 ∀x ¬ alive(x) V ¬ killed(x)
 likes(John, Peanuts).

iii) Rename variables or standardize variables

 ∀x ¬ food(x) V likes(John, x)
 food(Apple) Λ food(vegetables)
 ∀y ∀z ¬ eats(y, z) V killed(y) V food(z)
 eats (Anil, Peanuts) Λ alive(Anil)
 ∀w ¬ eats(Anil, w) V eats(Harry, w)
 ∀g killed(g) V alive(g)
 ∀k ¬ alive(k) V ¬ killed(k)
 likes(John, Peanuts).
iv) Eliminate existential quantifiers (Skolemization)

In this step, we eliminate existential quantifiers ∃ by replacing them with Skolem constants or functions, a process known as Skolemization. (In this example problem there is no existential quantifier, so all the statements remain the same in this step.)

v) Drop universal quantifiers.

In this step we drop all universal quantifiers (since all remaining variables are implicitly universally

quantified, the quantifiers are no longer needed):


 ¬ food(x) V likes(John, x)
 food(Apple)
 food(vegetables)
 ¬ eats(y, z) V killed(y) V food(z)
 eats (Anil, Peanuts)
 alive(Anil)
 ¬ eats(Anil, w) V eats(Harry, w)
 killed(g) V alive(g)
 ¬ alive(k) V ¬ killed(k)
 likes(John, Peanuts).
Step-3: Negate the statement to be proved
In this step, we apply negation to the conclusion statement, which is written as
¬likes(John, Peanuts)

Step-4: Draw resolution graph

In this step, we solve the problem with a resolution tree, applying unification/substitution at each resolution step. (Resolution-graph diagram omitted.)

Step-5: If the empty clause is produced, stop and report that the original statement is proved.

Hence proved.
Example 2
The law says that it is a crime for an American to sell weapons to hostile(military enemy)
nations.

The country Nono, an enemy of America, has some missiles, and all of its missiles were
sold to it by Colonel West, who is American.

Prove that Col. West is a criminal.

Step 1: Conversion of facts into first-order logic — the same knowledge base as in the forward-chaining example above.
Step 2:

Every sentence of first-order logic can be converted into an inferentially equivalent CNF

¬American(x) ∨ ¬Weapon(y) ∨ ¬Sells(x, y, z) ∨ ¬Hostile(z) ∨ Criminal(x)

¬Missile(x) ∨ ¬Owns(Nono, x) ∨ Sells(West, x, Nono)

¬Enemy(x, America) ∨ Hostile(x)

¬Missile(x) ∨ Weapon(x)

 Owns(Nono, M1)

Missile(M1)

American(West)

 Enemy(Nono, America)

Step 3
Negate the statement to be proved: ¬Criminal(West)
Step 4, 5: Build the resolution graph; the empty clause is produced, so Criminal(West) is proved. (Resolution-graph diagram omitted.)