AI - Module 3

Formalized Symbolic Logics
Introduction
● First Order Predicate Logic (FOPL), or Predicate Calculus, has assumed one of the most important
roles in AI for the representation of knowledge.
● In FOPL, statements from a natural language like English are translated into symbolic structures
comprised of predicates, functions, variables, constants, quantifiers, and logical connectives.
● The symbols form the basic building blocks for the knowledge, and their combination into valid
structures is accomplished using the syntax (rules of combination) for FOPL.
● Once structures have been created to represent basic facts or procedures or other types of
knowledge, inference rules may then be applied to compare, combine and transform these
"assumed" structures into new "deduced" structures.
● This is how automated reasoning or inferencing is performed.
● Propositional Logic is the study of statements and their connectivity. Predicate Logic is the study
of individuals and their properties.
As a simple example of the use of logic, the statement
"All employees of the AI-Software Company are programmers" might be
written in FOPL as ∀x (EMPLOYEE(x, ai-software) -> PROGRAMMER(x))
SYNTAX AND SEMANTICS FOR PROPOSITIONAL LOGIC
● Valid statements or sentences in PL are determined according to the rules of propositional
syntax.
● This syntax governs the combination of basic building blocks such as propositions and logical
connectives.
● Propositions are elementary atomic sentences.
● Propositions may be either true or false but may take on no other value.
● Some examples of simple propositions are
● It is raining.
● My car is painted silver.
● John and Sue have five children.
● Snow is white.
● People live on the moon.
● Compound propositions are formed from atomic formulas using the logical connectives
not, and, or, if . . . then, and if and only if.
● For example, the following are compound formulas.
● It is raining and the wind is blowing.
● The moon is made of green cheese or it is not.
● If you study hard you will be rewarded.
● The sum of 10 and 20 is not 50.
● We will use capital letters, sometimes followed by digits, to stand for propositions; T and
F are special symbols having the values true and false, respectively. The following
symbols will also be used for logical connectives:
● ~ for not or negation
● & for and or conjunction

● V for or or disjunction
● -> for if ... then or implication
● <---> for if and only if or double implication
● In addition, left and right parentheses, left and right braces, and the period will be used as
delimiters for punctuation.
● So, for example, to represent the compound sentence "It is raining and the wind is blowing"
we could write (R & B), where R and B stand for the propositions "It is raining" and "the wind
is blowing," respectively.
● If we write (R V B) we mean "it is raining or the wind is blowing or both"; that is, V indicates
inclusive disjunction.

Syntax
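The syntax rules themselves appeared as figures on the original slides. As an illustrative sketch only (the tuple encoding and function names are ours, not the slide's), well-formedness of PL formulas can be checked recursively:

```python
# A sketch of PL syntax: formulas as nested Python tuples.
# Atoms are strings such as 'P'; compound formulas are ('~', f) for
# negation and (op, f, g) for the binary connectives. This encoding
# is our own, chosen to mirror the recursive syntax rules.

CONNECTIVES = {'&', 'V', '->', '<->'}

def is_wff(f):
    """Return True if f is a well-formed PL formula."""
    if isinstance(f, str):                       # atomic proposition (T and F included)
        return f.isidentifier()
    if isinstance(f, tuple):
        if len(f) == 2 and f[0] == '~':          # negation of a wff is a wff
            return is_wff(f[1])
        if len(f) == 3 and f[0] in CONNECTIVES:  # two wffs joined by a connective
            return is_wff(f[1]) and is_wff(f[2])
    return False

print(is_wff(('&', 'R', ('~', 'B'))))   # R & ~B  -> True
print(is_wff(('&', 'R')))               # missing operand -> False
```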

Semantics
● The semantics, or meaning, of a sentence is just the value true or false; that is, it is an
assignment of a truth value to the sentence.
● An interpretation for a sentence or group of sentences is an assignment of a truth value
to each propositional symbol.
● As an example, consider the statement P & ~Q.
● One interpretation, I1, assigns true to P and false to Q.
● A different interpretation, I2, assigns true to P and true to Q. Clearly, there are four
distinct interpretations for this sentence.
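The four interpretations can be enumerated mechanically. A minimal Python sketch (the encoding is ours, not the slide's):

```python
from itertools import product

# Enumerate all interpretations of P & ~Q: an interpretation assigns
# a truth value to each proposition symbol, here P and Q.
def p_and_not_q(p, q):
    return p and (not q)

for p, q in product([True, False], repeat=2):
    print(f"P={p!s:5} Q={q!s:5}  P & ~Q = {p_and_not_q(p, q)}")
# Exactly one of the four interpretations makes P & ~Q true.
```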

Properties of Statements
● Satisfiable. A statement is satisfiable if there is some interpretation for which it is true.
● Contradiction. A sentence is contradictory (unsatisfiable) if there is no interpretation for
which it is true.
● Valid. A sentence is valid if it is true for every interpretation. Valid sentences are also
called tautologies.
● Equivalence. Two sentences are equivalent if they have the same truth value under every
interpretation.
● Logical consequence. A sentence is a logical consequence of another if it is satisfied by
all interpretations which satisfy the first. More generally, it is a logical consequence of
other statements if and only if, for any interpretation in which those statements are true, the
resulting statement is also true.
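Because PL has finitely many interpretations for a finite set of symbols, these properties can be checked by brute force. A hedged sketch (function names are ours), treating a sentence as a Python function of its symbols:

```python
from itertools import product

# Brute-force checks of the properties defined above, over all
# 2^n interpretations of n propositional symbols.
def interpretations(n):
    return product([True, False], repeat=n)

def satisfiable(f, n):
    return any(f(*i) for i in interpretations(n))

def valid(f, n):
    return all(f(*i) for i in interpretations(n))

def contradictory(f, n):
    return not satisfiable(f, n)

def equivalent(f, g, n):
    return all(f(*i) == g(*i) for i in interpretations(n))

print(valid(lambda p: p or not p, 1))           # P V ~P is a tautology -> True
print(contradictory(lambda p: p and not p, 1))  # P & ~P -> True
print(equivalent(lambda p, q: not (p and q),
                 lambda p, q: (not p) or (not q), 2))  # De Morgan -> True
```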

Inference Rules
● Modus ponens
● From P and P->Q, infer Q. This is written as
● P
● P->Q
● Q
● For example:
● Given: (Joe is a father)
● And: (Joe is a father) -> (Joe has a child)
● Conclude: (Joe has a child)
● Chain rule
● From P->Q and Q->R, infer P->R. Or:
● P->Q
● Q->R
● P->R
● For example:
● Given: (Programmer likes LISP) -> (Programmer hates COBOL)
● And: (Programmer hates COBOL) -> (Programmer likes recursion)
● Conclude: (Programmer likes LISP) -> (Programmer likes recursion)
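Repeated application of modus ponens gives a simple forward-chaining procedure. The sketch below is our own encoding of the slide's Joe and programmer examples; it also shows how the chain rule falls out of iterating modus ponens:

```python
# Minimal forward chaining with modus ponens: facts are strings,
# rules are (premise, conclusion) pairs. Encoding is ours.
def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in rules:
            if premise in facts and conclusion not in facts:
                facts.add(conclusion)        # modus ponens: P, P->Q |- Q
                changed = True
    return facts

print(forward_chain({"Joe is a father"},
                    [("Joe is a father", "Joe has a child")]))

# Chaining two implications reproduces the chain-rule conclusion:
rules = [("likes LISP", "hates COBOL"), ("hates COBOL", "likes recursion")]
print(forward_chain({"likes LISP"}, rules))
```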
● Formal system
● A formal system is a set of axioms S and a set of inference rules L from which new statements can be logically derived. We can define a formal system as <S, L>.
● Soundness:
● Let <S, L> be a formal system. The inference procedure L is sound if and only if any statement s that can be derived from <S, L> is a logical consequence of <S, L>.
● Completeness:
● Let <S, L> be a formal system. The inference procedure L is complete if and only if any statement s logically implied by <S, L> can be derived using that procedure.
● Substitution
● If s is a valid sentence, then s', derived from s by consistent substitution of propositions in s, is also valid.
● Conjunction:
● From P and from Q, infer P & Q.
● Transposition:
● From P->Q, infer ~Q->~P.
INFERENCE RULES
⚫ Like PL, a key inference rule in FOPL is modus
ponens.
⚫ From the assertion "Leo is a lion" and the
implication "all lions are ferocious" we can
conclude that Leo is ferocious.
⚫ Written in symbolic form we have
⚫ assertion: LION(leo)
⚫ implication: ∀x LION(x) -> FEROCIOUS(x)
⚫ conclusion: FEROCIOUS(leo)
⚫ In general, if a has property P and all objects
that have property P also have property Q, we
conclude that a has property Q.
⚫ P(a)
⚫ ∀x P(x) -> Q(x)
⚫ Q(a)
⚫ Substitutions are an essential part of the
inference process.
⚫ When properly applied, they permit
simplifications or the reduction of
expressions through the cancellation of
complementary literals.
⚫ We say that two literals are complementary
if they are identical but of opposite sign;
that is, P and ~P are complementary.
⚫ A substitution is defined as a set of pairs ti/vi,
where the vi are distinct variables and the ti
are terms not containing the vi. The ti
replace, or are substituted for, the
corresponding vi in any expression to
which the substitution is applied. A set of
substitutions is written {t1/v1, t2/v2, ..., tn/vn}.
⚫ For example, if β = {a/x, g(b)/y}, then
applying β to the clause C = P(x,y) V
Q(x,f(y)) we obtain C' = Cβ = P(a,g(b)) V
Q(a,f(g(b))).
Unification
⚫ Any substitution that makes two or more
expressions equal is called a unifier for the
expressions.
⚫ Applying a substitution to an expression E
produces an instance E' of E, where E' = Eβ.
Given two expressions C1 and C2, a unifier β
satisfies C1β = C2β. A unifier β is a most
general unifier (mgu) if any other unifier of
the expressions can be obtained by composing
β with some further substitution.
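As a sketch of these definitions (the term encoding, `?`-variable convention, and function names are ours; the occurs check is omitted for brevity):

```python
# Substitution and unification over terms: variables are strings
# starting with '?', compound terms are tuples (functor, arg1, ...).
def is_var(t):
    return isinstance(t, str) and t.startswith('?')

def subst(beta, t):
    """Apply substitution beta (a dict var -> term) to term t."""
    if is_var(t):
        return subst(beta, beta[t]) if t in beta else t
    if isinstance(t, tuple):
        return tuple([t[0]] + [subst(beta, a) for a in t[1:]])
    return t

def unify(x, y, beta=None):
    """Return a most general unifier of x and y, or None if none exists.
    Note: no occurs check, so e.g. unifying ?x with f(?x) is not caught."""
    if beta is None:
        beta = {}
    x, y = subst(beta, x), subst(beta, y)
    if x == y:
        return beta
    if is_var(x):
        return {**beta, x: y}
    if is_var(y):
        return {**beta, y: x}
    if (isinstance(x, tuple) and isinstance(y, tuple)
            and x[0] == y[0] and len(x) == len(y)):
        for a, b in zip(x[1:], y[1:]):       # unify arguments pairwise
            beta = unify(a, b, beta)
            if beta is None:
                return None
        return beta
    return None

# Unify P(x, f(a)) with P(b, f(y)):
print(unify(('P', '?x', ('f', 'a')), ('P', 'b', ('f', '?y'))))
# -> {'?x': 'b', '?y': 'a'}
```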
THE RESOLUTION PRINCIPLE
⚫ We are now ready to consider the resolution
principle, a syntactic inference procedure
which, when applied to a set of clauses,
determines if the set is unsatisfiable.
⚫ This procedure is similar to the process of
obtaining a proof by contradiction. For
example, suppose we have the set of clauses
(axioms) C1, C2, C3, ..., Cn and we wish to
deduce or prove the clause D, that is, to show
that D is a logical consequence of C1 & C2 & ...
& Cn. First, we negate D and add ~D to the
set of clauses C1, C2, ..., Cn.
⚫ Then, using resolution together with factoring,
we can show that the set is unsatisfiable by
deducing a contradiction.
⚫ Resolution is very simple. Given two clauses
C1 and C2 with no variables in common, if
there is a literal L1 in C1 which is a
complement of a literal L2 in C2, both L1
and L2 are deleted and a disjunction C is
formed from the remaining reduced clauses.
⚫ The new clause C is called the resolvent of
C1 and C2.
⚫ Resolution is the process of generating
these resolvents from a set of clauses. For
example, resolving the two clauses
⚫ (~P V Q) and (~Q V R)
⚫ yields the resolvent
⚫ ~P V R
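The refutation procedure just described can be sketched for the propositional case (the clause encoding and names are ours): clauses are sets of literal strings, negation is a leading `~`, and deriving the empty clause shows the set is unsatisfiable.

```python
from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith('~') else '~' + lit

def resolvents(c1, c2):
    """All resolvents of two clauses (frozensets of literals)."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return out

def unsatisfiable(clauses):
    """Saturate under resolution; True iff the empty clause appears."""
    clauses = set(map(frozenset, clauses))
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolvents(c1, c2):
                if not r:                    # empty clause: contradiction
                    return True
                new.add(frozenset(r))
        if new <= clauses:                   # saturated, nothing new
            return False
        clauses |= new

# Prove R from {~P V Q, ~Q V R, P} by refuting the set plus ~R:
print(unsatisfiable([{'~P', 'Q'}, {'~Q', 'R'}, {'P'}, {'~R'}]))   # True
```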
⚫ Several types of resolution are possible
⚫ Binary Resolution:
⚫ Two clauses having complementary literals are
combined as disjuncts to produce a single clause
after deleting the complementary literals.
⚫ For example, the binary resolvent of
⚫ ~P(x,a) VQ(x) and ~Q(b) V R(x)
⚫ is just
⚫ ~P(b,a) V R(b).
⚫ The substitution {b/x} was made in the two
parent clauses to produce the complementary
literals Q(b) and ~Q(b) which were then deleted
from the disjunction of the two parent clauses.
Unit resulting resolution.
⚫ A number of clauses are resolved simultaneously
to produce a unit clause. All except one of the
clauses are unit clauses, and that one clause has
exactly one more literal than the total number of
unit clauses. For example, resolving the set
⚫ {~MARRIED(x,y) V ~MOTHER(x,z) V
FATHER(y,z), MARRIED(sue,joe),
~FATHER(joe,bill)}, where the substitution β =
{sue/x, joe/y, bill/z} is used, results in the unit
clause ~MOTHER(sue,bill).
Linear resolution
⚫ When each resolved clause Ci is a parent
to the clause Ci+1 (i = 1, 2, ..., n-1), the
process is called linear resolution.
⚫ For example, given a set S of clauses with
C0 ∈ S, Cn is derived by a sequence of
resolutions: C0 with some clause B0 to get
C1, then C1 with some clause B1 to get
C2, and so on until Cn has been derived.
Linear Input resolution
⚫ If one of the parents in linear resolution is
always from the original set of clauses (the
Bi), we have linear input resolution. For
example, given the set of clauses S = {P V
Q, ~P V Q, P V ~Q, ~P V ~Q}, let C0 = (P
V Q). Choosing B0 = ~P V Q from the set
S and resolving this with C0, we obtain the
resolvent Q = C1.
⚫ B1 must now be chosen from S, and the
resolvent of C1 and B1 becomes C2, and
so on.
⚫ When attempting a proof by resolution, one
ideally would like a minimally unsatisfiable set
of clauses .
⚫ A minimally unsatisfiable set is one which is
satisfiable when any member of the set is
omitted. The reason for this choice is that
irrelevant clauses which are not needed in the
proof, but which participate in it, result in
unnecessary resolutions. They contribute
nothing toward the proof.
⚫ Indeed, they can sidetrack the search direction
resulting in a dead end and loss of resources.
Of course, the set must be unsatisfiable
otherwise a proof is impossible.
⚫ A minimally unsatisfiable set is ideal in the
sense that all clauses are essential and no
others are needed.
⚫ Thus, if we wish to prove B, we would like to
do so with a set of clauses S = {A1,
A2, ..., Ak} which becomes minimally
unsatisfiable with the addition of ~B.
⚫ Choosing the order in which clauses are
resolved is known as a search strategy.
⚫ This strategy separates a set which is
unsatisfiable into subsets, one of which is
satisfiable.
Set-of-support strategy
⚫ Let S be an unsatisfiable set of clauses and T
be a subset of S. Then T is a set-of-support
for S if S - T is satisfiable. A set-of-support
resolution is a resolution of two clauses not
both from S - T.
Quantifier: ∀, ∃
Predicates: Brother, Father, >
Connectives: ∧, ∨, ¬, ⇒, ⇔
NONDEDUCTIVE INFERENCE
METHODS
INTRODUCTION…
In this section we consider three nondeductive forms of
inferencing.
These are not valid forms of inferencing, but they are
nevertheless very important.
We use all three methods often in everyday activities
where we draw conclusions and make decisions.
The three methods we consider here are abduction,
induction, and analogical inference.
ABDUCTIVE INFERENCE
Abductive inference is based on the use of known causal
knowledge to explain or justify a (possibly invalid) conclusion.
Given the truth of proposition Q and the implication P -> Q,
conclude P.
You have a cough, a fever of 101 degrees Fahrenheit, a runny
nose, chills, an aching body, nausea and diarrhea. You have had
these symptoms for five days. Given this information, your best
guess is that you have influenza, or the flu. But you are not
completely certain. This is an example of abductive reasoning.
We may represent abductive inference with the following,
where the c over the implication arrow indicates a
possible causal relationship.
assertion: Q
implication: P -c-> Q
conclusion: P
Abductive inference is useful when known causal relations
are likely and deductive inferencing is not possible for lack of
facts.
INDUCTIVE INFERENCE
Inductive inference draws a general conclusion from a
limited number of observed instances: having observed
that property P holds for instances a1, a2, ..., an, we
conjecture that ∀x P(x). Unlike deduction, the conclusion
is not guaranteed and may be retracted in light of a
counterexample.
ANALOGICAL INFERENCE
Analogical reasoning is a kind of reasoning that is based
on finding a common relational system between two
situations.
When such a common system can be found, then what is
known about one situation can be used to infer new
information about the other.
The basic intuition behind analogical reasoning is that
when there are substantial similarities between situations,
there are likely to be further similarities.
PROPERTIES OF WFFS
⚫ As in the case of PL, the evaluation of complex formulas in
FOPL can often be facilitated through the substitution of
equivalent formulas. In the table, F, G, and H denote wffs not
containing variables, and F(x) denotes the wff F which
contains the variable x.
⚫ A wff is said to be valid if it is true under every
interpretation.
⚫ A wff that is false under every interpretation is said to be
inconsistent (Or unsatisfiable).
⚫ A wff that is not valid (one that is false for some
interpretation) is invalid.
⚫ we say that a wff Q is a logical consequence of the wffs P1
, P2 ,..., Pn if and only if whenever P1 & P2 & …. & Pn, is
true under an interpretation, Q is also true.
⚫ To illustrate some of these concepts, consider the following
examples:
⚫ a. P & ~P is inconsistent and P V~ P is valid since the first is
false under every interpretation and the second is true under
every interpretation.
⚫ b. From the two wffs CLEVER(bill) and
∀x CLEVER(x) -> SUCCEED(x) we can show that
SUCCEED(bill) is a logical consequence: assume that
both wffs are true under some interpretation; then
CLEVER(bill) -> SUCCEED(bill) is true in particular, and
since CLEVER(bill) is true, SUCCEED(bill) must also be true.
⚫ Suppose the wff F[ x] contains the variable x.
⚫ We say x is bound if it follows or is within the scope of a
quantifier naming the variable.
⚫ If a variable is not bound, it is said to be free. For example,
in the expression ∀x (P(x) -> Q(x,y)), x is bound, but y is
free, since every occurrence of x follows the quantifier and y
is not within the scope of any quantifier.
⚫ Clearly, an expression can be evaluated only when all the
variables in that expression are bound.
⚫ Therefore, we shall require that all wffs contain only bound
variables.
⚫ Given wffs F1, F2, ..., Fn, each possibly consisting of the
disjunction of literals only, we say F1 & F2 & ... & Fn is in
conjunctive normal form (CNF).
⚫ Likewise, given wffs F1, F2, ..., Fn, each consisting of the
conjunction of literals only, F1 V F2 V ... V Fn is in
disjunctive normal form (DNF).
Nonmonotonic
Reasoning
• The logics we studied in the previous classes are known as
monotonic logics.
• The conclusions derived using such logics are valid deductions,
and they remain so.
• Adding new axioms increases the amount of knowledge
contained in the knowledge base.
• Therefore, the set of facts and inferences in such systems can
only grow larger; they can not be reduced; that is, they increase
monotonically.
• In this and the following sections, we shall discuss methods with
which to accurately represent and deal with different forms of
inconsistency, uncertainty, possibility, and beliefs.
• In other words, we shall be interested in representations and
inference methods related to what is known as commonsense
reasoning.
• When building knowledge-based systems, it is not reasonable to
expect that all the knowledge needed for a set of tasks could be
acquired, validated, and loaded into the system at the outset.
• More typically, the initial knowledge will be incomplete and will
contain redundancies, inconsistencies, and other sources of uncertainty.
• Even if it were possible to assemble complete, valid knowledge
initially, it probably would not remain valid forever, not in a
continually changing environment.
• In an attempt to model real-world, commonsense reasoning,
researchers have proposed extensions and alternatives to
traditional logics such as PL and FOPL. The extensions
accommodate different forms of uncertainty and nonmonotonicity.
TRUTH MAINTENANCE
SYSTEMS
• Truth maintenance systems (also known as belief revision and
revision maintenance systems) are companion components to
inference systems.
• The main job of the TMS is to maintain consistency of the
knowledge being used by the problem solver.
• As such, it frees the problem solver from any concerns of
consistency and allows it to concentrate on the problem solution
aspects.
• The TMS also gives the inference component the latitude to
perform nonmonotonic inferences.
• When new discoveries are made, this more recent information can
displace previous conclusions that are no longer valid.
• In this way, the set of beliefs available to the problem solver will
continue to be current and consistent.
• The inference engine (IE) solves domain problems based on its
current belief set, while the TMS maintains the currently active
belief set.
• The updating process is incremental. After each inference,
information is exchanged between the two components.
• The IE tells the TMS what deductions it has made.
• The TMS, in turn, asks questions about current beliefs and reasons
for failures.
• It maintains a consistent set of beliefs for the IE to work with even
if new knowledge is added and removed.
• Actually, the TMS does not discard conclusions like Q as suggested.
• That could be wasteful, since P may again become valid; which
would require that Q and facts justified by Q be rederived.
• Instead, the TMS maintains dependency records for all such
conclusions. These records determine which set of beliefs are
current (which are to be used by the IE).
• Thus, Q would be removed from the current belief set by making
appropriate updates to the records and not by erasing Q.
• Since Q would not be lost, its rederivation would not be necessary
if P became valid once again.
• The TMS maintains complete records of reasons or justifications
for beliefs. Each proposition or statement having at least one valid
justification is made a part of the current belief set.
• Statements lacking acceptable justifications are excluded from this
set. When a contradiction is discovered, the statements
responsible for the contradiction are identified and an appropriate
one is retracted. This in turn may result in other retractions and
additions. The procedure used to perform this process is called
dependency-directed backtracking.
• The TMS maintains records to reflect retractions and additions so
that the IE will always know its current belief set.
• The records are maintained in the form of a dependency network.
• The nodes in the network represent KB entries such as premises,
conclusions, inference rules, and the like.
• Attached to the nodes are justifications which represent the
inference steps from which the node was derived.
• Nodes in the belief set must have valid justifications.
• A premise is a fundamental belief which is assumed to be always
true. Premises need no justifications.
• They form a base from which all other currently active nodes can
be explained in terms of valid justifications.
• There are two types of justification records maintained for nodes:
support lists (SL) and conditional proofs (CP). SLs are the
most common type. They provide the supporting justifications for
nodes. The data structure used for the SL contains two lists of
other dependent node names, an in-list and an out-list. It has the
form
• (SL <in-list> <out-list>)
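As a sketch of how an SL justification determines whether a node is believed (the class and field names are ours, loosely following Doyle-style TMS descriptions):

```python
# A node is believed (IN) under an SL justification when every node
# on its in-list is currently believed and every node on its
# out-list is currently not believed.
class Node:
    def __init__(self, name, in_list=(), out_list=()):
        self.name = name
        self.in_list = list(in_list)      # nodes that must be believed
        self.out_list = list(out_list)    # nodes that must NOT be believed

    def is_in(self, beliefs):
        """Is the SL justification valid w.r.t. the current belief set?"""
        return (all(n in beliefs for n in self.in_list)
                and all(n not in beliefs for n in self.out_list))

# A default-style justification: believe Q while P is in and ~Q is out.
q = Node("Q", in_list=["P"], out_list=["~Q"])
print(q.is_in({"P"}))          # True: P believed, ~Q not believed
print(q.is_in({"P", "~Q"}))    # False: an out-list node appeared
```

Retraction then amounts to updating the belief set and re-evaluating justifications, rather than erasing Q itself, which matches the dependency-record behavior described above.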
DEFAULT REASONING AND
THE CLOSED WORLD
ASSUMPTION
Introduction…….
▣ Another form of uncertainty occurs as a result of
incomplete knowledge.
▣ One way humans deal with this problem is by making
plausible default assumptions; that is, we make
assumptions which typically hold but may have to be
retracted if new information is obtained to the contrary.
Default Reasoning
▣ Default reasoning is another form of nonmonotonic
reasoning; it eliminates the need to explicitly store
all facts regarding a situation.
▣ Reiter (1980) develops a theory of default
reasoning within the context of traditional logics. A
default is expressed as
▣ a(x) : M b1(x), ..., M bn(x) / c(x)
▣ where a(x) is a precondition wff for the conclusion
wff c(x), M is a consistency operator, and the bi(x) are
conditions, each of which must be separately
consistent with the KB for the conclusion c(x) to
hold.
▣ As an example, suppose we wish to make the
statement, "If x is an adult and it is consistent to
assume that x can drive, then infer that x can drive."
Using the above formula this would be represented as
▣ ADULT(x) : M DRIVES(x) / DRIVES(x)
Closed World Assumption
▣ Another form of assumption, made with regard to
incomplete knowledge, is more global in nature than
single defaults.
▣ This type of assumption is useful in applications where
most of the facts are known, and it is, therefore,
reasonable to assume that if a proposition cannot be
proven, it is false. This is known as the closed world
assumption (CWA)
▣ Example: Airline KB
▣ By augmenting a KB with an assumption which states
that if the ground atom P(a) cannot be proved, assume
its negation ~P(a), the CWA completes the theory with
respect to KB.
▣ (Recall that a formal system is complete if and
only if every ground atom or its negation is in the
system.)
▣ For example, a KB containing only the clauses
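The slide's example clauses are not preserved here. As an illustration only, with invented flight facts, the CWA amounts to negation as failure:

```python
# A sketch of the closed world assumption over a hypothetical airline
# KB. The flight facts below are invented for illustration; under the
# CWA, any ground atom not provable from the KB is taken to be false.
kb = {("FLIGHT", "delhi", "mumbai"),
      ("FLIGHT", "mumbai", "chennai")}

def holds(atom):
    """CWA query: provable means true, not provable means false."""
    return atom in kb

print(holds(("FLIGHT", "delhi", "mumbai")))    # True: fact is in the KB
print(holds(("FLIGHT", "delhi", "chennai")))   # False: ~FLIGHT(...) assumed
```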

ASSOCIATIVE NETWORKS
Network representations provide a means of structuring and
exhibiting the structure in knowledge.
In a network, pieces of knowledge are clustered together into
coherent semantic groups.
Network representations give a pictorial presentation of objects,
their attributes and the relationships that exist between them and
other entities.
Associative networks are directed graphs with labeled nodes and
arcs or arrows.
The language used in constructing a network is based on selected
domain primitives for objects and relations as well as some
general primitives
A FRAGMENT OF A SIMPLE NETWORK
SYNTAX AND SEMANTICS OF ASSOCIATIVE
NETWORKS
Unlike FOPL, there is no generally accepted syntax or
semantics for associative networks.
Such rules tend to be designer dependent and vary greatly from
one implementation to another.
Most network systems are based on PL or FOPL with
extensions, however.
The syntax for any given system is determined by the object
and relation primitives chosen and by any special rules used to
connect nodes.
Basically, the language of associative networks is formed from
letters of the alphabet, both upper- and lowercase; relational
symbols; set membership and subset symbols; decimal digits;
square and oval nodes; and directed arcs of arbitrary length.
The word symbols used are those which represent object
constants and relation constants.
Nodes are commonly used for objects or nouns, and arcs (or arc
nodes) for relations.
The direction of an arc is usually taken from the first to
subsequent arguments as they appear in a relational statement.
Thus, OWNS(bob,house) would be written as a node bob linked
to a node house by a directed arc labeled OWNS.
A number of arc relations have become common among users.
They include such predicates as ISA, MEMBER-OF,
SUBSET-OF, AKO (a-kind-of), HAS-PARTS, INSTANCE-OF,
AGENT, ATTRIBUTES, SHAPED-LIKE, and so forth.
Less common arcs have also been used to express modality
relations (time, manner, mood), linguistic case relations (theme,
source, goal), logical connectives (or, not, and, implies),
quantifiers (all, some), set relations (superset, subset, member),
attributes, and quantification (ordinal, count).
GENERIC-GENERIC RELATIONSHIPS

Subset-Superset (fighting ships-battleships)


Generalization-Specialization (restaurant-fast-foods)
AKO (an elephant is a kind of mammal)
Conceptual containment (a triangle is a polygon)
Sets and their type (an elephant and a set of elephants)
Role value restrictions (an elephant trunk is a cylinder 1.3
meters in length)
GENERIC-INDIVIDUAL RELATIONSHIPS

Set membership (Clyde is a camel)


Predication (predicate application to an individual, as in
BROWN(camel))
Conceptual containment (king and the King of England)
Abstraction (the ''eagle" in "the eagle is an endangered
species'')
Associative networks are good at representing:
⚫ First, it should be apparent that networks clearly show an entity's
attributes and its relationships to other entities. This makes it easy to
retrieve the properties an entity shares with other entities. For this, it is
only necessary to check direct links tied to that entity.
⚫ Second, networks can be constructed to exhibit any hierarchical or
taxonomic structure inherent in a group of entities or concepts.
Associative network structures permit the implementation of
property inheritance, a form of inference. Nodes which are
members or subsets of other nodes may inherit properties from
higher-level ancestor nodes.

For example, if mouse is a kind of mammal, and mammals have hair
and drink milk, it is possible to infer that a mouse has hair and drinks milk.
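Inheritance over such links can be sketched as a small upward search; the mouse-rodent-mammal taxonomy below is assumed from the example, and the encoding is ours:

```python
# Property inheritance over AKO links: a node inherits the local
# properties of every ancestor reached by climbing the hierarchy.
ako = {"mouse": "rodent", "rodent": "mammal"}     # child -> parent links
props = {"mammal": {"has-hair", "drinks-milk"}}   # locally attached properties

def inherited_props(node):
    """Collect the properties of node and of all its AKO ancestors."""
    out = set()
    while node is not None:
        out |= props.get(node, set())
        node = ako.get(node)          # climb one AKO link, None at the top
    return out

print(inherited_props("mouse"))       # inherits has-hair and drinks-milk
```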


CONCEPTUAL GRAPHS
A conceptual graph is a graphical portrayal of a mental
perception which consists of basic or primitive concepts and the
relationships that exist between the concepts.
A single conceptual graph is roughly equivalent to a graphical
diagram of a natural language sentence where the words are
depicted as concepts and relationships.
Conceptual graphs may be regarded as formal building blocks
for associative networks which, when linked together in a
coherent way, form a more complex knowledge structure.
An example of such a graph, which represents the sentence "Joe
is eating soup with a spoon," is depicted in the figure.
• Concepts are enclosed in boxes and relations
between the concepts are enclosed in ovals.
• The direction of the arrow corresponds to the order
of the arguments in the relation they connect.
• The last or nth arc (argument) points away from the
circle relation and all other arcs point toward the
relation.
Concept symbols refer to entities, actions, properties, or events
in the world.
A concept may be individual or generic. Individual concepts
have a type field followed by a referent field.
The concept [PERSON: joe] has type PERSON and referent joe.
Referents like joe and food in the last figure are called individual
concepts since they refer to specific entities.
EAT and SPOON have no referent fields since they are generic
concepts which refer to unspecified entities.
Concepts like AGENT, OBJECT, INSTRUMENT, and PART
are obtained from a collection of standard concepts.
Inference can be accomplished by modifying and combining
graphs through the use of operators and basic graph inference
rules.
Four useful graph formation operators are copy , restrict, join,
and simplify. These operators are defined as follows.
Copy. Produces a duplicate copy of a conceptual graph.
Restrict. Modifies a graph by replacing the type label of a
concept with a subtype, or by specializing a generic concept to
an individual one by inserting a referent of the same concept type.
Join. Combines two graphs C1 and C2 that share an identical
concept by attaching all relation arcs from C2 to that concept in
C1 and then erasing C2.
Simplify. Eliminates one of two identical relations in a
conceptual graph when all connecting arcs are also the same.
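As a sketch of one of these operators, restrict can be modeled over (type, referent) concept pairs; the type hierarchy below is an invented example, and the encoding is ours:

```python
# The restrict operator: specialize a concept's type down the
# hierarchy and/or turn a generic concept into an individual one.
subtype_of = {"GIRL": "PERSON", "PERSON": "ENTITY"}   # assumed hierarchy

def is_subtype(t, super_t):
    """True if t equals super_t or lies below it in the hierarchy."""
    while t is not None:
        if t == super_t:
            return True
        t = subtype_of.get(t)
    return False

def restrict(concept, new_type=None, referent=None):
    """Restrict a (type, referent) concept; referent None means generic."""
    ctype, cref = concept
    if new_type is not None:
        if not is_subtype(new_type, ctype):
            raise ValueError("restrict only moves down the type hierarchy")
        ctype = new_type
    if referent is not None:
        cref = referent               # generic -> individual specialization
    return (ctype, cref)

# [PERSON] restricted to the individual [GIRL: sue]:
print(restrict(("PERSON", None), new_type="GIRL", referent="sue"))
# -> ('GIRL', 'sue')
```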
