Semantics and logic
1- Logic and language
2- Propositional logic
3- Logical words
4- The logic of BUT
5- Inference from AND
6- The logic of negators
7- Intension and extension
8- Truth-conditional semantics
9- Truth conditions and linguistics
10- Entailment
11- Presupposition
Sources: Palmer, Löbner, Saeed, Mohammed Alkhuli.
Logic and language
The terms logic and logical are often used simply to mean 'reasonable' or
'sensible'. But there is a stricter sense of the terms to refer to formal logical
systems which have much in common with mathematical systems, and
which deal with the validity of inferences.
A favourite example from traditional logic textbooks is:
All men are mortal.
Socrates is a man.
Therefore Socrates is mortal.
Here the CONCLUSION (the third sentence) follows from the PREMISES
(the first two sentences). The INFERENCE is LOGICALLY valid. Notice,
however, that this would not be true of:
All men are mortal.
Socrates is a mortal.
Therefore Socrates is a man.
A moment's reflection will show that here the conclusion does not follow
from the premises, for Socrates might be the name of my cat. We often reach conclusions along
such lines without actually stating all the premises. For instance, we might
conclude that Maurice is fabulously rich because we know that he is a
pop-star; here the premises are All pop-stars are fabulously rich and
Maurice is a pop-star. We cannot, however, conclude that Maurice is a
pop-star because we know that he is fabulously rich, using the same
premises. That conclusion would require the premise All fabulously rich
people are pop stars, which is untrue. Yet such false conclusions are often
arrived at, especially in areas such as politics. Similarly, we may reason as
follows:
John is either at home or in his office.
John is not at home.
Therefore John is in his office.
If the premises are true, the conclusion follows: it also is true.
To capture the properties of sentences which make valid conclusions such
as those we have been considering, logicians are concerned with the
LOGICAL FORM of such sentences, and this can be shown (as we have
already seen) by the use of a formal language, using specialised symbols
whose status is exactly the same as those of mathematics. Thus, the
example about John being at home can be symbolised ((p ∨ q) & ~p) → q,
which is no more mysterious than symbolising the statement that if you add
two apples to four and remove half you will have three by (4 + 2) ÷ 2 = 3.
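The validity of such a pattern can be checked mechanically: a formula built from the connectives is valid (a tautology) when it comes out true under every assignment of truth values. A minimal sketch in Python (the function names are our own, chosen for illustration):

```python
from itertools import product

def implies(a, b):
    # Material implication: false only when a is true and b is false.
    return (not a) or b

def is_tautology(formula):
    # Check a two-variable formula under all four truth-value assignments.
    return all(formula(p, q) for p, q in product([True, False], repeat=2))

# ((p v q) & ~p) -> q: the pattern behind the argument about John.
disjunctive_syllogism = lambda p, q: implies((p or q) and not p, q)

print(is_tautology(disjunctive_syllogism))  # True
```

An invalid pattern, such as inferring both disjuncts from a disjunction, fails the same test on at least one row of the table.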
We shall see in the following pages how logic can deal with problems and
ambiguities that grammar fails to solve.
For instance, I'm looking for a pencil is ambiguous as shown by the
continuations and when I find it.. . or and when I find one . . . Again,
linguists have found difficulty with the fact that Everybody in this room
speaks two languages seems to have a different meaning from Two
languages are spoken by everybody in this room (Katz & Postal 1964: 72).
A very familiar and very old joke is one that can be played with no one,
nobody, etc., as when Odysseus in Homer's Odyssey told the giant
Polyphemus that his name was 'No one' and Polyphemus' friends would
not come to his aid when he said no one had hurt him.
Typically, the joke appears also in Lewis Carroll: 'Who did you pass on the
road?' ... 'Nobody' ... 'Quite right: this young lady saw him too. So of
course Nobody walks slower than you.' All these can be, and have been,
explained in simple logical terms.
Propositional logic
The term PROPOSITIONAL LOGIC is used here (and widely elsewhere); alternative
names are PROPOSITIONAL CALCULUS and SENTENTIAL CALCULUS.
Propositional logic is the branch of logic which deals with relations between
propositions. A proposition is something which serves as the premise or
conclusion of an argument.
We are here concerned with the relations that hold between sentences,
especially relations involving complex sentences, irrespective of the
internal structure of the sentences themselves.
Thus, in an example, we have the two sentences John is in his office and
John is at home and the information that (at least) one of these is true.
Given that the second is false, we can conclude that the first is true.
This conclusion can be drawn irrespective of the form of the sentences
themselves. Thus from Either John is in his office or whales are fishes, we
can draw the conclusion John is in his office, given that Whales are fishes is false.
We also need symbols for the LOGICAL CONNECTIVES: & ('and'), ∨ ('or'), → ('if ... then'), and ~ ('not').
It becomes apparent from this that we need RULES OF FORMATION to
form these more compound sentences, and that the logical connectives do
not hold only between the simple sentences p, q, etc., but also between the
compound sentences. This is all part of the LOGICAL SYNTAX.
We also need to define the connectives (and this is dealt with under
LOGICAL SEMANTICS). It will be remembered that we were concerned
with the truth or falsity of our simple sentences, and, indeed, with the truth
and falsity of compound sentences in relation to the truth and falsity of the
simple sentences of which they are formed.
It is assumed here that every sentence is either true or false - that it can be
assigned the TRUTH VALUE 'true' or 'false', symbolised t and f.
Given the truth value of the simple sentences, we can deduce the truth
value of any compound sentence provided we know the 'meaning' of the
connectives.
It now becomes possible to set up TRUTH TABLES for each of the
connectives. These indicate what is the truth value of the compound
sentence in relation to the truth value of the simple sentence from which it
is formed. For & ('and'), for instance, the compound sentences will be true
only if the simple sentences are true.
For the logician, however, conjunction involves the truth values shown in
the table. For negation the truth table is simply:
Here we need only one sentence p, together with its negation ~p. This, of
course, says that, if a sentence is not true, it is false, and vice versa. But
again in ordinary language we do not always accept this. If asked whether
it is raining or not, we might well say It's doing neither the one thing nor the
other.
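The logician's definitions of & and ~ can be checked by computing their tables directly; a small Python sketch (the helper names are our own):

```python
from itertools import product

def conjunction(p, q):
    # p & q: true only when both sentences are true.
    return p and q

def negation(p):
    # ~p: true exactly when p is false.
    return not p

# The column for &, over the rows tt, tf, ft, ff:
and_column = [conjunction(p, q) for p, q in product([True, False], repeat=2)]
print(and_column)  # [True, False, False, False]

# The table for ~ needs only one sentence:
not_column = [negation(p) for p in (True, False)]
print(not_column)  # [False, True]
```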
There are also problems with antonyms and complementaries. It will be
remembered, for instance, that John isn't honest would usually mean that
John is dishonest, whereas John isn't clever would not suggest that he is
stupid.
In ordinary language either. . .or. . . usually means that only one of the
sentences is true. Thus we might argue:
John is either at home or in his office.
John is at home.
Therefore he is not in his office.
This involves EXCLUSIVE or, but logical disjunction is concerned with
INCLUSIVE or, which allows not only that either sentence may be true, but
also that both sentences may be true.
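The difference between the two senses of or shows up in exactly one row of the table; a sketch (function names are ours):

```python
def inclusive_or(a, b):
    # Logical disjunction: true if at least one disjunct is true.
    return a or b

def exclusive_or(a, b):
    # Ordinary-language "either...or": true if exactly one disjunct is true.
    return a != b

# The two senses differ only when both disjuncts are true:
print(inclusive_or(True, True))   # True
print(exclusive_or(True, True))   # False
```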
Although implication is related to if.. . then, there is one striking difference.
In ordinary language we normally relate sentences with if. . . then only if
there is some causal (or similar) relationship between them. But this would
not be permissible in propositional logic, because it takes no account of the
nature of the sentences themselves. As with conjunction and disjunction,
we need a connective that will simply relate (ANY) sentences in terms of
their truth values.
The essential point, however, is clear. Implication in the logical sense, like
any of the connectives, does not necessarily correspond exactly to the use
of anything found in natural language. It owes its validity solely to the truth
functions assigned to it. (This is, however, MATERIAL IMPLICATION. For
STRICT IMPLICATION see 8.6.)
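That material implication is defined purely by truth values, with no causal link required, can be made concrete in a short sketch (names are illustrative):

```python
from itertools import product

def material_implication(p, q):
    # p -> q: false only when p is true and q is false.
    return (not p) or q

# Material implication ignores the content of the sentences; it is
# determined entirely by truth values, and equivalent to ~(p & ~q):
for p, q in product([True, False], repeat=2):
    assert material_implication(p, q) == (not (p and not q))

# In particular, a false antecedent makes the implication true, however
# unrelated the two sentences may be:
print(material_implication(False, False))  # True
```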
Logical words
Semantics deals with word meaning and sentence meaning, whereas
logic deals with reasoning principles. Of course, reasoning principles
depend heavily on meaning. Thus semantics and logic are strongly related.
In every language there are words or expressions that cannot be referring
expressions (RE's) or predicating expressions (PE's). Words like London, John,
and Hani can be RE's. Words like student, man, and honest can be PE's, which
can be used to give information about RE's, e.g., Hani is an honest man.
However, words like and, or, but, if, all, some, and not cannot be RE's or
PE's. They are called linking words or logical words.
The logic of BUT
Most languages, if not all, have but as a logical word linking two S's into a
compound sentence (CS), e.g., John has left (A), but Edward has arrived
(B). For this CS, there are four probabilities concerning truth and falsity:
1. If A is true and B is true, then the CS is true.
2. If A is true and B is false, then the CS is false.
3. If A is false and B is true, then the CS is false.
4. If A is false and B is false, then the CS is false.
This shows that for A but B to be true, both A and B must be true.
If either A or B is false, then A but B is false.
This makes the truth probabilities of but identical with those of and.
A and B is true only if both A and B are true. Similarly, A but B is true only if
both A and B are true.
The falsity of either A or B makes A & B false and also makes A but B false.
Table 9-3 summarizes the truth probabilities of but, where T stands for
true and F for false.

A    B    A but B
T    T    T
T    F    F
F    T    F
F    F    F
Inference from AND
Look at these S's:
1. Hani passed the test. (A)
2. Ali passed the test. (B)
3. Hani passed the test and Ali passed the test.
4. Hani and Ali passed the test.
We can use and to combine S1 and S2 into S3. By omitting common
words, we can condense S3 into S4.
If S1 is true and S2 is true, S3 is necessarily true, and so is S4. Thus, S4
requires two true premises.
Thus, we have four rules related to and:
1. If A is true and B is true, A & B is true.
2. If A & B is true, B & A is true.
3. If A & B is true, A is true.
4. If A & B is true, B is true.
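The four rules can be verified over every assignment of truth values; a minimal sketch (the helper name is ours):

```python
from itertools import product

def conj(a, b):
    # A & B
    return a and b

# Check the four rules of "and" under all assignments of truth values:
for a, b in product([True, False], repeat=2):
    if a and b:
        assert conj(a, b)   # 1. If A is true and B is true, A & B is true.
    if conj(a, b):
        assert conj(b, a)   # 2. If A & B is true, B & A is true.
        assert a            # 3. If A & B is true, A is true.
        assert b            # 4. If A & B is true, B is true.

print("all four rules hold")
```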
Truth Probabilities of And
When and is used to combine two S's like A and B, there are four truth
probabilities:
1. Both A and B are true.
2. A is true, and B is false.
3. A is false, and B is true.
4. Both A and B are false.
If both A and B are true, then the compound sentence (CS) is true. If A is
true and B is false, then the CS is false. If A is false and B is true, then the
CS is false. If both A and B are false, the CS is false.
Table 9-1 shows these four probabilities:

A    B    A & B
T    T    T
T    F    F
F    T    F
F    F    F
The logic of negators
All languages have negation, negative sentences, and negators, i.e.,
particles that negate, e.g., not, never, no. Negators are considered logical
words; so are and, or, and but. In logic, negation is symbolized as ~.
Look at these S's:
1. John swam yesterday.
2. ~ (John swam yesterday).
3. John has left, and Ali has arrived.
4. ~ (John has left) and ~ (Ali has arrived).
5. A
6. ~A
7. A & B
8. ~A &~B
If we examine the previous eight sentences, we notice that S2 is the
negation of S1 and S4 is the negation of S3.
Notice that S4 needs two negators because it has two combined
statements. In addition, S6 is the negation of S5, and S8 is the negation of
S7.
Truth Probabilities of Negators
If A is true, the negation of A is false. Moreover, if A is false, its negation is
true.
If A is true, the negation of its negation will be true. If A is false, the
negation of its negation will be false.
If we use symbols, we have these four probabilities:

1. If A is T, then ~A is F.
2. If A is F, then ~A is T.
3. If A is T, then ~~A is T.
4. If A is F, then ~~A is F.
In other words, if we negate a true S, the output will be false. If we negate a
false S, the output will be true. If we negate the negative of a true S, the
output will be true. If we negate the negative of a false S, the output will be
false.
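Both facts about negation, that it flips a truth value and that double negation restores it, can be checked in a couple of lines (the function name is ours):

```python
def neg(a):
    # ~A: reverses the truth value of A.
    return not a

for a in (True, False):
    assert neg(a) != a        # negating a sentence reverses its value
    assert neg(neg(a)) == a   # double negation restores the original

print(neg(True), neg(neg(True)))  # False True
```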
Intension and extension
We may say the extension of an expression is the set of entities which that
expression denotes, while its intension is whatever it is that defines that
set. Thus, the extension of cow is the set of all the cows in the world, but its
intension is the property of being bovine.
Knowing the meaning of an expression, however, cannot be equivalent to
knowing its extension, for this would mean that we could not know the
meaning of cow if we did not know all the cows in the world.
This is why a 'naming' approach to meaning is bound to be unsuccessful.
Failure to make the distinction can lead to paradoxes. It is at the centre of
the problem concerning the morning star and the evening star. How could it
ever have been that people did not know that the morning star was the
evening star? For this would seem to suggest that they did not know that
Venus was Venus.
The point, of course, is that the extension of these two expressions is the
same (Venus), but their intensions are different (though, in fact, the
description is inaccurate since Venus is a planet, not a star).
Without knowing the correct extensions of the expressions, it was perfectly
possible for people not to know that the morning star and the evening star
were the same. Similarly, Carnap (1948: 24) pointed out that featherless
biped and rational animal have the same extensions (human beings), but
clearly different intensions.
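Carnap's point can be modeled in miniature: treat intensions as predicates and extensions as the sets those predicates pick out in a given world. The individuals and properties below are invented purely for illustration:

```python
# A toy world of two invented individuals with invented properties.
world = [
    {"name": "Socrates", "featherless": True,  "biped": True,  "rational": True},
    {"name": "Fido",     "featherless": False, "biped": False, "rational": False},
]

def featherless_biped(x):
    return x["featherless"] and x["biped"]

def rational_animal(x):
    return x["rational"]

def extension(intension, world):
    # The extension of an expression: the set of entities it denotes.
    return {x["name"] for x in world if intension(x)}

# Different intensions (defining properties), same extension in this world:
print(extension(featherless_biped, world))  # {'Socrates'}
print(extension(rational_animal, world))    # {'Socrates'}
```

The two predicates are distinct functions, so their intensions differ even though their extensions coincide here.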
Logicians are not, however, usually concerned with the actual world but
with POSSIBLE WORLDS.
A possible world must not be thought of as some other inhabited planet
existing in some other galaxy or as a creation of a science fiction writer, but
rather as a state of affairs which may be different from the state of affairs,
the world, that we experience (or it may not be different, since ‘our' world is,
in this sense, one possible world).
There are two reasons for doing this.
The first is that the logician, and indeed the linguist, is not primarily
concerned with practical questions such as 'What does this word/sentence
mean?', but with theoretical questions about what it is for a word or
sentence to have a meaning. It is not directly relevant, therefore, what the
world is actually like.
More importantly, perhaps, we often talk about different ‘worlds', in the
sense that we envisage that things could be different from what they are.
This is clear enough in conditional sentences and wishes such as If I had
lived in Egypt, I would have spoken Arabic and I wish I spoke Arabic.
The world of belief, too, is different from the actual world.
Possible worlds are also involved in epistemic and deontic modality (6.8),
and, indeed, logicians have used the term MODAL LOGIC to refer to
analysis in terms of possible worlds, though clearly this is to use the term
modal in a much wider sense.
There are some ambiguities and problems that can be dealt with in terms of
extensions and intensions and of the notion of possible worlds (the
literature on the subject is very large, but is not altogether clear or
consistent).
We may note, to begin with, the distinction made by Donnellan (1966
[1971: 102ff.]) between the REFERENTIAL and the ATTRIBUTIVE uses of
referential expressions.
He points out that Smith's murderer is insane is ambiguous, since it may
mean either that a certain person, e.g. Jones, who is known to have
murdered Smith, is insane, or that the person who murdered Smith,
whoever he may be (and it may not be known who he is), is insane.
In the first case (the referential use), the expression Smith's murderer is
being used to identify someone and is thus concerned with extension in the
real world;
in the second case (the attributive use), we are more concerned with the
description itself, with the intension of the expression and with its extension
only in possible worlds.
Secondly, we may notice the ambiguity in the sentence Mary believes that
the President is handsome. Although one man, e.g. Mr Smith, may, in fact,
be President, Mary may believe that someone else, e.g. Mr Brown, is the
President; the sentence may thus be taken to mean either that she believes
that Mr Smith is handsome or that she believes that Mr Brown is
handsome.
This kind of ambiguity is usually handled in terms of a de re and a de dicto
interpretation ('about the thing' and 'about what is said').
Closely associated with this is what have been called OPAQUE contexts
(cf. Quine 1960: 141ff.). Consider a situation in which Professor Green is
the Dean. Then it will not necessarily follow that, if John believes that
Professor Green is a genius is true, John believes that the Dean is a genius
is also true, even though The Dean is tall will be true if Professor Green is
tall is true. For obviously, if John does not know that Professor Green is the
Dean, he may believe that Professor Green is a genius, without necessarily
believing that the Dean is a genius.
An opaque context is defined as one in which truth is not preserved when
certain types of co-referential expressions are substituted for one another.
Thus, since Professor Green and the Dean are the same person, Professor
Green and the Dean are co-referential expressions, but they cannot be
substituted in the first pair of sentences with truth preservation.
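The contrast between transparent and opaque contexts can be sketched with a toy model mirroring the Professor Green example; the referents, beliefs, and facts below are invented for the sketch:

```python
# Two co-referential terms mapped to the same individual.
referent = {"Professor Green": "g", "the Dean": "g"}
tall_individuals = {"g"}
johns_beliefs = {"Professor Green is a genius"}  # sentences, not referents

def is_tall(term):
    # A transparent context: truth depends only on the term's referent.
    return referent[term] in tall_individuals

def john_believes(sentence):
    # An opaque context: truth depends on the sentence itself.
    return sentence in johns_beliefs

# Substituting co-referential terms preserves truth in the transparent
# context but not in the opaque one:
print(is_tall("Professor Green") == is_tall("the Dean"))          # True
print(john_believes("Professor Green is a genius")
      == john_believes("the Dean is a genius"))                   # False
```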
Truth conditional semantics
Truth-conditional semantics is an approach to the semantics of natural
language that sees meaning (or at least the meaning of assertions) as being
the same as, or reducible to, truth conditions.
This approach to semantics is principally associated with Donald
Davidson, and attempts to carry out for the semantics of natural language
what Tarski's semantic theory of truth achieves for the semantics of logic.
Truth-conditional theories of semantics attempt to define the meaning of a
given proposition by explaining when the sentence is true. So, for example,
because 'snow is white' is true if and only if snow is white, the meaning of
'snow is white' is that snow is white.
The first truth-conditional semantics was developed by Donald Davidson in
Truth and Meaning (1967). It applied Tarski's semantic theory of truth to a
problem it was not intended to solve, that of giving the meaning of a
sentence.
Scott Soames has harshly criticized truth-conditional semantics on the
grounds that it is either wrong or uselessly circular. Under its traditional
formulation, truth-conditional semantics gives every necessary truth
precisely the same meaning, for all of them are true under precisely the
same conditions (namely, all of them). And since the truth conditions of any
sentence are equivalent to the conjunction of those truth conditions and any
necessary truth, any sentence means the same as its meaning plus a
necessary truth. For example, if 'snow is white' is true iff snow is white,
then it is trivially the case that 'snow is white' is true iff snow is white
and 2 + 2 = 4; therefore under truth-conditional semantics 'snow is white'
means both that snow is white and that 2 + 2 = 4.
Truth-conditional semantics is based on the notion that the core meaning of
any sentence (any statement) is its truth conditions. Any speaker of the
language knows these conditions. If a given sentence is true or false, how
can other sentences, expressing partly the same and partly different
conditions, be judged by it? Does the truth of one sentence make another
sentence true, does it falsify the other sentence, or is there no truth
relation between them? Matters of truth and logic are of more importance in
truth-conditional semantics than the meanings of lexemes per se.
A fundamental fact about declarative sentences is that they are either true
or false (and since we use language to communicate information about the
world, a listener will in general assume that a sentence they have just
heard is true, and uses that fact to enrich their knowledge of the world).
Thus (1) is true and (2) is false:
(1) Barack Obama moved into the White House on Jan. 20, 2009.
(2) John McCain moved into the White House on Jan. 20, 2009.
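The idea of evaluating a declarative sentence against the world can be sketched as checking a fact against a model; the fact representation below is invented for illustration:

```python
# A toy model: the set of facts taken to hold in the world.
facts = {("moved_into_white_house", "Barack Obama", "2009-01-20")}

def true_in(model, predicate, *args):
    # A declarative sentence is true iff the fact it describes is in the model.
    return (predicate, *args) in model

# Sentence (1) is true in this model; sentence (2) is false:
print(true_in(facts, "moved_into_white_house", "Barack Obama", "2009-01-20"))  # True
print(true_in(facts, "moved_into_white_house", "John McCain", "2009-01-20"))   # False
```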
The study of truth or truth conditions in semantics falls into two basic
categories:
the study of different types of truth embodied in individual sentences
(analytic, contradictory, and synthetic) and the study of different types of
truth relations that hold between sentences (entailment and
presupposition).
Truth conditions and linguistics
Truth conditional semantics has been proposed as a way of dealing with
some of the linguistic issues.
It will be recalled that a truth-conditional account of presupposition has
been suggested (7.4, 8.2). It may also be possible to deal with implicatures
(7.5) in the same way.
Smith & Wilson (1979: 149–50, 172–3) consider the situation in which it is
known that either Barbara Cartland or Patrick White is going to win the
Literature Prize;
It won't be Patrick White will then convey the information that Barbara
Cartland will win it.
Now it is obvious that, if we think of the shared knowledge as the
proposition Either Barbara Cartland or Patrick White will win the prize and
the utterance as Patrick White will not win the prize we can draw the logical
conclusion Barbara Cartland will win the prize.
Not all implicatures will be as simple as this. For instance, the reply to
Where's my box of chocolates? may be I was feeling hungry, from which it
can be inferred that the chocolates have all gone; this involves shared
knowledge about what people do when they are hungry, why questions are not
always answered directly, etc.
In principle both the meaning of the reply and all the shared knowledge
could be stated in propositional terms and, if so, the conclusion could be
drawn in a logical way.
This would seem to be a reasonable interpretation of the suggestion that
we need only the maxim of relation to account for implicatures and that
relevance may be defined in terms of ‘A remark P is relevant to another
remark Q if P and Q, together with background knowledge, yield new
information not derivable from either P or Q, together with background
knowledge, alone' (Smith & Wilson 1979: 177). 'Remarks', 'background
knowledge' and 'new information' can all, in theory, be treated as
propositions or sets of propositions.
There are, however, difficulties with such an analysis, involving the status of
the 'background knowledge'.
For it is by no means certain that this can be stated in propositional terms,
and, if it cannot, it is difficult to see how new information can be derived.
Thus to say that I know that it is raining is not the same thing as to say that
the proposition It is raining is part of my knowledge. For my dog may know
that it is raining, but it seems unreasonable to suggest that he entertains
propositions. Secondly, this knowledge is private and not directly accessible
to the observer, and there is therefore a grave danger that if the
investigator is not himself the speaker he may be involved in circular
reasoning.
Entailment
ENTAILMENT is a truth relation between sentences: one sentence entails
another when, if the first is true, the second must also be true, and if the
second is false, the first must be false.
Presupposition
Presupposition is something the speaker assumes or supposes to be
true before making an utterance. It refers to what a speaker or a writer
assumes that the receiver of the message already knows. If somebody
asks you a question like When did you give up smoking?, there are two
presuppositions involved. Firstly, the speaker presupposes that you used to
smoke. Secondly, he presupposes that you no longer smoke.
Presupposition is thus a relationship between two propositions.