Algebraic Logic
4 Applying Algebra
4.1 Making Connections
4.2 Soundness, Completeness, and Compactness
5 Conclusion
Preface
Students of mathematics learn early on the rules of logic by which we can
write rigorous proofs. It is quite arguable that the axioms and rules of in-
ference of a particular mathematical theory are the most important parts,
as they constitute the foundation of the theory. As mathematicians are
wont to do, we ask ourselves how these concepts generalize – that is, given
an axiomatic system and rules of inference, aside from particular theorems
that can be proven within the system, what can we say about the system
itself? The question is metamathematical, and it is quite an important
exploration. Previous work in the 20th century, such as Gödel’s Incom-
pleteness Theorems, tells us that there are certain limitations on different
axiomatic systems. In particular, those theorems tell us that there is no
complete axiomatic system describing the simple arithmetic of the natural
numbers, which means there will be statements that can neither be proved
nor refuted. However, as we will show in this paper, the propositional
calculus is complete.
The final results of this paper are theorems about the propositional cal-
culus, and while they are important results, they can be proven directly from
the definitions given in Section 1 without much difficulty. It might take an
undergraduate level class all of two weeks to build up enough theory to prove
these results. However, the importance of this paper does not lie in the the-
orems themselves, but in their proofs. After introducing Boolean algebra in
Section 2 and building algebraic machinery in Section 3, the common the-
orems of propositional calculus in Section 4 can be proven by viewing the
propositional calculus algebraically. The connections are very enlightening,
and quite elegant as well.
(A note on methodology: at the beginning of each section, I indicate the
source in which the definitions and theorems may be found. For the most
part, the proofs are my own work, as many of the theorems are stated as
problems in [1]. The few proofs that rely on arguments from my sources are
cited accordingly. The theorems in Section 4, as already stated, are common
in the study of mathematical logic. The algebraic proofs however, are solely
my own. I worked backwards to find the algebraic results needed, and wrote
Section 3 accordingly.)
1 Introduction to Propositional Calculus
The propositional calculus is, very formally, the study of particular strings
of letters drawn from a certain alphabet. To build up such a study, a good
number of definitions are necessary. The definitions used in this first section are
from [2]. For those who are familiar with these concepts, the first section
may be skipped or perhaps skimmed, to become familiar with the notation
used throughout this paper. After Section 1, we assume a basic knowledge
of set theoretic notions.
1.1 Definitions
Let L denote the propositional language, which consists of an alphabet, from
which we can construct sentences. The alphabet has six letters, namely
¬, ⇒, p, +, (, ).
From the primitive connectives we now introduce the abbreviations
(x ∨ y) = ((¬x) ⇒ y)
and
(x ∧ y) = (¬(x ⇒ (¬y))).
These symbols are pronounced “or” and “and” respectively. From here we
also introduce the connective symbol ⇔ pronounced “if and only if” and
defined as
(x ⇔ y) = ((x ⇒ y) ∧ (y ⇒ x)).
Of course these new connective symbols could have been included in the for-
mal alphabet, but many formal definitions, such as those of sentences and
deductions, are much more concise when only needing to deal with two for-
mal connectives. Lastly, as with many areas in mathematics, we will allow
the omission of parentheses in scenarios that lead to no ambiguities. One
example is the outermost parentheses of (x ⇒ y), which are not necessary in
describing the sentence itself. As with typical conventions, we allow prece-
dence of negation over all other operations. Also among binary operations,
∨ and ∧ take precedence over ⇒ and ⇔. For example in the definition of ∧
above, we have
x ∧ y = ¬(x ⇒ ¬y).
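Because ∨, ∧, and ⇔ abbreviate combinations of ¬ and ⇒, their behavior under the truth valuations of Section 1.3 is determined entirely by the two primitive connectives. As an informal aside, the definitions can be checked truth-functionally; the following Python sketch (an illustration only, with helper names of my own choosing) does so over all truth values:

```python
from itertools import product

# The two primitive connectives, read truth-functionally.
def NOT(x): return not x
def IMP(x, y): return (not x) or y  # x => y is false only when x is true and y is false

# The defined abbreviations from Section 1.1.
def OR(x, y): return IMP(NOT(x), y)           # (x v y) = ((~x) => y)
def AND(x, y): return NOT(IMP(x, NOT(y)))     # (x ^ y) = (~(x => (~y)))
def IFF(x, y): return AND(IMP(x, y), IMP(y, x))

# Check that each abbreviation agrees with the familiar truth table.
for x, y in product([False, True], repeat=2):
    assert OR(x, y) == (x or y)
    assert AND(x, y) == (x and y)
    assert IFF(x, y) == (x == y)
```

Nothing is lost, then, by keeping only ¬ and ⇒ in the formal alphabet.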
1.2 Deductions
There is a certain type of sentence of L called a theorem. We will describe
these in full shortly, but first we describe a particular kind of theorem called
an axiom. If x, y, and z are sentences, then each of the sentences
A1 x ∨ x ⇒ x
A2 x ⇒ x ∨ y
A3 x ∨ y ⇒ y ∨ x
A4 (x ⇒ y) ⇒ (z ∨ x ⇒ z ∨ y)
is an axiom. A sentence y is a theorem, written ⊢ y, when there is a finite
sequence of sentences ending in y in which each sentence is either an axiom
or the result of applying modus ponens to some pair of preceding sentences
in the sequence. The finite sequence just described is a deduction of y.
Example 1.2.2. The following is a deduction of the theorem x ⇒ x as
in the previous definition, where the justifications in parentheses are not
actually part of the sequence.
x ⇒ x ∨ x (A2)
x ∨ x ⇒ x (A1)
(x ∨ x ⇒ x) ⇒ ((x ⇒ x ∨ x) ⇒ (x ⇒ x)) (A4)
(x ⇒ x ∨ x) ⇒ (x ⇒ x) (MP 2,3)
x ⇒ x (MP 1,4)
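Informally, checking such a deduction is purely mechanical: each line must be justified as an axiom or by modus ponens on earlier lines. The sketch below (my own illustration, with sentences encoded as nested tuples) verifies the two modus ponens steps of the example:

```python
# Sentences as nested tuples: ('imp', x, y) for x => y, ('not', x) for ~x,
# and a bare string for a propositional letter. OR(a, b) abbreviates (~a) => b.
def OR(a, b): return ('imp', ('not', a), b)

x = 'x'
# The five lines of the deduction of x => x from Example 1.2.2.
l1 = ('imp', x, OR(x, x))                     # justified (A2) in the example
l2 = ('imp', OR(x, x), x)                     # justified (A1)
l3 = ('imp', l2, ('imp', l1, ('imp', x, x)))  # justified (A4)
l4 = ('imp', l1, ('imp', x, x))               # MP 2,3
l5 = ('imp', x, x)                            # MP 1,4

def mp(premise, implication):
    """Return the conclusion of modus ponens, or None if it does not apply."""
    if implication[0] == 'imp' and implication[1] == premise:
        return implication[2]
    return None

assert mp(l2, l3) == l4
assert mp(l1, l4) == l5
```

The checker confirms that lines 4 and 5 really do follow by modus ponens from the cited earlier lines.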
If these conditions are met, then v is a truth valuation.
We can think of these two numbers as truth values, where 0 is false and
1 is true.
Definition 1.3.2. If x is a sentence such that v(x) = 1 for all possible truth
valuations v, then we call x a tautology and write ⊨ x.
Definition 1.3.3. If y is a sentence such that v(y) = 0 for all possible truth
valuations v, then we call y a contradiction.
It is immediate from the definition of truth valuation that if y is a con-
tradiction, then ¬y is a tautology. Hence, in this case we write ⊨ ¬y.
Definition 1.3.4. A set of sentences S of L is satisfiable if there is some
truth valuation v such that v(x) = 1 for all x ∈ S. In this case we say v
satisfies S.
Definition 1.3.5. Let y be a sentence and S be a set of sentences of L. If
for all truth valuations v satisfying S, that is for all v such that v[S] = {1},
it is also the case that v(y) = 1, then we say that y is a semantic consequence
of S and write S ⊨ y.
We have now set the stage completely for propositional logic. Of course,
what we would like is for the semantic and syntactic concepts to coincide;
this is the whole point of defining such a language, to put our intuitive logical
reasoning (semantic) into a consistent, purely mechanical form (syntactic).
We will consider these questions through algebraic means later on, but first
we need to make note of a few properties of the propositional calculus.
There is really nothing deep in formal deductions, and for the purpose
of this paper they are more cumbersome than enlightening. Thus we omit
them here and refer the skeptics to [2].
2 Introduction to Boolean Algebra
The following section will deal with Boolean algebra in a very general setting,
with definitions, examples, and propositions from [1, Ch. 4 §1-2]. It is a
lengthy excursion from the propositional calculus, but we will see many
familiar notions from Section 1 disguised in algebraic notation.
As it turns out, Boolean algebra was first defined with sets as elements, so
the following example has historical significance.
Example 2.1.2. Let Y be a set. We have the power set under inclusion
⟨P(Y), ⊆⟩ as a lattice, where
Proof. B1 is rather obvious when we consider
x ∨ y = sup{x, y} = sup{y, x} = y ∨ x,
and similarly
x ∧ y = inf{x, y} = inf{y, x} = y ∧ x.
For B2 we have
x ≤ x ∨ (y ∨ z)
and
(y ∨ z) ≤ x ∨ (y ∨ z),
from which we also have
y ≤ x ∨ (y ∨ z) and z ≤ x ∨ (y ∨ z).
(x ∨ y) ≤ x ∨ (y ∨ z).
(x ∨ y) ∨ z ≤ x ∨ (y ∨ z).
y ≤ y ∨ (x ∧ y).
We also have
y ≤ y and (x ∧ y) ≤ y,
which together imply that y ∨ (x ∧ y) ≤ y. Hence by antisymmetry we
conclude (x ∧ y) ∨ y = y. For the counterpart of B3, we have immediately
that
(x ∨ y) ∧ y ≤ y,
and of course
y ≤ y and y ≤ (x ∨ y).
The two previous inequalities imply that
y ≤ (x ∨ y) ∧ y,
Definition 2.1.4. A lattice L is distributive if for all x, y, z ∈ L,
x ∧ (y ∨ z) = (x ∧ y) ∨ (x ∧ z),
x ∨ (y ∧ z) = (x ∨ y) ∧ (x ∨ z).
Curiously, each of the two distributive properties above implies the
other [1]. To show this, suppose the first condition holds. Then
(x ∨ y) ∧ (x ∨ z) = [(x ∨ y) ∧ x] ∨ [(x ∨ y) ∧ z]
= [(x ∨ y) ∧ x] ∨ [(x ∧ z) ∨ (y ∧ z)]
= x ∨ [(x ∧ z) ∨ (y ∧ z)]
= [x ∨ (x ∧ z)] ∨ (y ∧ z)
= x ∨ (y ∧ z).
The first three equalities follow from the first distributive property and the
last two follow from Proposition 2.1.3. The converse is proved similarly.
Definition 2.1.5. A lattice L is complemented if there exist both least and
greatest elements in L, denoted 0 and 1 respectively, and if for each x ∈ L
there exists y ∈ L such that
x ∨ y = 1 and x ∧ y = 0.
x ∨ y = 1, x ∨ y′ = 1,
x ∧ y = 0, x ∧ y′ = 0,
and
y = y ∨ 0 = y ∨ (x ∧ y′) = (y ∨ x) ∧ (y ∨ y′)
= 1 ∧ (y ∨ y′)
= y ∨ y′,
so that y′ ≤ y. Identically,
y′ = y′ ∨ 0 = y′ ∨ (x ∧ y) = (y′ ∨ x) ∧ (y′ ∨ y)
= 1 ∧ (y′ ∨ y)
= y′ ∨ y,
Proof. First we must show that ⟨B, ≤⟩ is a lattice; to show that ≤ as defined
above is actually an order relation, first consider reflexivity:
x = x ∧ (x ∨ x∗) by B3
= (x ∧ x) ∨ (x ∧ x∗) by B4
= (x ∧ x) ∨ 0 by B5
= (x ∧ x) ∨ [(x ∧ x) ∧ (x ∧ x)∗] by B5
= [(x ∧ x) ∨ (x ∧ x)] ∧ [(x ∧ x) ∨ (x ∧ x)∗] by B4
= [(x ∧ x) ∧ ((x ∧ x) ∨ (x ∧ x)∗)] ∨ [(x ∧ x) ∧ ((x ∧ x) ∨ (x ∧ x)∗)] by B4
= (x ∧ x) ∨ (x ∧ x) by B3
= [(x ∧ x) ∨ x] ∧ [(x ∧ x) ∨ x] by B4
= x ∧ x by B3
x ∧ z = (x ∧ y) ∧ z = x ∧ (y ∧ z) = x ∧ y = x,
(x ∧ y) ∧ x = x ∧ (x ∧ y) = (x ∧ x) ∧ y = x ∧ y,
so x ∧ y ≤ x. Obviously
(x ∧ y) ∧ y = x ∧ (y ∧ y) = x ∧ y
This means that z ≤ x and z ≤ y, so we have the two equations z = x ∧ z
and z = y ∧ z. We wish to show that z ≤ x ∧ y, which follows easily:
z ∧ (x ∧ y) = (z ∧ x) ∧ y = z ∧ y = z,
or equivalently z ≤ x∧y. Therefore x∧y = inf{x, y}. The proof that x∨y =
sup{x, y} is very similar, so we omit it for the sake of brevity. We have now
shown that ⟨B, ≤⟩ is a lattice. The fact that this lattice is distributive follows
from B4. All that remains is to show that this lattice is complemented. This
requires least and greatest elements, which are predictably going to be the
designated elements 0 and 1. For any z ∈ B, recall that by reflexivity,
z = z ∧ z. Then
0 ∧ z = (z ∧ z∗) ∧ z by B5
= z ∧ (z ∧ z∗) by B1
= (z ∧ z) ∧ z∗ by B2
= z ∧ z∗ by reflexivity
= 0 by B5
shows that 0 ≤ z for every z ∈ B. Similarly,
z ∧ 1 = z ∧ (z ∨ z∗) by B5
= z by B3
shows that z ≤ 1 for all z ∈ B. The unary operator will yield complements:
for any z ∈ B we have the element z ∗ such that z ∨ z ∗ = 1 and z ∧ z ∗ = 0,
directly from B5. Therefore ⟨B, ≤⟩ is a complemented distributive lattice
which is of course a Boolean algebra.
Theorem 2.2.2. The principle of duality holds in all Boolean algebras.
Proof. This proof closely follows the argument in [1]. Suppose a statement
P holds in all Boolean algebras, and let ⟨B, ≤⟩ be any such Boolean algebra.
We will define a new relation ≤′ on B, such that
x ≤′ y iff y ≤ x.
This is evidently another partial ordering on B, where the join (meet) of x
and y with respect to ≤′ is precisely their meet (join) with respect to ≤. We
will show one of these facts to illustrate: to show that the new join x ∨′ y is
precisely the meet x ∧ y, first note that x ∨′ y = x ∧ y is in fact an upper
bound for {x, y} with respect to ≤′. We have x ∧ y ≤ x and x ∧ y ≤ y, so
that x ≤′ x ∧ y = x ∨′ y and y ≤′ x ∧ y = x ∨′ y. Next let z be any upper
bound for {x, y} with respect to ≤′. Then we have x ≤′ z and y ≤′ z, which
of course means that z ≤ x and z ≤ y. So z is a lower bound for {x, y} with
respect to ≤, so we must have that z ≤ x ∧ y. Then x ∨′ y = x ∧ y ≤′ z, so in
fact ∨′ = ∧ is well defined as a join operator with respect to ≤′. A similar
argument shows that ∧′ = ∨ is well defined as a new meet operator with
respect to ≤′. Also since 0 ≤ z ≤ 1 for all z ∈ B, we have 1 ≤′ z ≤′ 0, so
that 1 and 0 are now the least and greatest elements with respect to ≤′, so
we may as well call them 0′ and 1′ respectively. Complementation remains
the same; from B5 we have the least element 0′ = 1 = x ∨ x∗ = x ∧′ x∗, and
the greatest element 1′ = 0 = x ∧ x∗ = x ∨′ x∗.
Since P was assumed to hold in all Boolean algebras, it must hold in
⟨B, ≤′⟩. But the coinciding of ∨′ = ∧, ∧′ = ∨, 0′ = 1, and 1′ = 0 means that
the dual of P must also hold in ⟨B, ≤⟩.
To illustrate the convenience of duality, we will prove the following small
yet useful fact.
Proposition 2.2.3. For any elements x and y of a Boolean algebra B,
(x ∧ y)∗ = x∗ ∨ y ∗ .
Proof. We have
Also
(x ∨ y)∗ = x∗ ∧ y ∗ .
This follows immediately from Proposition 2.2.3 and the principle of du-
ality. For the remainder of this paper we will refer to duality very frequently;
for those who may be unsatisfied with the above proof of this principle, refer
to [2, Ch. 22] for a more detailed discussion.
For instance, for any set X, the power set P(X) is a Boolean algebra with
A ∨ B = A ∪ B, A ∧ B = A ∩ B,
0 = ∅, 1 = X,
A∗ = X \ A.
Distribution of intersection over union and union over intersection are well
known set-theoretic facts. Complements of sets are also readily seen as
consistent with complementation as defined for a Boolean algebra.
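As an informal check, the power-set algebra can also be realized directly on a computer; the sketch below (an illustration only) builds P(X) for a three-element X and exhaustively verifies distributivity along with the De Morgan law of Proposition 2.2.3 and its dual:

```python
from itertools import chain, combinations

X = frozenset({1, 2, 3})
# The power set P(X): all subsets of X, as frozensets.
P = [frozenset(s) for s in
     chain.from_iterable(combinations(X, r) for r in range(len(X) + 1))]

join = lambda a, b: a | b   # A v B = A union B
meet = lambda a, b: a & b   # A ^ B = A intersect B
comp = lambda a: X - a      # A* = X \ A

for A in P:
    for B in P:
        # De Morgan: (A ^ B)* = A* v B*, and dually (A v B)* = A* ^ B*.
        assert comp(meet(A, B)) == join(comp(A), comp(B))
        assert comp(join(A, B)) == meet(comp(A), comp(B))
        # Distributivity of meet over join.
        for C in P:
            assert meet(A, join(B, C)) == join(meet(A, B), meet(A, C))
```

The exhaustive loop runs over all 8 × 8 × 8 triples of subsets, so the checked identities hold throughout this algebra.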
Then ⟨2, ≤⟩ is a Boolean algebra called the minimal algebra. This minimal
algebra will come up quite often in our talk of homomorphisms later on;
in fact, this “trivial” algebra is fundamental in our connection of Boolean
algebra to propositional calculus.
Finally, the most important example:
Example 2.3.3. Let L be the propositional language described in Section 1
and let S be the set of all sentences of L. Define the relation ≡ such that
for x, y ∈ S,
x ≡ y iff ⊢ x ⇔ y.
Then ≡ is easily seen to be an equivalence relation: reflexivity follows from
the tautology x ⇒ x; symmetry follows from T2 by considering x ⇔ y =
(x ⇒ y) ∧ (y ⇒ x); transitivity follows directly from A4.
For each x ∈ S, let |x| denote the ≡-class of x, and let B = {|x| : x ∈ S}
be the set of all ≡-classes. Define the relation ≤ on B by
|x| ≤ |y| iff ⊢ x ⇒ y,
and define join, meet, and complementation as follows:
|x| ∨ |y| = |x ∨ y|, |x| ∧ |y| = |x ∧ y|,
|x|∗ = |¬x|.
Assign the values 1 = |x| for any theorem x and 0 = |y| for any y such
that ¬y is a theorem. Note that above, although the symbols are identical,
the first instances of ∨ and ∧ are newly defined, acting as join and meet on
B, while the second occurrences are the propositional connectives defined
in Section 1. We will show that these operations and assignments are well
defined in the following theorem. As an aside, the fact that these operations
are well defined is similar to saying that ≡ is a congruence relation, and
thus B is a quotient algebra. The only reason this is not exactly the case
is because the set of all sentences isn’t originally endowed with the Boolean
operations and ordering, so to say elements in the equivalence classes are
congruent with respect to those operations is not a properly stated sentence;
however, the equivalence classes themselves now act as the elements of a
Boolean algebra, as stated in the following theorem.
Theorem 2.3.4. The set B with relations and operations defined above is
a Boolean algebra, called the Lindenbaum algebra¹ of L [1].
¹The Lindenbaum algebra, or Lindenbaum–Tarski algebra, is named for logicians Adolf
Lindenbaum and Alfred Tarski. It was first introduced by Tarski in 1935 as a device to
establish a correspondence between classical propositional calculus and Boolean algebras.
The Lindenbaum–Tarski algebra is considered the origin of modern algebraic logic [5].
Proof. We hope the reader is satisfied with the informal argument of ≡
being an equivalence relation; to be sure, a rigorous proof would actually
rely on formal deductions as in Section 1, which as we’ve already mentioned
are usually more tedious than enlightening. With that fact in hand, the
next step in this proof is to show that ≤ is well defined. This means that if
x2 ∈ |x1 | and y2 ∈ |y1 |, then
⊢ x1 ⇒ y1 implies ⊢ x2 ⇒ y2.
and it follows from modus ponens that ` x2 ⇒ y2 . Therefore |x2 | ≤ |y2 | and
we conclude that ≤ is well defined.
Next we must show that the join operation is well defined. As above,
suppose that x1 ≡ x2 and y1 ≡ y2. We must show that x1 ∨ y1 ≡ x2 ∨ y2. Again
this is equivalent to showing that if x1 ⇔ x2 and y1 ⇔ y2 are theorems, then
so is x1 ∨ y1 ⇔ x2 ∨ y2. As a few instances of A4 we have
⊢ (y1 ⇒ y2) ⇒ ((x2 ∨ y1) ⇒ (x2 ∨ y2)),
⊢ (x1 ⇒ x2) ⇒ ((y1 ∨ x1) ⇒ (y1 ∨ x2)),
⊢ ((x2 ∨ y1) ⇒ (x2 ∨ y2)) ⇒ (((x1 ∨ y1) ⇒ (x2 ∨ y1)) ⇒ ((x1 ∨ y1) ⇒ (x2 ∨ y2))).
Multiple applications of modus ponens and the commutativity in T1 will
yield that ⊢ x1 ∨ y1 ⇒ x2 ∨ y2. Switching around the indices in this argument
will also show that ⊢ x2 ∨ y2 ⇒ x1 ∨ y1, so that
⊢ x1 ∨ y1 ⇔ x2 ∨ y2
To show that complementation is well defined, suppose x ≡ y. Then
⊢ x ⇔ y, and it follows from A3 that
x ⇒ y = ¬x ∨ y ⇒ y ∨ ¬x = ¬y ⇒ ¬x,
and
y ⇒ x = ¬y ∨ x ⇒ x ∨ ¬y = ¬x ⇒ ¬y,
x1 ⇒ ¬y1 ≡ x2 ⇒ ¬y2 .
x1 ∧ y1 ≡ x2 ∧ y2 .
⊢ y ⇒ (x ⇒ y),
⊢ x,
⊢ y,
and a few applications of modus ponens will show that ` x ⇔ y, so that
|x| = |y|. As mentioned above, it follows that both the least and greatest
elements are well defined.
Now that we know the definitions given in Example 2.3.3 are valid, we
will use Theorem 2.2.1 to show that these operations form a Boolean algebra
on B. Note that the theorems proven in Section 1.4 coincide directly with
the properties B1-B4 from Theorem 2.2.1. For example the fact that T1 is
a theorem shows that x ∨ y ≡ y ∨ x, so that
|x ∨ y| = |y ∨ x|
and thus
|x| ∨ |y| = |y| ∨ |x|.
In this way, the properties B1-B4 are immediate. We know from Exam-
ple 1.2.2 that ⊢ ¬x ∨ x, and from above we have
so terribly cumbersome, this result really does allow us to use significant
machinery from algebra to simplify the concepts in a formal propositional
calculus.
Another familiar notion is the fact that anything is implied from a false
hypothesis. Formally, we might conjecture that ⊢ (x ∧ ¬x) ⇒ y for any
sentences x and y. Of course, since 0 = |x| ∧ |x|∗ = |x ∧ ¬x| and 0 ≤ |y| for
all |y| ∈ B, we see that the conjecture is true. These facts not only satisfy
the craving of shorter and more elegant proofs, but also they help assure us
that we defined the propositional calculus correctly.
We are almost in a position to further explore the properties of the propo-
sitional calculus. The next section will again be in a general Boolean algebra
setting, and then we will return to the Lindenbaum algebra in Section 4.
3.1 Homomorphisms
The following are familiar definitions in any study of algebra.
Definition 3.1.1. A Boolean homomorphism is a function h : B → B′ for
Boolean algebras B and B′ such that for all x, y ∈ B,
h(x ∨ y) = h(x) ∨ h(y), h(x ∧ y) = h(x) ∧ h(y), h(x∗) = h(x)∗.
For a homomorphism f : B → B′, the kernel of f is the set
ker f = f⁻¹[{0}].
In words, the kernel is the subset of the domain that maps to the least
element in the codomain of f . We also define the hull of f as the set
hull f = f⁻¹[{1}].
Note that these two definitions are dual to one another. That is, if some-
thing is true about the set of elements mapping to the least element in the
codomain of f , then there will be a parallel result for the set of elements
mapping to the greatest element in the codomain of f .
Proof. First note that we must have 0 ∈ ker f , because for any x ∈ B,
f(0) = f(x ∧ x∗) = f(x) ∧ f(x)∗ = 0.
Also we see as a result from the dualism between kernels and hulls that
x ∈ ker f if and only if x∗ ∈ hull f , because f(x) = 0 if and only if
f(x∗) = f(x)∗ = 0∗ = 1.
A subset I of B is an ideal if
(i) 0 ∈ I,
(ii) if x, y ∈ I, then x ∨ y ∈ I,
(iii) if x ∈ I and y ∈ B, then x ∧ y ∈ I.
Dually, a subset F of B is a filter if
(i) 1 ∈ F ,
(ii) if x, y ∈ F , then x ∧ y ∈ F ,
(iii) if x ∈ F and y ∈ B, then x ∨ y ∈ F .
The previous corollary follows directly from Proposition 3.2.3 and the
principle of duality. We note that if the reader is unsettled by using the
principle of duality blindly, one can easily typographically take the dual of
the proof above and see that it is a valid proof for the corollary. Of course in
taking the dual we replace the word “kernel” with “hull,” and “ideal” with
“filter.”
We pause to make a few observations of ideals and filters. If I is an
ideal and F is a filter, note that condition (iii) implies that whenever 1 ∈ I,
it must be the case that I = B, in which case we say I is improper. Also
whenever 0 ∈ F , we have F = B and F is improper.
Also a very important characterization (from the definition of ideals and
filters as given in [1]) is the fact that for ideals, condition (iii) can be replaced
by
(iii) if x ∈ I and y ≤ x, then y ∈ I,
and for filters by
(iii) if x ∈ F and x ≤ y, then y ∈ F .
This is because
y ≤ x iff y = y ∧ x
and
x ≤ y iff y = x ∨ y.
This characterization will be used frequently, as it begets a nice picture: in
the lattice B, ideals are built from the top down while filters are built from
the bottom up. One might say that ideals are closed downward while filters
are closed upward.
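This picture is easy to test in a finite example. The following sketch (an informal illustration in the power-set algebra of a three-element set) constructs the principal filter of an element, that is, everything above it, and confirms the filter conditions together with upward closure:

```python
from itertools import chain, combinations

X = frozenset({1, 2, 3})
# The power-set Boolean algebra P(X), ordered by inclusion.
P = [frozenset(s) for s in
     chain.from_iterable(combinations(X, r) for r in range(len(X) + 1))]

def principal_filter(a):
    """All elements above a; in <P(X), subset> 'above' means superset."""
    return {b for b in P if a <= b}   # a <= b here is the subset relation

F = principal_filter(frozenset({1}))

assert X in F                                        # (i) 1 = X is in F
assert all(x & y in F for x in F for y in F)         # (ii) closed under meets
assert all(y in F for x in F for y in P if x <= y)   # (iii) closed upward
assert frozenset() not in F                          # proper: 0 is not in F
```

Dualizing the comparison in `principal_filter` would build a principal ideal instead, closed downward from its generator.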
One important ingredient before discussing the implications of this deep
connection between Boolean algebra and propositional logic is familiarity
with the concept of a filter generated by a subset of B. Although the alge-
braist is probably more comfortable working with ideals, we will be working
with filters in the Lindenbaum algebra, and the reason will become clear as
we go on. We will mainly be interested in proper filters; note that from the
reasoning above, we can say that F is a proper filter in B if and only if
(i) 1 ∈ F ,
(ii) if x, y ∈ F , then x ∧ y ∈ F ,
(iii) if x ∈ F and x ≤ y, then y ∈ F ,
(iv) 0 ∉ F .
This is a familiar definition, but the construction of this set in the next
theorem will be of great importance. The following definition and theorem
come from [1].
To make the following theorem less cluttered, we will adopt a notation
for finite joins and meets:
⋁_{i=1}^{n} xi = x1 ∨ · · · ∨ xn
and
⋀_{i=1}^{n} xi = x1 ∧ · · · ∧ xn.
x1 ∧ x2 ∈ F,
(x1 ∧ x2 ) ∧ x3 ∈ F,
and inductively we see that x1 ∧ · · · ∧ xn ∈ F . Since F is proper it must be
the case that x1 ∧ · · · ∧ xn ≠ 0, so X does have the finite meet property.
(⇐=) Conversely suppose that X has the finite meet property. If X is
empty then of course X is contained in every filter of B, so suppose X 6= ∅.
Define
and
⋀_{i=1}^{m} x′i ∧ y′ = ⋀_{i=1}^{m} x′i.
Therefore
[⋀_{i=1}^{n} xi ∧ ⋀_{i=1}^{m} x′i] ∧ (y ∧ y′) = [⋀_{i=1}^{n} xi ∧ y] ∧ [⋀_{i=1}^{m} x′i ∧ y′]
= ⋀_{i=1}^{n} xi ∧ ⋀_{i=1}^{m} x′i,
3.3 Ultrafilters
Those with a knowledge of abstract algebra recognize the importance of a
particular type of ideal called a maximal ideal. Just as filters are dual to
ideals, an ultrafilter is dual to a maximal ideal. For reasons we will see in the
following section, ultrafilters are yet another key component in connecting
the propositional calculus to Boolean algebra. The definitions and results
in this section are mainly from [1].
Zorn’s Lemma. If a nonempty partially ordered set (P, ≤) has the prop-
erty that every nonempty chain (totally ordered subset) has an upper bound
in P , then P contains at least one maximal element. That is, P contains an
element m such that there is no x ∈ P for which m < x.
Proof. Let 𝓕 be the set of all proper filters containing F ; then 𝓕 is partially
ordered under set inclusion, as in Example 2.1.2. Also since 𝓕 contains F
itself, 𝓕 is nonempty. Just as in [1] and [4], we will invoke Zorn’s Lemma on
chains in 𝓕. We must show that chains have upper bounds in this setting.
Let C be a nonempty chain in 𝓕, and let M = ⋃C. We will show that
M is a proper filter containing F and thus an upper bound for the chain C.
We already have 1 ∈ M because 1 is in every filter and we are considering
a nonempty chain. If x, y ∈ M then we must have x ∈ X and y ∈ Y for
some filters X and Y in C. Since C is a chain we can assume without loss
of generality that X ⊆ Y . Then x, y ∈ Y and since Y is a filter, we have
x ∧ y ∈ Y ⊆ M . Also if x ≤ z for some z ∈ B, we have z ∈ Y ⊆ M since
Y is a filter. To be sure that M is proper, note that since every filter A in
the chain is proper, we have 0 ∉ A for all A ∈ C. Therefore 0 ∉ M , which
shows that M is a proper filter and M is clearly an upper bound for C.
By Zorn’s Lemma, we conclude that F has at least one maximal element
U , which is a proper filter containing F that is not properly contained by any
other proper filter containing F . However we cannot immediately conclude
that U is an ultrafilter, because we have only shown that U is maximal
among those proper filters containing F , not among every possible filter in
B. However, supposing there is a proper filter U′ in B such that U ⊆ U′,
we see that U′ must also contain F . Since U is maximal among the proper
filters containing F , we have U = U′. Therefore, by Definition 3.3.1, we
conclude that U is an ultrafilter containing F .
Proof. Note that for any nonzero x, the singleton {x} has the finite meet
property. Therefore by Theorem 3.2.7 we know there is a proper filter F
containing x and by Theorem 3.3.2 there is an ultrafilter U containing F ,
so that x ∈ U .
Suppose that x∗ ∧ y = 0 for some y ∈ F . Then
x = x ∨ 0 = x ∨ (x∗ ∧ y)
= (x ∨ x∗ ) ∧ (x ∨ y)
= 1 ∧ (x ∨ y)
= x ∨ y,
which means that y ≤ x. But filters are closed upward, so this would imply
x ∈ F . So it must be the case that x∗ ∧ y ≠ 0 for all y ∈ F . Thus we extend
F to a set
G = {z ∈ B : x∗ ∧ y ≤ z for some y ∈ F }.
Note that F ⊆ G because x∗ ∧ y ≤ y for any y ∈ F . Furthermore, we will
show that this set still has the finite meet property. Consider an arbitrary
collection z1 , z2 , . . . , zn ∈ G. For each zi we know that there exists yi ∈ F
such that x∗ ∧ yi ≤ zi . Then of course
⋀_{i=1}^{n} (x∗ ∧ yi) ≤ ⋀_{i=1}^{n} zi,
and we find that
⋀_{i=1}^{n} (x∗ ∧ yi) = ⋀_{i=1}^{n} x∗ ∧ ⋀_{i=1}^{n} yi by B1 and B2
= x∗ ∧ ⋀_{i=1}^{n} yi,
which is nonzero because ⋀_{i=1}^{n} yi ∈ F . Thus
0 < x∗ ∧ ⋀_{i=1}^{n} yi ≤ ⋀_{i=1}^{n} zi,
which shows that G has the finite meet property. Thus by Theorem 3.2.7
there exists a proper filter H containing G and by Theorem 3.3.2 there exists
an ultrafilter U containing H, such that F ⊆ G ⊆ H ⊆ U for some ultrafilter
U . However we also have x∗ ∈ G from construction, so that x∗ ∈ U as well.
Then it must be the case that x ∉ U , otherwise 0 = x ∧ x∗ ∈ U which would
contradict the ultrafilter being proper.
Although these results are dense, we will use all of them in the following
section, and they will make many theorems about the propositional calculus
much more simple and elegant. This remark is similar to the one made at the
end of Section 2, but the machinery here will be used in proving theorems
about the propositional calculus, not the theorems within the propositional
calculus from Definition 1.2.1.² The next lemma and theorem will look very
familiar to those who have studied maximal ideals in algebra, as the dualism
is transparent.
²One might say that these results will be used in proving metatheorems about L, as
opposed to proving formal theorems within L.
Lemma 3.3.6. Let U be an ultrafilter in B. Then if x ∨ y ∈ U , it must be
the case that x ∈ U or y ∈ U .
Proof. This proof follows the reasoning in [1]. Let x, y ∈ B such that x ∨ y ∈
U . Suppose that x ∉ U and construct the set
F = {z ∈ B : x ∨ z ∈ U }.
First note that since U is a filter, for any z ∈ U we have x ∨ z ∈ U , so
we know U ⊆ F . If we can show that F is a proper filter, we can conclude
U = F . There are four conditions to show. First we know that 1 ∈ F because
x ∨ 1 = 1 ∈ U . Next consider any two elements z1 , z2 ∈ F . Since x ∨ z1 and
x ∨ z2 are both in the filter U , we must have (x ∨ z1 ) ∧ (x ∨ z2 ) ∈ U . But
by B4 we know that
x ∨ (z1 ∧ z2 ) = (x ∨ z1 ) ∧ (x ∨ z2 ),
and therefore z1 ∧ z2 ∈ F as well. Next let w ∈ B such that z ≤ w for some
z ∈ F . Then w = z ∨ w and x ∨ z ∈ U . But since U is a filter we know that
(x ∨ z) ∨ w ∈ U , and
(x ∨ z) ∨ w = x ∨ (z ∨ w) = x ∨ w.
Hence x ∨ w ∈ U so we conclude that w ∈ F . Finally, note that x ∨ 0 = x
and recall that x ∉ U . Hence 0 ∉ F and we conclude that F is a proper
filter. But F contains the ultrafilter U , so it must be the case that F = U .
Recall that x ∨ y ∈ U , from which it follows that y ∈ F . Of course since
F = U , we have that y ∈ U as well.
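In a finite power-set algebra the lemma can also be confirmed exhaustively, since there the ultrafilters are exactly the principal filters at the singletons. A sketch (my own illustration):

```python
from itertools import chain, combinations

X = frozenset({1, 2, 3})
# The power-set Boolean algebra P(X).
P = [frozenset(s) for s in
     chain.from_iterable(combinations(X, r) for r in range(len(X) + 1))]

# For finite X, every ultrafilter of P(X) is principal over an atom {a}.
for a in X:
    U = {b for b in P if a in b}   # the ultrafilter at the atom {a}
    for x in P:
        for y in P:
            if (x | y) in U:              # if x v y is in U ...
                assert x in U or y in U   # ... then x or y already is
```

Each ultrafilter here "decides" every element: for every x, exactly one of x and x∗ belongs to it, which is what forces the join x ∨ y to be split.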
2. If only one of x and y is in F , without loss of generality we will suppose
x ∈ F and y ∉ F . Then x ∨ y ∈ F so that
h(x ∨ y) = 1 = 1 ∨ 0 = h(x) ∨ h(y).
However x ∧ y ∉ F , because x ∧ y ≤ y, and x ∧ y ∈ F would imply y ∈ F .
Hence
h(x ∧ y) = 0 = 1 ∧ 0 = h(x) ∧ h(y).
3. If both x, y ∉ F , then by the contrapositive of Lemma 3.3.6 we know
that x ∨ y ∉ F . Therefore
h(x ∨ y) = 0 = 0 ∨ 0 = h(x) ∨ h(y).
4 Applying Algebra
For this section, let B denote the Lindenbaum algebra as defined in Exam-
ple 2.3.3. If h : B → 2 is a homomorphism, we call it a 2-valued homomor-
phism.
4.1 Making Connections
Recall from Definition 1.3.1 that a truth valuation v is a function mapping
each sentence of L into {0, 1} such that for all sentences x and y,
It should be clear from this definition that the following properties are also
true:
(y1 , y2 , . . . , yn ),
(A1) Suppose z = y ∨ y ⇒ y. If v(y ∨ y) = 0 then v(z) = 1. If instead
v(y ∨ y) = 1 we must have v(y) = 1 so that v(z) = v(y ∨ y ⇒ y) = 1.
as well.
v((x ⇒ y) ⇒ (w ∨ x ⇒ w ∨ y)) = 1.
Thus all axioms are tautologies, and this covers the base case.
Next suppose that n > 2 and for all natural numbers k < n, if there
exists a deduction of a sentence y of length k, then ⊨ y. Now consider the
deduction of z which we denoted above as (y1, . . . , yn). Either z is an axiom,
which as we proved in the base case will imply that ⊨ z, or z follows by
modus ponens from two sentences earlier in the sequence, say
yj and yk = yj ⇒ z,
are clearly deductions in their own right, with lengths j and k respectively.
Hence by the induction hypothesis, we know that ⊨ yj and ⊨ yk. Let v be
any truth valuation; we know from the previous sentence that v(yj ) = 1 and
v(yk ) = 1. But yk = yj ⇒ z, and since v(yj ) = 1, we must have v(z) = 1.
Since v was arbitrary we have ⊨ z.
Therefore by induction on the length of the deduction of theorem z, we
have shown that ⊢ z implies ⊨ z.
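As an informal complement to the base case, the fact that every axiom is a tautology can be checked mechanically by running through all truth valuations, reading the connectives truth-functionally as in Section 1.3. A Python sketch (an illustration only):

```python
from itertools import product

# Truth-functional readings of the connectives.
IMP = lambda a, b: (not a) or b
OR = lambda a, b: a or b

# Every valuation of x, y, z makes each axiom schema true.
for x, y, z in product([False, True], repeat=3):
    assert IMP(OR(x, x), x)                          # A1
    assert IMP(x, OR(x, y))                          # A2
    assert IMP(OR(x, y), OR(y, x))                   # A3
    assert IMP(IMP(x, y), IMP(OR(z, x), OR(z, y)))   # A4
```

Since there are only eight valuations of three letters, the exhaustive check takes the place of the case analysis given above.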
The following proof necessarily conflates the “truth values” 0 and 1 with
the elements of the minimal algebra 2, which is not a problem but certainly
worth noting. Of course this doesn’t imply that the truth values from Section
1 inherently have these algebraic properties, but rather the definition of
a truth valuation restricts the function in such a way that the images of
sentences in {0, 1} behave like the minimal algebra.
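To make this point concrete, we can extend an assignment of truth values on letters to a valuation on sentences and observe that the images respect the join, meet, and complement of the minimal algebra. This is precisely the behavior that makes h(|x|) = v(x) a homomorphism. A sketch (my own illustration, with sentences encoded as nested tuples):

```python
from itertools import product

def v(s, assign):
    """Extend an assignment on letters to a truth valuation on sentences.
    Sentences: a letter name, ('not', s), or ('imp', s, t)."""
    if isinstance(s, str):
        return assign[s]
    if s[0] == 'not':
        return 1 - v(s[1], assign)
    # x => y is truth-functionally ~x v y.
    return max(1 - v(s[1], assign), v(s[2], assign))

OR = lambda a, b: ('imp', ('not', a), b)
AND = lambda a, b: ('not', ('imp', a, ('not', b)))

# The images of sentences behave like the minimal algebra 2:
for p, q in product([0, 1], repeat=2):
    assign = {'p': p, 'q': q}
    assert v(OR('p', 'q'), assign) == max(p, q)   # join in 2
    assert v(AND('p', 'q'), assign) == min(p, q)  # meet in 2
    assert v(('not', 'p'), assign) == 1 - p       # complement in 2
```

Here max, min, and the map t ↦ 1 − t play the roles of ∨, ∧, and ∗ in 2.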
in the minimal algebra, which of course means that v(¬x) = 1 if and only if
v(x) = 0. For the second property we also have
1 = v(x1 ⇔ x2 )
= v((x1 ⇒ x2 ) ∧ (x2 ⇒ x1 )).
Suppose for contradiction that v(x1) ≠ v(x2), and without loss of generality
we will suppose further that v(x1 ) = 1 and v(x2 ) = 0. Then v(x1 ⇒ x2 ) = 0,
so that
v((x1 ⇒ x2) ∧ (x2 ⇒ x1)) = 0,
which is a contradiction. Thus we must have v(x1 ) = v(x2 ), and we conclude
that h(x) is well defined.
Next we must show that h is a homomorphism into 2. So let |x|, |y| ∈ B.
Then
h(|x| ∨ |y|) = h(|x ∨ y|) = v(x ∨ y),
which is 1 if v(x) = 1 or v(y) = 1, and 0 otherwise.
to hold true. We will consider the cases when v satisfies at least one of these
sentences, and when v does not satisfy either of them. In the first case,
suppose without loss of generality that v(x) = 1. Then from above we see
that h(|x| ∨ |y|) = 1, and also in 2 we have
Next we have
h(|x| ∧ |y|) = h(|x ∧ y|) = v(x ∧ y),
which is 0 if v(x) = 0 or v(y) = 0, and 1 otherwise.
Similar to above, we will consider the cases where v satisfies both x and
y, and where v does not satisfy one or both of them. In the first case, if
v(x) = v(y) = 1, then we have from above that h(|x| ∧ |y|) = 1. Also in 2
we have
h(|x|) ∧ h(|y|) = v(x) ∧ v(y) = 1 ∧ 1 = 1.
In the second case suppose without loss of generality that v(x) = 0. Then
from above we have h(|x| ∧ |y|) = 0 and in 2 we have
In this setting we will be considering the set of all consequences Ŝ of S.
Note that since 1 is always in Ŝ, and
1 = |¬(x ∧ y) ∨ (x ∧ y)|
= |x ∧ y|∗ ∨ |x ∧ y|
= (|x| ∧ |y|)∗ ∨ |x ∧ y|
= (|x|∗ ∨ |y|∗) ∨ |x ∧ y|
= |x|∗ ∨ (|y|∗ ∨ |x ∧ y|)
= |x|∗ ∨ |y ⇒ (x ∧ y)|
= |x ⇒ (y ⇒ (x ∧ y))|,
we see that if |x| ∈ Ŝ and |y| ∈ Ŝ, then |x ∧ y| = |x| ∧ |y| ∈ Ŝ as well. Also
if for some |x| ∈ Ŝ we have |x| ≤ |y|, by definition this means that ⊢ x ⇒ y,
which of course means that |x ⇒ y| = 1. Hence |x ⇒ y| ∈ Ŝ and therefore
|y| ∈ Ŝ by the modus ponens rule. These facts show us that Ŝ is a filter!
Explicitly we have shown
(i) 1 ∈ Ŝ,
(ii) if |x|, |y| ∈ Ŝ, then |x| ∧ |y| ∈ Ŝ,
(iii) if |x| ∈ Ŝ and |x| ≤ |y|, then |y| ∈ Ŝ.
It is also clear that this is the smallest filter containing S, so that Ŝ is the
filter generated by S.
For the remainder of this paper, we will consider a set of sentences A in L
and define the corresponding subset SA = {|x| : x ∈ A} of the Lindenbaum
algebra B.
In the next few theorems we finally present the main results of this
paper, but first we need to assure ourselves that the existence of an algebraic
deduction in B of |y| from SA as in Definition 4.1.4 will imply the existence of
a deduction in L of y from A, as in Definition 1.2.3. Consider some algebraic
deduction (|x1 |, |x2 |, . . . , |xn |) of |y|, and let us analyze Definition 4.1.4. For
any of the |xi | that are equal to 1, we know that by definition ⊢ xi . Thus
a formal deduction in L of y could include the deduction of xi of some
length li . If |xj | ∈ SA then we have xk ∈ A where xk ≡ xj . By definition,
⊢ xk ⇔ xj , so we could include a formal deduction in L of xk ⇔ xj , and thus
with xk ∈ A, a formal deduction of xj from A of some length lj exists and
could be included in the deduction of y. Furthermore, if any |xi | follows
by modus ponens from two preceding terms |xj | and |xk | = |xj ⇒ xi |, by
the argument just given we could inductively deduce both xj and xj ⇒ xi
from A, resulting in a deduction of xi from A. Therefore, for any class
|y| deduced algebraically from SA , there exists a deduction in L (albeit
likely much longer) of y from A. It is easy to see the converse is also
true, that if there is a formal deduction in L of y from A, then there is
an algebraic deduction (perhaps a quicker one) in B of |y| from SA . In fact
in this direction, we could just modify the terms of a deduction (y1 , . . . , yn )
simply by taking the equivalence classes (|y1 |, . . . , |yn |), which is evidently
an algebraic deduction.
With this assertion in hand, we begin to see the connections. A num-
ber of familiar concepts in propositional calculus can now be viewed alge-
braically. From the reasoning above, letting A be a set of sentences of L
and again SA = {|x| : x ∈ A}, we see that A ⊢ y if and only if |y| is in the
filter generated by SA , that is |y| ∈ ŜA . This is the crux of the connection
between Boolean algebra and propositional logic – the algebraic properties
that bring about the closure of a filter in B correspond exactly to the rules
by which a theorem can be deduced from a set of sentences in L.
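In a small finite setting this crux can be checked by machine. In the sketch below (the encoding and names are my own, for illustration only), each Lindenbaum class over the variables p and q is represented by its truth table; deducibility of y from A then amounts to |y| lying above the meet of finitely many elements of SA :

```python
from itertools import product

VARS = ('p', 'q')
VALUATIONS = [dict(zip(VARS, bits)) for bits in product((0, 1), repeat=2)]

def cls(f):
    """Lindenbaum class of a sentence: its truth values over all valuations."""
    return tuple(f(v) for v in VALUATIONS)

def meet(a, b):
    return tuple(min(x, y) for x, y in zip(a, b))

def leq(a, b):
    return all(x <= y for x, y in zip(a, b))

# A = {p, p => q}; S_A is the set of their classes.
S_A = [cls(lambda v: v['p']),
       cls(lambda v: max(1 - v['p'], v['q']))]

bound = S_A[0]
for s in S_A[1:]:
    bound = meet(bound, s)

# |q| lies above the meet of S_A, so q is deducible from A; not-p is not.
assert leq(bound, cls(lambda v: v['q']))
assert not leq(bound, cls(lambda v: 1 - v['p']))
print("q is deducible from {p, p => q}; not-p is not")
```

The second assertion reflects modus ponens algebraically: |p| ∧ |p ⇒ q| ≤ |q|, while no such bound exists for |¬p|.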
and by Theorem 3.3.2 we know there exists an ultrafilter U such that SA ⊆
ŜA ⊆ U . Then by Theorem 3.3.7 we know that U is the hull of some 2-
valued homomorphism h, so that h[SA ] = {1}. Then the truth valuation
v(x) = h(|x|) guaranteed by Theorem 4.1.3 is a truth valuation satisfying
A.
The following are equivalent:
(i) A is consistent,
(ii) A is satisfiable,
(iii) SA has the finite meet property in B.
Proof. We need only show (i)⇔(iii). Suppose SA has the finite meet property
in B. By Theorem 3.2.7 this is true if and only if ŜA is a proper filter. As
stated in Section 3.2, ŜA is proper if and only if 0 ∉ ŜA . For any sentence
x, we have 0 = |x| ∧ |x|∗ = |x ∧ ¬x|, so this means that A ⊬ x ∧ ¬x for any
sentence x. By definition this means that A is consistent.
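This chain of equivalences can be watched in miniature. In the Python sketch below (an illustrative encoding of my own), each sentence is represented by its column of truth values over all valuations of p and q; the meet of the classes of A is nonzero, i.e. SA has the finite meet property, exactly when some valuation satisfies every member of A:

```python
from itertools import product

VARS = ('p', 'q')
VALUATIONS = [dict(zip(VARS, bits)) for bits in product((0, 1), repeat=2)]

def cls(f):
    """Class of a sentence: its truth values over all valuations."""
    return tuple(f(v) for v in VALUATIONS)

# A = {p, p => q}: the meet of the classes of A is nonzero (finite meet
# property), and each position where it equals 1 is a satisfying valuation.
A = [cls(lambda v: v['p']), cls(lambda v: max(1 - v['p'], v['q']))]
bound = tuple(min(col) for col in zip(*A))

assert any(bound), "A has the finite meet property, hence is satisfiable"
witness = VALUATIONS[bound.index(1)]
print("satisfying valuation:", witness)

# An inconsistent set: {p, not-p} has meet 0 and no satisfying valuation.
A2 = [cls(lambda v: v['p']), cls(lambda v: 1 - v['p'])]
assert not any(min(col) for col in zip(*A2))
```

In the infinite case the witness is not found by inspection, of course; that is where the ultrafilter theorems of Section 3 do the work.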
Proof. Again let SA = {|x| : x ∈ A} and note that A ⊢ y if and only if
|y| ∈ ŜA . As we saw in the constructive characterization in Theorem 3.2.7,
|y| ∈ ŜA if and only if there exist |x1 |, . . . , |xn | ∈ SA such that
|x1 | ∧ · · · ∧ |xn | ≤ |y|.
Since
|x1 | ∧ · · · ∧ |xn | = |x1 ∧ · · · ∧ xn |,
by definition this means that
⊢ (x1 ∧ · · · ∧ xn ) ⇒ y.
The next theorem combines the Strong Soundness and Strong Complete-
ness Theorems.
We can also prove the Compactness Theorem through such algebraic
means. There are two equivalent statements of Compactness, and there are
enlightening approaches to each proof.
Theorem 4.2.5. For any sentence y and set of sentences A in L, we have
A ⊨ y if and only if there is a finite subset {x1 , . . . , xn } of A such that
{x1 , . . . , xn } ⊨ y.
Proof. (=⇒) Suppose that A ⊨ y. By the Completeness Theorem we have
that A ⊢ y, which means that |y| ∈ ŜA . As seen in Theorem 3.2.7 there is a
finite subset {|x1 |, . . . , |xn |} of SA such that
|x1 | ∧ · · · ∧ |xn | ≤ |y|.
Of course this implies that |y| is in the filter generated by {|x1 |, . . . , |xn |},
so that {x1 , . . . , xn } ⊢ y. By the Soundness Theorem, it follows that
{x1 , . . . , xn } ⊨ y.
(⇐=) Conversely, suppose that there is a finite subset {x1 , . . . , xn } of A
such that {x1 , . . . , xn } ⊨ y. From the Completeness Theorem we know
{x1 , . . . , xn } ⊢ y, which means |y| is in the filter F generated by {|x1 |, . . . , |xn |}.
Since {|x1 |, . . . , |xn |} ⊆ SA , it is clear that F ⊆ ŜA , so that |y| ∈ ŜA . Hence
A ⊢ y, and again from Soundness we conclude A ⊨ y.
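The forward direction can also be illustrated computationally. In the sketch below (an illustrative setup of my own, not from the text), A consists of p0 together with a chain of implications, y is p3, and a finite prefix of A is found that already entails y semantically:

```python
from itertools import product

VARS = ('p0', 'p1', 'p2', 'p3')
VALUATIONS = [dict(zip(VARS, bits)) for bits in product((0, 1), repeat=4)]

def entails(premises, conclusion):
    """True if every valuation satisfying all premises satisfies the conclusion."""
    return all(conclusion(v) for v in VALUATIONS
               if all(p(v) for p in premises))

# A: p0 together with the implications p0 => p1, p1 => p2, p2 => p3.
A = [lambda v: v['p0']]
A += [lambda v, i=i: max(1 - v['p%d' % i], v['p%d' % (i + 1)]) for i in range(3)]
y = lambda v: v['p3']

# Compactness in miniature: grow a finite prefix of A until it entails y.
for n in range(len(A) + 1):
    if entails(A[:n], y):
        break
print("finite subset of size", n, "suffices")
```

Here the whole chain of four sentences is needed, mirroring how the finite meet |x1 | ∧ · · · ∧ |xn | ≤ |y| in the proof picks out exactly the premises that matter.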
5 Conclusion
We have found that many common notions in propositional calculus are just
disguised, yet familiar algebraic notions. Although it may seem that the
necessary machinery in Boolean algebra is too much trouble for results
that could be proven strictly within propositional calculus, the insight these
connections provide is certainly worth it. Furthermore, theorems such as
Completeness and Compactness are significantly more difficult in first-
order logic, but the theory of ultrafilters in a Boolean algebra extends quite
easily to the first-order setting. Thus the main ideas presented in this
paper have much deeper implications than the few proofs in Section 4.2,
and provide an excellent introduction to further work applying Boolean
algebra to first-order logic.
40