# ⊙ AntiQuark

Truth, Beauty, Charm, Strange

## 2006/01/15

### Proof of DeMorgan's Theorem

For quite a while I've been looking for a purely algebraic proof of DeMorgan's theorem. I know that it can be proven simply by filling out a truth table, but that seems so... inelegant. Plus, the truth-table proof only applies to logics with values of '1' and '0' or 'true' and 'false'. In more advanced mathematics, a Boolean algebra (a special kind of lattice) might permit more than just 'true' and 'false' values.

The laws (or axioms) of Boolean algebra are:

• The absorption laws,
a ^ (a v b) = a
a v (a ^ b) = a

So called because the term involving 'b' is absorbed into the 'a', which is all that remains.

• The distribution laws,
a ^ (b v c) = (a ^ b) v (a ^ c)
a v (b ^ c) = (a v b) ^ (a v c)

So called because the 'a' is distributed across the 'b' and 'c' values.

• The complementation laws,
a ^ ~a = 0
a v ~a = 1

These specify the result of and'ing or or'ing a value with its negation.

Also, there are the obvious commutative and associative laws, for example, a ^ b = b ^ a; a ^ (b ^ c) = (a ^ b) ^ c, etc.

The symbols are used like so: '^' means 'and', 'v' stands for 'or', '~' (also known as 'not') indicates that the complement is being taken, and '0' and '1' are the minimum and maximum elements of the logic.
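The point of this post is to avoid truth-table arguments, but the axioms can at least be sanity-checked over the familiar two-element algebra {0, 1}. Here's a minimal Python sketch (the function names `meet`, `join`, and `comp` are my own, not anything standard):

```python
from itertools import product

# Two-element Boolean algebra {0, 1}:
# '^' (and) is bitwise and, 'v' (or) is bitwise or, '~' flips the value.
def meet(a, b): return a & b
def join(a, b): return a | b
def comp(a):    return 1 - a

for a, b, c in product((0, 1), repeat=3):
    assert meet(a, join(a, b)) == a                              # absorption
    assert join(a, meet(a, b)) == a
    assert meet(a, join(b, c)) == join(meet(a, b), meet(a, c))   # distribution
    assert join(a, meet(b, c)) == meet(join(a, b), join(a, c))
    assert meet(a, comp(a)) == 0                                 # complementation
    assert join(a, comp(a)) == 1

print("all axioms hold over {0, 1}")
```

Of course, this only confirms the axioms for the two-valued case; the whole appeal of the algebraic approach is that it covers every Boolean algebra at once.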

I found I was able to prove a bunch of the other housekeeping theorems just by working the equations and plugging in axioms to get from point A to point B. For example, the boundedness identity, a ^ 1 = a, can be proved by starting with a ^ 1, replacing 1 with a v ~a (due to complementation), then using absorption on a ^ (a v ~a) to get a.

DeMorgan's theorem, on the other hand, seemed to be a much tougher nut to crack. Recall that there are two versions of DM:

~a ^ ~b = ~(a v b)

and
~a v ~b = ~(a ^ b)

(For simplicity, I'll just focus on the first one from now on.)

I've tried, with no luck, to take ~a ^ ~b and work it into ~(a v b). The big problem is that none of the axioms gives you a direct way of "encasing" something complicated into a "not" operation. (If you don't get what I'm talking about, give it a try... you'll see what the problem is.)

The complementation laws, a ^ ~a = 0 and a v ~a = 1, would be useful if we could take advantage of the negated term. In fact, these two laws happen to be the key to proving DM. We can use them to determine whether a = b by evaluating 'a' along with ~b. Specifically, we want to show that

if a ^ ~b = 0 and a v ~b = 1,
then a = b.

Once we've proven this lemma, we can use it to further prove DM.

We'll use the Boolean laws mentioned above, plus two extra premises; namely, a ^ ~b = 0 and a v ~b = 1. If we start with 'b' we should be able to work our way to 'a', which would prove our conjecture.

The proof goes as follows:

 b                       (start with 'b')
 = b ^ 1                 (identity law)
 = b ^ (a v ~b)          (substitute second premise)
 = (b ^ a) v (b ^ ~b)    (distribution law)
 = (b ^ a) v 0           (complementation law)
 = (b ^ a) v (a ^ ~b)    (substitute first premise)
 = (a ^ b) v (a ^ ~b)    (commutative law)
 = a ^ (b v ~b)          (distribution law)
 = a ^ 1                 (complementation law)
 = a                     (identity law)

Thus, whenever a ^ ~b = 0 and a v ~b = 1, then a is equal to b.
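As a cross-check (again only over the two-element algebra — it's the derivation above that covers the general case), the lemma can be brute-forced in Python:

```python
from itertools import product

AND = lambda a, b: a & b
OR  = lambda a, b: a | b
NOT = lambda a: 1 - a

# Lemma: if a ^ ~b = 0 and a v ~b = 1, then a = b.
for a, b in product((0, 1), repeat=2):
    if AND(a, NOT(b)) == 0 and OR(a, NOT(b)) == 1:
        assert a == b

print("lemma holds over {0, 1}")
```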

We will use this fact to prove DeMorgan's theorem by substituting (~c ^ ~d) for a and ~(c v d) for b. Thus, if we're able to prove that

(~c ^ ~d) ^ ~~(c v d) = 0

and
(~c ^ ~d) v ~~(c v d) = 1

then we will have shown that
~c ^ ~d = ~(c v d).

An initial simplification is due to the involution, or double negation law, a = ~~a. We can replace ~~(c v d) with (c v d) in both of the above equations. Working out the first one, we get

 (~c ^ ~d) ^ ~~(c v d)
 = (~c ^ ~d) ^ (c v d)                  (double negation)
 = ((~c ^ ~d) ^ c) v ((~c ^ ~d) ^ d)    (distribution)
 = (~c ^ c ^ ~d) v (~c ^ d ^ ~d)        (associative, commutative laws)
 = (0 ^ ~d) v (~c ^ 0)                  (complementation)
 = 0 v 0                                (boundedness)
 = 0                                    (idempotence)

Similarly, we can show that (~c ^ ~d) v ~~(c v d) = 1:

 (~c ^ ~d) v ~~(c v d)
 = (~c ^ ~d) v (c v d)              (double negation)
 = (c v d v ~c) ^ (c v d v ~d)      (distribution)
 = (1 v d) ^ (c v 1)                (complementation, commutative laws)
 = 1 ^ 1                            (boundedness)
 = 1                                (idempotence)

Hence, we can conclude that

~c ^ ~d = ~(c v d).
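For completeness, both De Morgan laws can be brute-force confirmed over {0, 1} (again, this is just a spot check of the two-valued case, with ad-hoc Python names):

```python
from itertools import product

AND = lambda a, b: a & b
OR  = lambda a, b: a | b
NOT = lambda a: 1 - a

for c, d in product((0, 1), repeat=2):
    assert AND(NOT(c), NOT(d)) == NOT(OR(c, d))   # ~c ^ ~d = ~(c v d)
    assert OR(NOT(c), NOT(d)) == NOT(AND(c, d))   # ~c v ~d = ~(c ^ d)

print("De Morgan holds over {0, 1}")
```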

NOTE: Some of the following "housekeeping" theorems were mentioned in the preceding proof. Most of them are straightforward to prove. Involution is a little tricky; you have to use 1 = ~a v ~~a at one point.

Idempotence
a ^ a = a
a v a = a

Boundedness, Identity
a v 0 = a
a v 1 = 1
a ^ 0 = 0
a ^ 1 = a

Involution or Double Negation
~~a = a

• At 1/15/2006 11:27 PM, Aleph said…

Hello,

I'm pleased to see a logic-themed post on your blog. Might I ask what you're going to be using logic for? Are you solving logic problems just for fun, or are you working on a project involving logic?

If you're ever interested in getting a good book on logic I'd like to recommend Logic: Techniques of Formal Reasoning by Kalish, Montague, and Mar.

This is the book I'm learning logic with, and in the 2nd chapter (out of 11) it teaches you sentential logic, so that short proofs (like the ones for DeMorgan's Laws) should take you only about a minute to do. Here's how I did half of one of DeMorgan's Laws (in the book's syntax):

1  Show ¬P^¬Q->¬(PvQ)  5 CD
2           ¬P^¬Q                ASS CD
3           ¬P                      2 S
4           ¬Q                      2 S
5           Show   ¬(PvQ)     7,8 ID
6                         PvQ      ASS ID
7                         Q          3,6 MTP
8                       ¬Q          4 R

The annotations that justify each step are on the right side of each line. With "ASS CD" we assume the antecedent, while with "ASS ID" we assume the negation. "S" is for simplification, "MTP" is the "modus tollendo ponens" rule that lets you infer one disjunct of a disjunction by negating the other, and "R" repeats an earlier line. We mark a derivation or subderivation as complete with "CD" when we've shown the consequent of a conditional, and with "ID" when we've reached a contradiction.

I really can't recommend this book highly enough. It's really thorough and teaches you how to do logic step-by-step. It has lots of examples too. In fact, there's even a free companion program that you can get from here, that'll speed up your learning time immensely. Of course, taking a course in logic is even better... but for self-study this is the next best thing, IMO.

Anyway, have fun... and keep up the great work with the blog! :)

• At 1/15/2006 11:44 PM, Aleph said…

Ooops... looks like Blogger didn't like some of the HTML symbols I used (though it looked fine in the preview), so let me try this again with "^" for conjunction, "v" for disjunction, and "->" for conditional:

1  Show ¬P^¬Q->¬(PvQ)  5 CD
2           ¬P^¬Q                ASS CD
3           ¬P                      2 S
4           ¬Q                      2 S
5           Show   ¬(PvQ)     7,8 ID
6                         PvQ      ASS ID
7                         Q          3,6 MTP
8                       ¬Q          4 R

• At 1/16/2006 11:21 AM, Derek said…

I'm pleased to see a logic-themed post on your blog. Might I ask what you're going to be using logic for? Are you solving logic problems just for fun

I'm not a mathematician, but I've been interested in higher mathematics for a long time (group theory, set theory, etc). I've tried teaching myself those topics by reading books, but I always hit a wall of some sort.

I finally figured I'll make more progress if I start at the fundamentals (in this case, logic) and work my way up.

To answer another part of your question: yes, I think it's a lot of fun!

For the logic characters, I used the words and, or, not with an ampersand in front and a semicolon following. Let's see if that works:

& and ; = ∧
& or ; = ∨
& not ; = ¬

That book you refer to looks good. The book I'm using is
this book
which I got for a philosophy course in university many years ago. It looks like it's not in print anymore.

Thanks for the comments!

• At 1/26/2006 2:49 AM, burton mackenzie said…

now do it in balanced ternary. :-)

• At 7/08/2007 11:18 PM, Anonymous said…

This might not be relevant to the original problem, but DM can be easily proven if your system includes Reductio.

• At 7/09/2007 9:15 PM, Derek said…

Thanks for the comment, I found a page that uses reductio here:
http://www.physicsforums.com/archive/index.php/t-72007.html

• At 4/08/2008 8:01 PM, Anonymous said…

A mathematician by the name of George Spencer Brown once wrote a book called "The Laws of Form". It deals with totally primitive concepts, so primeval they are mind-stretching. For example, his calculus of indications is based on the idea of marking a distinction in space, and it has just two axioms: what it means "to call" and what it means "to cross" a distinction. Out of this he proved De Morgan's laws elegantly. He also applied his ideas to signal theory and cybernetics.