(written a long time ago; publish or languish)
These are some notes I made for Davide Radaelli for the first section of Schuller’s lectures on physics.
Let’s turn Boolean algebra into something we know better: arithmetic.
First we’ll set False to 0 and True to 1. To handle overflow, any arithmetic is mod 2. So even numbers are \(0\) and odd numbers are \(1\).
There are 4 unary operators (there’s a quick code check after this list):
- 2 constant functions: \(x \rightarrow 0\), \(x \rightarrow 1\)
- identity: \(x \rightarrow x + 0\). I like to write it as \((+0)\)
- negation: \(x \rightarrow x + 1\). I like to write it as \((+1)\), i.e. \(1 - x\).
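Here’s a quick check of these four maps (a minimal sketch; the function names `const0`, `const1`, `ident`, and `negate` are mine, not anything standard):

```julia
# The four unary maps on {0, 1}, written as arithmetic mod 2.
const0(x) = mod(0, 2)       # constant False
const1(x) = mod(1, 2)       # constant True
ident(x)  = mod(x + 0, 2)   # identity, "(+0)"
negate(x) = mod(x + 1, 2)   # negation, "(+1)", i.e. 1 - x

for b in (false, true)
    x = Int(b)              # False ↦ 0, True ↦ 1
    @assert const0(x) == 0
    @assert const1(x) == 1
    @assert ident(x)  == x
    @assert negate(x) == Int(!b)
end
```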
Now for binary operators: `and` and `or`. `and` can be represented as multiplication. But `or` cannot be represented as addition: True or True is True, but \(1 + 1 = 2 = 0 =\) False. That’s no good.

The annihilation properties of `and` and `or` suggest a way out. `True or _` is always True; `False or x` is the identity function on x. `False and _` is always False; `True and x` is the identity function on x.
Notice the duality between these properties. It’ll come in handy in a moment.
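Before getting to the fix, here’s the failed attempt above as a minimal sketch in code (my own, in the same spirit as the property tests further down):

```julia
to01(b) = Int(b)   # False ↦ 0, True ↦ 1

# `and` as multiplication mod 2 works on all four input pairs.
for a in (false, true), b in (false, true)
    @assert Int(a & b) == mod(to01(a) * to01(b), 2)
end

# `or` as addition mod 2 breaks exactly on True or True.
@assert (true | true) == true
@assert mod(to01(true) + to01(true), 2) == 0   # 1 + 1 = 2 = 0 = False
```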
The solution for how to represent these popped into my head randomly, which I dislike intensely, since that’s not a reliable problem-solving method. But my subconscious doesn’t care.
We can use min and max: \(x\) and \(y\) is \(\min(x, y)\); \(x\) or \(y\) is \(\max(x, y)\). The properties above are represented as follows (checked in code right after the list):
- \[\max(1, -) = 1\]
- \[\max(0, x) = x\]
- \[\min(0, -) = 0\]
- \[\min(1, x) = x\]
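All four identities are easy to confirm exhaustively (a minimal sketch of my own):

```julia
for x in (0, 1)
    @assert max(1, x) == 1   # True or _   is always True
    @assert max(0, x) == x   # False or x  is x
    @assert min(0, x) == 0   # False and _ is always False
    @assert min(1, x) == x   # True and x  is x
end
```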
`and` is interesting because it can be represented 2 ways: as min or as multiplication.

But what’s `+` then? Thinking categorically, sums are coproducts, which are on the side of math that encodes the idea of single choices, any rather than all. Closest match is exclusive or (`xor`). Turns out that works (check the truth table yourself to be sure).
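Doing that truth-table check in code (a sketch of my own): xor agrees with addition mod 2 on all four input pairs.

```julia
for a in (false, true), b in (false, true)
    @assert Int(a ⊻ b) == mod(Int(a) + Int(b), 2)   # xor is addition mod 2
end
```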
The last big boolean operator to handle is implication. It’s exponentiation.
\(a\) implies \(b\) is the same as \(b^a\), at least if \(0^0\) is understood to be \(1\), which combinatorics suggests. It satisfies all the relevant identities, except that \(a^{(b+c)} \neq a^b \cdot a^c\) when \(+\) is xor. It does work if you use or instead of xor, though.
Here’s some Julia code to test it out. I manually checked the basic implication truth table.
using RandomizedPropertyTest
@quickcheck (c^(a * b) == (c^a)^b == (c^b)^a) ((a, b, c)::Bool)
@quickcheck (c^(a ⊻ b) == (c^a * c^b) == (c^b * c^a)) ((a, b, c)::Bool) # fails, e.g. a = b = 1, c = 0: xor in the exponent breaks the product-of-powers rule
@quickcheck (c^(a | b) == (c^a * c^b) == (c^b * c^a)) ((a, b, c)::Bool) # holds: or in the exponent does satisfy it
@quickcheck (a * (b | c) == (a * b) | (a * c)) ((a, b, c)::Bool)
@quickcheck (a | b == max(a, b)) ((a, b, c)::Bool)
@quickcheck (a & b == min(a, b) == a * b) ((a, b, c)::Bool)
@quickcheck (a * (b ⊻ c) == (a * b) ⊻ (a * c)) ((a, b, c)::Bool) # holds as well: mod-2 arithmetic is a ring, so multiplication distributes over xor
Predicate Logic
Let’s explore \(\exists\) and \(\forall\).
\(\forall\) is interpreted as ‘all’. We can understand it as a product, using `and` (intersection, or min) as the multiplication. Empty products are interpreted as True, so all \(\forall\) statements about the empty set are vacuously true.
Example: Say the set we’re quantifying over is \(\{1, 2, 3\}\). Let \(P\) be a predicate that is true for 1 and false for 2 and 3. Then we’d express \(\forall x P(x)\) as \(P(1) \land P(2) \land P(3)\). If at least one of these is false, the entire expression is false, since you’re effectively multiplying by 0.
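A minimal sketch of the \(\forall\)-as-product reading (the predicate \(P\) is the one from the example; the set name `S` is mine), including the empty-set convention:

```julia
P(x) = x == 1                      # true for 1, false for 2 and 3
S = [1, 2, 3]

@assert prod(P.(S)) == false       # P(1) * P(2) * P(3) = 1 * 0 * 0 = 0, so ∀x∈S. P(x) is False
@assert prod(Bool[]) == true       # empty product is 1: ∀ over the empty set is vacuously True
@assert all(P, S) == prod(P.(S))   # agrees with Julia's built-in `all`
```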
Dually, \(\exists\) is interpreted as ‘exists’ or ‘any’. Its analogy to a sum shows up in the empty case: empty sums are \(0 =\) False, so all \(\exists\) statements about the empty set are automatically false. Instead of xor (the literal sum), we use or (max), since iterated xor only tracks whether the number of witnesses is odd, not whether one exists.
Example: If the set we’re quantifying over is \(\{1, 2, 3\}\), and \(P\) is defined to be true for 1 and false for 2 and 3, then we’d express \(\exists x P(x)\) as \(P(1) \lor P(2) \lor P(3)\). If at least one of these is true, then the whole expression is true.
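And the \(\exists\) side, again as a sketch of my own, using max as the ‘sum’ and checking the empty case:

```julia
P(x) = x == 1                                        # true for 1, false for 2 and 3
S = [1, 2, 3]

@assert maximum(P.(S)) == true                       # max(1, 0, 0) = 1, so ∃x∈S. P(x) is True
@assert reduce(max, Bool[]; init = false) == false   # empty "sum" is 0: ∃ over the empty set is False
@assert any(P, S) == maximum(P.(S))                  # agrees with Julia's built-in `any`

# xor (the literal sum) would only track the parity of the number of witnesses:
@assert reduce(⊻, [true, true]) == false             # two witnesses xor away to False
```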