# Representing Boolean Operations as Boolean Polynomials

In 1927, the Russian mathematician Ivan Zhegalkin showed that all Boolean operations can be represented by Boolean polynomials. What does this mean?

The reader may be familiar with a polynomial of the form \(x^2 - 2x + 1\), a common quadratic polynomial; the kind we set equal to zero, \(x^2 - 2x + 1 = 0\), and solve with the quadratic formula to find the roots.

Suppose we have a quadratic polynomial of the general form \(ax^2 + bx + c\). What would make it a Boolean polynomial? The short answer: we restrict the coefficients \(a, b, c\) to be either 0 or 1, and the variable \(x\) to take only the values 0 or 1. But if \(x\) can only be zero or one, then \(x^n = x\) for any positive power \(n\). So Boolean polynomials of a single variable \(x\) are fairly trivial, i.e. simply \(ax + b\).
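We can confirm the power-reduction rule by brute force. A minimal sketch in Python, checking that \(x^n = x\) for both Boolean values:

```python
# For a Boolean value x (0 or 1), every positive power of x equals x itself.
for x in (0, 1):
    for n in range(1, 6):
        assert x ** n == x
print("x**n == x holds for x in {0, 1}")
```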

To get more interesting Boolean polynomials we need more variables, say \(x, y, z\), which gives us Boolean polynomials of the form \(xyz + xy + xz + yz + x + y + z + 1\). If we have a term like \(x^3 y^2 z\), it reduces to just \(xyz\).

These reductions tend to simplify Boolean polynomials very quickly, for example, when multiplying and expanding two polynomials we get:

\( \begin{align} (x + y + 1)(y + z) & = xy + xz + y^2 + yz + y + z &\\
& = xy + xz + yz + y + y + z & \text{(because $y^2 = y$)}\\
& = xy + xz + yz + z & \text{(because $y + y = 0$)}
\end{align}\)
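This simplification is easy to verify exhaustively: with only three Boolean variables there are just eight assignments to check. A quick sketch, doing all arithmetic modulo 2:

```python
from itertools import product

# Verify (x + y + 1)(y + z) = xy + xz + yz + z over all Boolean assignments,
# with all arithmetic taken modulo 2.
for x, y, z in product((0, 1), repeat=3):
    lhs = ((x + y + 1) * (y + z)) % 2
    rhs = (x*y + x*z + y*z + z) % 2
    assert lhs == rhs
print("(x + y + 1)(y + z) = xy + xz + yz + z (mod 2)")
```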

Really? \(y + y = 0\)? Yes, because \(y\) is a variable for 0 or 1. If \(y = 0\) then we get \(0 + 0 = 0\). What about if \(y = 1\)? Well, \(1 + 1 = 2\), but we are dealing in Boolean numbers, which means \(1 + 1 = 0\). We often think of addition in Boolean numbers as *addition modulo (or mod) 2.* That is, we keep only the remainder after dividing by 2, and the remainder of 2 divided by 2 is zero.

We can represent the adding of two Boolean values in a "truth" table. For addition, we get the following:


\( \begin{array}{ c | c c } + & 0 & 1 \\ \hline 0 & 0 & 1 \\ 1 & 1 & 0 \end{array} \)

Which is exactly the truth table for the Boolean operation XOR (exclusive-or). What about the truth table for multiplication in Boolean polynomials? Anything multiplied by 0 is 0, and 1 times 1 is 1. So the truth table for multiplying two Boolean values is:

\( \begin{array}{ c | c c } \cdot & 0 & 1 \\ \hline 0 & 0 & 0 \\ 1 & 0 & 1 \end{array} \)

Which gives us the truth table for the Boolean operation AND. Remember our goal is to represent Boolean operations as Boolean polynomials. So far we have shown that Boolean polynomials represent the Boolean operations AND and XOR. What about the logical operation OR, and the unary NOT/Negation?
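The two correspondences above can be checked directly against Python's bitwise operators, where `^` is XOR and `&` is AND:

```python
# Addition mod 2 behaves exactly like XOR; multiplication behaves exactly
# like AND, for every pair of Boolean values.
for a in (0, 1):
    for b in (0, 1):
        assert (a + b) % 2 == a ^ b   # addition mod 2 is XOR
        assert a * b == a & b         # multiplication is AND
print("mod-2 addition is XOR; multiplication is AND")
```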

We'll start with NOT, often represented as \( \neg x \). Can we express NOT in terms of multiplication and addition (AND and XOR) in a Boolean polynomial? Yes, by adding 1 (or XORing with 1). In our Boolean polynomial 1 + 1 = 0 and 0 + 1 = 1. So adding 1 to any Boolean polynomial is equivalent to the logical NOT.
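A one-line check that adding 1 mod 2 flips a Boolean value, using `1 - x` as the logical NOT of a 0/1 value:

```python
# NOT as a Boolean polynomial: negation is just "add 1, mod 2".
for x in (0, 1):
    assert (x + 1) % 2 == 1 - x  # 1 - x is logical NOT for 0/1 values
print("x + 1 (mod 2) flips the bit")
```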

The logical OR operation, often written \( x \vee y \) for x OR y, is a little trickier but we can represent OR in terms of our Boolean polynomial as follows:

\( x \vee y = xy + x + y \)

I will leave it to the reader to confirm this result by comparing the truth table of OR with the possible results of the Boolean polynomial on the right hand side.
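For readers who would rather let the machine do the comparison, here is a brute-force check of the OR identity over all four input pairs, with `|` as Python's bitwise OR:

```python
# Check x OR y = xy + x + y (mod 2) for every pair of Boolean values.
for x in (0, 1):
    for y in (0, 1):
        assert (x*y + x + y) % 2 == x | y
print("x | y == xy + x + y (mod 2)")
```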

This means we can write any Boolean operation involving XOR, AND, OR, and NOT in terms of an equivalent Boolean polynomial. Writing a logical expression as a Boolean polynomial is usually called Algebraic Normal Form, or ANF for short.
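The ANF of any Boolean function can even be computed mechanically from its truth table. The sketch below is a hypothetical helper (not from the text) using the standard fast Möbius transform; `anf_coefficients` and its indexing convention are my own naming choices:

```python
def anf_coefficients(truth_table):
    """Compute ANF coefficients of a Boolean function via the fast Moebius
    transform.

    truth_table: list of 0/1 values of length 2**n, where index i encodes
    the inputs as bits (bit 0 = x, bit 1 = y, ...). The result is indexed
    the same way: entry i is the coefficient of the monomial whose
    variables are the set bits of i (entry 0 is the constant term).
    """
    coeffs = list(truth_table)
    n = len(coeffs).bit_length() - 1
    step = 1
    for _ in range(n):
        for i in range(0, len(coeffs), 2 * step):
            for j in range(i + step, i + 2 * step):
                coeffs[j] ^= coeffs[j - step]
        step *= 2
    return coeffs

# OR has truth table [0, 1, 1, 1]; its ANF is x + y + xy, matching
# the formula x OR y = xy + x + y derived above.
print(anf_coefficients([0, 1, 1, 1]))  # → [0, 1, 1, 1]: 0 + x + y + xy
```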

Entering a logical expression, say x OR y, into WolframAlpha will give you a table of equivalent logical forms, including the ANF.