Computers are dumb.
Computers follow very rigid patterns of logic, which we call programs. Without understanding this logic, we cannot cause a computer to perform even the simplest tasks, leaving us without the foundation for the more complicated tasks.
What is logic?
Logic is, fundamentally, a system of rules by which we can derive certain truth. It was first formulated by Aristotle, who proposed the idea of the syllogism – a series of three statements which, provided the first two are true, ensures the truth of the third. This remains the basis of logic, as we can build complicated proofs using this model.
Consider the following classic proof:
Aristotle is a man.
All men are mortal.
Therefore, Aristotle is mortal.
The first and second statements (the premises) are almost certainly true. Based on the truth of these premises, we can conclude that the third statement (the conclusion) is true.
This is how we employ logic for computers, as well.
The first and second laws of logic
FIRST: A thing is always the same as itself.
That is to say, 0 is equal to 0 is equal to 1 - 1. If we ask whether an apple is an apple, the logical answer is always YES.
This leads to the second law:
SECOND: A thing cannot be and not be at the same time, in the same way.
That is, I cannot be both human and not human at the same time. I cannot eat fish and not eat fish at the same time. I cannot walk and not walk at the same time.
Consider the following premises:
I eat when I am hungry.
I am hungry.
Based on these premises, the logical conclusion is that I will eat.
If we negate the second premise (“I am NOT hungry”), the conclusion is that I will not eat.
Complicating it: OR, AND, XOR
In logical terms, OR is true when at least one of the conditions provided is true – including when both are.
Consider the following:
I eat when I am hungry or I am bored.
I am eating.
The conclusions that make this syllogism true are:
I am hungry.
I am bored.
I am hungry and I am bored.
Logical AND is true only when both conditions are true.
Consider the following:
I play video games when I am tired and when I have free time.
I am playing video games.
There is only one logical conclusion you can draw from these premises:
I am tired and I have free time.
XOR: It’s Either-Or
In English, we occasionally use the word “or” to mean “either-or”. In logic, we call this the EXCLUSIVE OR, or XOR.
Consider this syllogism:
I write blog posts on my desktop XOR my laptop.
I am writing my blog post.
The logical conclusion is that I am EITHER on my laptop OR on my desktop, but not both.
Putting it in code
That’s all well and good, you might say, but I’m here to learn programming, not philosophy. Here’s how these terms look in C syntax:
== EQUAL
!= NOT EQUAL
|| OR
&& AND
^ XOR
if(x) “If x is true, do something”
Simple enough, right? Let’s look at some simple examples:[code language="c"]
int i = 0, j = 1, k = 2, m = 4;

/* Example 1 */
if(i != 1)
    printf("%d != 1", i);

/* Example 2 */
if(i == 1 || j == 1)
    printf("%d OR %d is equal to 1", i, j);

/* Example 3 */
if(k == 2 && i == 1)
    printf("%d is 2 and %d is 1", k, i);
else if(k == 2 && i == 0)
    printf("%d is 2 and %d is 0", k, i);

/* Example 4 */
if(m == 4 ^ k == 2)
    printf("The power of XOR!");
else
    printf("Whoops.");

/* Example 5 */
if(m == k * k || i == 1 && j == 1)
    printf("What's the result?");
[/code]
In Example 1, we are testing whether i is not equal to 1. Because i is equal to 0, we know that the program will print our message.
In Example 2, we are testing whether i or j is equal to 1. Because j is equal to 1, we know that the program will print our message.
In Example 3, we are testing whether k is 2 and i is 1. Because i is not equal to 1, the test will fail, and the program will not print our message.
In Example 4, the result is true only when either m is 4 or k is 2, but not when both of these are true. Because m is 4 and k is 2, the program will print “Whoops.”, not “The power of XOR!”
Finally, in Example 5 we start to get a bit crazy. Try to figure it out for yourself, and we will discuss it in the next post.