Operations between Boolean variables
Last time we talked extensively about Boolean variables, outlining the main operations that can be carried out on them in practice.
Among all the cases examined, we have not yet covered the most important and most recurrent one: the case in which there is more than one condition to evaluate and, above all, some relationship between those conditions.
To handle these situations, George Boole invented in 1847 a type of algebra commonly called Boolean algebra. In the literature, Boolean algebra is defined as an algebra in which the values, called truth values, can only be "true" or "false", denoted with 1 and 0 respectively. It can be immediately noted that the "protagonist" entities of Boolean algebra are exactly those we have defined as Boolean variables.
Boolean operators
As in traditional algebra, in Boolean algebra it was also necessary to define operators in order to carry out operations between truth values.
Before examining the three fundamental operators, a new term should be introduced: we will call each member of a logical expression an operand.
AND operator - logical product
The first operation we examine is that of the logical product.
The situation is one in which we want to check that two or more conditions are true at the same time; in other words, we want to verify that a whole series of conditions holds. The and operator will then return true if all the operands of the condition are true at the same time.
The behaviour of a logical operator can be summarized through a truth table. So let's see how the and operator behaves with two variables.
A | B | A and B |
true | true | true |
true | false | false |
false | true | false |
false | false | false |
From this table you can see the behaviour mentioned above: the result of the and between two variables is true if and only if both operands have the truth value true.
The and operator also has several properties. Below I'll list only those that are useful and essential for developing working code:
- commutative: A and B = B and A. Simply put, it means that the order in which we execute the logical conjunction (an alternative name for and) is irrelevant;
- associative: A and (B and C) = (A and B) and C. This property tells us that however we group a series of conjunctions, the result will always be equivalent.
The syntax used in Java to express this operator is as follows:
condition1 && condition2 && ... && conditionN
So we see that in Java the and operator is written as &&.
Let's see a simple example:
int a = 5;
int b = 7;
if (a > 3 && b < 10) { ... }
In this particular case, we have two operands, a > 3 and b < 10. Evaluating them separately, we find that both are true. We then get an expression of the form true && true, which returns true.
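To tie the example to the truth table above, here is a minimal, self-contained sketch (the class name AndExample is mine, chosen just for illustration) that evaluates each operand separately and then the whole conjunction:
public class AndExample {
    public static void main(String[] args) {
        int a = 5;
        int b = 7;
        boolean first = a > 3;    // true
        boolean second = b < 10;  // true
        // true && true is true, as in the first row of the truth table
        System.out.println(first && second);  // prints true
        // commutative property: swapping the operands gives the same result
        System.out.println(second && first);  // prints true
    }
}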
OR operator - logical sum
The second fundamental operator of Boolean algebra is the operator or, also called logical sum or logical disjunction.
The logical disjunction returns true if and only if at least one of the operands is true. Therefore, when we have to evaluate a condition of the type condition1 or condition2 or ... or conditionN, the result will be true if and only if at least one of the conditions is true.
Let's analyse the operator's truth table.
A | B | A or B |
true | true | true |
true | false | true |
false | true | true |
false | false | false |
This table confirms what was said before: the logical disjunction of two or more operands is true if and only if at least one of the operands is true.
Just like and, or also has some properties that can be useful for development purposes. Let's see a couple of them:
- commutative: this property is similar to that of the logical product. It tells us that A or B = B or A.
- associative: it tells us that A or (B or C) = (A or B) or C. Reading it in natural language, it tells us that we can group the operands in any way and perform the disjunctions, obtaining an equivalent result.
In Java, this operator is expressed with the symbol ||.
Let's see an example similar to the previous one.
int a = 5;
int b = 7;
if (a > b || b < 10) { ... }
In this case, the evaluation of the condition returns true, even though one of the two operands is false. In fact, a is not greater than b. Despite this, the second operand is true, so the whole disjunction is true, since it reduces to an expression of the form false || true, which returns true.
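As before, here is a self-contained sketch of this case (the class name OrExample is my own, for illustration only), showing that a single true operand is enough to make the whole disjunction true:
public class OrExample {
    public static void main(String[] args) {
        int a = 5;
        int b = 7;
        boolean first = a > b;    // false: 5 is not greater than 7
        boolean second = b < 10;  // true: 7 is less than 10
        // false || true is true: one true operand is enough
        System.out.println(first || second);  // prints true
    }
}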
NOT operator - logical negation
This is the last basic operator and the only one we have, in some way, already seen. The not operator essentially acts as an inverter: when it receives true as input, it returns false, and vice versa.
Let's see the truth table of logical negation.
A | NOT A |
true | false |
false | true |
Now we can finally understand the notation given last time, which for your convenience I report below.
But first we need to know the syntax used to express the not operator. Well, we have already seen it: it uses the ! operator (exclamation mark).
int a = 5;
boolean condition = a < 3;
if (!condition) { ... }
We can finally understand the meaning of what has been written. The if condition will be translated as
if(!false)
From the table given before, you can see that !false = true. We then obtain code like
if(true)
which is valid, working code.
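Putting the pieces together, a minimal runnable version of this negation example could look like the sketch below (the class name NotExample and the printed message are mine):
public class NotExample {
    public static void main(String[] args) {
        int a = 5;
        boolean condition = a < 3;  // false: 5 is not less than 3
        // !false evaluates to true, so the body of the if is executed
        if (!condition) {
            System.out.println("condition is false, so !condition is true");
        }
    }
}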
Lazy evaluation order
We have introduced the three main logical operators. To conclude this first overview, it is good to introduce one last concept, which is the lazy evaluation order.
This means that, when an and or an or is evaluated, a strategy is used such that the number of operands evaluated is the minimum necessary. Let's take a closer look:
- lazy evaluation of the logical conjunction: the logical product (and) is evaluated operand by operand as long as the operands examined are true. As soon as a false operand is found, the evaluation of the whole condition is interrupted and the result is false.
- lazy evaluation of the logical disjunction: the logical disjunction (or) is evaluated operand by operand until a true operand is found. When a true operand is found, the evaluation is interrupted and the whole condition is evaluated as true.
This tells us that, when we have many conditions joined by and, it is good to write first those that are most likely to be false, so as to avoid long evaluations. Conversely, with disjunctions, it is preferable to write first the conditions that are most likely to be true.
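A simple way to observe lazy (short-circuit) evaluation is to put a method call with a visible side effect on the right-hand side of the operator. The sketch below is only an illustration (the class and method names are mine): with && the right operand is skipped when the left one is false, and with || it is skipped when the left one is true.
public class LazyEvaluationExample {
    // prints a message so we can see whether it is ever evaluated
    static boolean rightOperand() {
        System.out.println("right operand evaluated");
        return true;
    }

    public static void main(String[] args) {
        boolean left;

        left = false;
        // the left operand is false, so && stops immediately:
        // "right operand evaluated" is NOT printed
        System.out.println(left && rightOperand());  // prints false

        left = true;
        // the left operand is true, so || stops immediately:
        // "right operand evaluated" is NOT printed here either
        System.out.println(left || rightOperand());  // prints true
    }
}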