For a number of games under development, I have a skill system where players select their bonus skills. Many of the skills have prerequisites. Each skill is assigned a bit within an integer (if more than 32 bits were needed, more than one integer could be used, though for me this isn't an issue, as skills are grouped into sets that contain 16 skills each, so only 16 bits are needed). Each skill record has an integer for its prerequisites, and the character class has an integer for each skill set indicating which skills are known. To see whether a skill can be learned, a simple bitwise AND is performed on the prerequisite mask and the character's skill set; if the result equals the prerequisite mask, the skill can be learned.
The same check can be done without binary logic operators by using an array of booleans, though checking prerequisites then becomes a bit more complicated. First, a meets-prerequisites flag is set to true. A for loop then walks through the prerequisites array, checking whether each prerequisite flag is set. If it is, the corresponding skill in the character sheet is checked; if that character-sheet skill is false, the meets-prerequisites flag is set to false and the loop exits early.
As many languages use a full integer to represent a Boolean variable, this is a potential waste of 31 bits per flag; in our 16-skill example, that is roughly a 16-fold increase in memory. Some compilers offer memory optimizations that pack the bits into a single integer for you, but even then the compiler still has to generate the looping code for checking prerequisites, with a rather significant speed difference as a result. Of course, computers now have lots of memory and are also very fast, so such concerns can be overlooked. But since writing the code to be efficient in the first place is so easy, there really is no excuse for such a waste of memory and CPU cycles.