C Question

Does "&" vs. "&&" actually make a difference for compile-time flags?

I have a habit of using the following syntax in my compile-time flags:

#if (defined(A) & defined(B))


It's usually suggested that I do it with && as follows:

#if (defined(A) && defined(B))


I know the difference between the two operators, and that in normal code && would short-circuit. However, the above is all handled by the compiler. Does it even matter what I use? Does it affect compile time by some infinitesimal amount because it doesn't evaluate the second defined()?

Answer

Since defined(SOMETHING) yields 0 or 1, you're guaranteed 0 or 1 on both sides, so it makes no technical difference whether you use & or &&.
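As a minimal sketch (FOO and BAR are just placeholder names, not from the question), both operators end up computing 1 & 1 or 1 && 1 and select the same branch:

#define FOO
#define BAR

#if defined(FOO) & defined(BAR)    /* 1 & 1  -> true */
#define BOTH_BITWISE 1
#endif

#if defined(FOO) && defined(BAR)   /* 1 && 1 -> true */
#define BOTH_LOGICAL 1
#endif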

It's mostly about good habits (using & could carry over to some situation where it would be wrong) and about writing code that is easy to grasp by simple pattern matching. A & in there causes a millisecond pause while one considers whether it possibly could be a bit-level thing.
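For instance (an illustrative sketch, not part of the original answer), carrying the & habit into run-time code loses short-circuiting, so the right-hand operand is evaluated even when the left one is false:

#include <stdio.h>
#include <string.h>

/* With &, strcmp is called even when s is NULL -- undefined behavior.
   With &&, the NULL check short-circuits and strcmp is never reached. */
static int is_hello(const char *s)
{
    return (s != NULL) & (strcmp(s, "hello") == 0);   /* buggy          */
    /* return (s != NULL) && (strcmp(s, "hello") == 0);   safe variant  */
}

int main(void)
{
    printf("%d\n", is_hello("hello"));   /* prints 1 */
    return 0;
}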

On the third hand, in an #if condition you can't use the keyword and, which you can use¹ in ordinary C++ code.

Notes:
¹ With Visual C++ you can use and via a forced include of <iso646.h>.
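The note itself is about ordinary C++ built with Visual C++ (where the forced include would typically be the /FI compiler option); as an illustrative sketch of the same mechanism in plain C, where and is never a keyword and always comes from <iso646.h>:

#include <iso646.h>   /* defines and, or, not as macros */
#include <stdio.h>

int main(void)
{
    int a = 1, b = 0;
    if (a and not b)          /* expands to: if (a && !b) */
        printf("ok\n");
    return 0;
}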
