Cesco Cesco - 1 month ago 8
Javascript Question

Undefined variables, value == false versus !value

I have a problem with a very simple piece of code written in Javascript, could you help me please?

Here's what I think I have understood so far about JavaScript and variables:


  • An undefined value evaluates to false in a boolean context

  • By using the == operator in a comparison, you're asking whether two values are equal regardless of their types



I found an exercise file in an online course and tried to do it, but I didn't get the same result expected in the lesson; the main problem was that I was comparing the value with "if (value == false) { ... }" while the solution used "if (!value) { ... }".

So I decided to write a very short piece of code to try it myself, but I'm getting mixed results. In the example below I would expect this JS code to generate two identical alerts ("foo is equal to false"), but instead the first if statement shows "foo IS NOT equal to false" while the second if shows (as expected) "foo is equal to false".

This is what I wrote:

var foo = undefined;

if (foo == false) {
    alert("foo is equal to false");
} else {
    alert("foo is not equal to false"); // JavaScript executes this branch
}

if (!foo) {
    alert("foo is equal to false"); // JavaScript executes this branch
} else {
    alert("foo is not equal to false");
}


AFAIK the two ifs should do the same work, and in fact when I replaced "var foo = undefined;" with "var foo = 0;" in the first line, it worked as expected; and 0 is another value that should evaluate to false, or at least that's what I remember.

Could you tell me what I'm doing wrong?

Answer

The == algorithm (the Abstract Equality Comparison Algorithm) isn't something where you can simply assume an outcome; you need to know the details of how it works.

For example, null and undefined are a special case: they undergo no type conversion at all, other than being considered equal to each other (and to themselves).
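A quick sketch of that special case, runnable in any browser console or Node:

```javascript
// null and undefined are loosely equal only to each other
// and to themselves; no conversion to 0 or false happens.
console.log(undefined == null);   // true
console.log(undefined == false);  // false
console.log(null == false);       // false
console.log(null == 0);           // false
```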

Otherwise there's typically a type conversion that tries to reduce both operands to a common type. This often ends up being a toNumber conversion.

That's why:

  • null == undefined; // true

  • null == 0; // false

  • +null == '0' // true
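The third bullet works because the unary + forces null out of the special case before == runs. A short sketch of that ToNumber reduction:

```javascript
// Unary + applies the same ToNumber conversion the == algorithm
// uses internally, so +null leaves the null/undefined special case.
console.log(Number(null));   // 0 — +null produces the same value
console.log(+null == '0');   // true: 0 == '0' becomes 0 == 0
console.log('1' == 1);       // true: Number('1') is 1
console.log('' == 0);        // true: Number('') is 0
```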

So if you know how the algorithm works, you know that undefined never equals anything except for undefined and null, but other types that are not strictly equal may be coerced down to types that are equal.

So doing if(!x) vs if(x==false) are entirely different tests.

  • if(!x) performs toBoolean conversion.

  • if(x == false) uses a complex algorithm to decide the proper conversion.
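You can see the two tests diverge by running both over a handful of falsy-looking values; note that they disagree in both directions (undefined fails == false, while the truthy string "0" passes it):

```javascript
// Compare ToBoolean negation against loose equality with false
// for values people commonly expect to behave "falsy".
const values = [undefined, null, NaN, 0, '0', '', false];
for (const x of values) {
  console.log(JSON.stringify(String(x)), '!x =', !x, '| x == false =', x == false);
}
```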

So with...

if(x == false)

...if x is undefined, it is determined to not be equal to false, yet if x is 0 or even "0", it will be considered equal to false.

  • 0 == false; // true

  • "0" == false; // true