
What does JavaScript interpret `+ +i` as?

An interesting thing I've never seen before was posted in another question. They had something like:

var i = + +1;


They thought the extra `+` converted it to a string, but they were simply adding to a string, which is what caused the conversion.
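
For illustration, a minimal sketch of what was actually happening in that case (binary `+` with a string operand concatenates; unary `+` converts to a number):

var s = '10' + 1;         // binary + with a string operand concatenates: '101'
var n = +'10' + 1;        // unary + converts '10' to the number 10 first: 11
console.log(typeof s, s); // string 101
console.log(typeof n, n); // number 11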

This, however, led me to the question: what is going on here?

I would actually have expected that to be a syntax error, but JavaScript (at least in Chrome) is just fine with it... it basically does nothing.

I created a little JSFiddle to demonstrate:

var i = 5;
var j = + +i;  // two unary pluses: + (+i)
document.body.innerHTML = i === j ? 'Same' : 'Different'; // prints "Same"


Anyone know what's actually occurring and what JavaScript is doing with this process?

I thought maybe it would treat it like `++i`, but it doesn't increment, and you can even do it with a plain value (e.g., `+ +5`), which you can't do with `++` (e.g., `++5` is a reference error).
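
A quick check in the console (exact error wording varies by engine):

var a = + +5;   // fine: two unary pluses applied to the literal 5
console.log(a); // 5

// ++5;         // not allowed: ++ needs a reference (a variable), not a literal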

Spacing also doesn't affect it (e.g., `+ + 1` and `+ +1` are the same).

My best guess is that it's essentially treating them as positive/negative signs and putting them together. It looks like `1 == - -1` and `-1 == + -1`, but that is just so weird.
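
For instance:

console.log(- -1);        // 1  (negate, then negate again)
console.log(+ -1);        // -1 (negate, then a no-op conversion)
console.log(1 === - -1);  // true
console.log(-1 === + -1); // true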

Is this just a quirky behavior, or is it documented in a standard somewhere?

Answer

Putting the statement through AST Explorer, we can see that what we get here is two nested unary expressions, each with the unary `+` operator.

It's a unary expression consisting of `+` and `+i`, and `+i` is itself a unary expression consisting of `+` and `i`.
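
Simplified, the tree looks roughly like this (ESTree-style, with fields such as prefix omitted):

{
  "type": "UnaryExpression",
  "operator": "+",
  "argument": {
    "type": "UnaryExpression",
    "operator": "+",
    "argument": { "type": "Identifier", "name": "i" }
  }
}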

A unary expression with the unary `+` operator converts its operand to a number. So you're essentially converting `i` to a number, then converting the result to a number again (which is a no-op).
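
A few quick examples of unary + acting as a number conversion (it behaves like Number(x)):

console.log(+'5');   // 5   (string to number)
console.log(+true);  // 1   (boolean to number)
console.log(+ +'5'); // 5   (outer + is a no-op once the inner + yields a number)
console.log(+'abc'); // NaN (conversion can still fail)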

For the sake of completeness, it works with as many levels as you add:

var i = 5;
console.log(+ + + + + +i); // 5
console.log(i); // still 5