I came across this piece of code:

    var timeStamp = 1 * new Date();

What's happening under the hood?
**The short version:**

Because the date is being used in a math operation, it's converted to a number, and when you convert a date to a number, the number you get is its milliseconds-since-the-Epoch value (the same value you get from `getTime`).
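A minimal sketch of that equivalence:

```js
// Multiplying a Date by 1 coerces it to a number: the
// milliseconds-since-the-Epoch value, same as getTime().
var d = new Date();
var timeStamp = 1 * d;

console.log(timeStamp === d.getTime()); // true
```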
**The long version:**

When an object (like a date) is used as an operand in a math operation, it's converted to a number. For objects, that conversion calls the abstract operation `ToPrimitive` on the object, with the "preferred type" being "number".

For most types of objects (including dates), `ToPrimitive` calls the abstract operation `[[DefaultValue]]`, passing along the preferred type as the "hint".

`[[DefaultValue]]` with hint = "number" calls `valueOf` on the object. (`valueOf` is a real method, unlike the abstract operations above.)

For dates, `valueOf` returns the "time value": the milliseconds-since-the-Epoch number, the same value you get from `getTime`.
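You can see the end of that chain directly, since `valueOf` is a real, callable method:

```js
// valueOf is what the coercion machinery ultimately calls on a Date;
// it returns the same time value as getTime().
var d = new Date();

console.log(d.valueOf() === d.getTime()); // true
console.log(1 * d === d.valueOf());       // true
```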
Side note: There's no reason I can think of to use `var timeStamp = 1 * new Date()` rather than, say, `var timeStamp = +new Date()`, which has the same effect. Or of course, on any modern engine (and the shim is trivial), `var timeStamp = Date.now()`.
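For completeness, the shim mentioned above really is trivial; a sketch for very old (pre-ES5) engines:

```js
// Shim Date.now where it's missing: +new Date() gives the same
// milliseconds-since-the-Epoch number.
if (!Date.now) {
  Date.now = function now() {
    return +new Date();
  };
}

var timeStamp = Date.now();
console.log(typeof timeStamp); // "number"
```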