Dan James Palmer - 1 year ago
JavaScript Question

setInterval at 1ms doesn't seem to actually be 1ms

I'm trying to time how long a file takes to download using an XMLHttpRequest, like so:

var time_taken = 0;

function getFile() {
    'use strict';
    var url = "data.bin";
    var rawFile = new XMLHttpRequest();
    var timer_var = setInterval(theTimer, 1);

    rawFile.open("GET", url, true);
    rawFile.onreadystatechange = function () {
        if (rawFile.readyState === XMLHttpRequest.DONE && rawFile.status === 200) {
            clearInterval(timer_var);
            toLog("Milliseconds for download: " + time_taken);
        }
    };
    rawFile.send();
}

function theTimer() {
    'use strict';
    // Each tick should, in theory, correspond to one elapsed millisecond
    time_taken++;
}

As you can see, I have setInterval(theTimer, 1) firing every one millisecond. All theTimer does is increment a variable, which in theory should end up holding, in milliseconds, how long the interval has been running.

When the file has been downloaded I output the data, clear the timer and display the time in ms. However, the value doesn't add up: it should be almost 2 seconds, but it only stands at around 250ms.

Why isn't the interval firing truly every 1ms?

Answer Source

From the docs:


The time, in milliseconds (thousandths of a second), the timer should delay in between executions of the specified function or code. If this parameter is less than 10, a value of 10 is used. Note that the actual delay may be longer;

(Mozilla, but other browsers seem to use similar values)

There is also a reference to the documentation of setTimeout mentioning reasons why "the actual delay may be longer".

In short:

  • There is a minimum delay (already in the spec) for nested timeouts
  • Inactive tabs may clamp timeouts to reduce load
  • The page/browser/OS might be busy with other tasks
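You can see the clamping directly rather than take it on faith. Here is a minimal sketch (run in a browser console; the exact numbers vary by browser and load) that counts ticks of a nominal 1ms interval and compares them with wall-clock time:

// Count ticks of a "1ms" interval for 2 seconds, then compare with the
// actual elapsed time measured via Date.now().
var ticks = 0;
var start = Date.now();

var id = setInterval(function () {
    ticks++;
}, 1);

setTimeout(function () {
    clearInterval(id);
    var elapsed = Date.now() - start;
    // With a 4-10ms clamp, "ticks" lands far below "elapsed":
    // roughly elapsed / 4 (or / 10), e.g. a few hundred ticks for ~2000ms.
    console.log("ticks: " + ticks + ", elapsed ms: " + elapsed);
}, 2000);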

In modern browsers there seems to be a way to schedule callbacks without that clamping using window.postMessage (see the sketch below).
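A minimal sketch of that trick (the name setZeroTimeout and the marker string are illustrative, not part of any standard API): posting a message to your own window queues a task that is not subject to the nested-timeout clamping, so the callback tends to fire sooner than setTimeout(fn, 0).

// Queue of pending callbacks and an arbitrary marker string for our messages
var zeroTimeoutQueue = [];
var ZERO_TIMEOUT_MESSAGE = "zero-timeout";

function setZeroTimeout(fn) {
    zeroTimeoutQueue.push(fn);
    // Post to our own window; the message event fires asynchronously
    // but without the 4-10ms minimum delay of nested timers.
    window.postMessage(ZERO_TIMEOUT_MESSAGE, "*");
}

window.addEventListener("message", function (event) {
    if (event.source === window && event.data === ZERO_TIMEOUT_MESSAGE) {
        event.stopPropagation();
        if (zeroTimeoutQueue.length > 0) {
            zeroTimeoutQueue.shift()();
        }
    }
}, true);

Usage would simply be setZeroTimeout(function () { /* ... */ }) wherever you would otherwise write setTimeout(fn, 0).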
