
Can setInterval drift over time?

I have two Node.js web servers, each caching data in memory. I synchronize the cache load/clear across them based on system time, and I have already time-synced all of my hosts.

Right now I clear the cache every 15 minutes using the following code:

// milliseconds remaining until the top of the next hour (UTC-aligned)
var millisTillNextHour = 3600000 - (Date.now() % 3600000);

setTimeout(function() {
  setInterval(function() {
    cache.clear();
  }, 60000 * 15);
}, millisTillNextHour);

My expectation is that even if this process runs forever, the cache will be cleared on every quarter-hour mark of every hour of the day.

My question is: can setInterval drift over time?

For example: right now it clears the cache at 10:00, 10:15, 10:30, 10:45, 11:00, and so on.

Could it happen that the setInterval callback fires at 10:20 system time when it was supposed to clear the cache at 10:15?

I am not sure how this works under the hood. Please shed some light; I hope I have explained my question well.
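
To make the question concrete, here is a minimal check that logs how far each setInterval tick lands from its ideal schedule (the one-second interval and ten-tick cutoff are just illustrative):

var start = Date.now();
var tick = 0;

var timer = setInterval(function () {
    tick++;
    // how far this tick landed from its ideal time of start + tick * 1000
    var drift = (Date.now() - start) - tick * 1000;
    console.log('tick ' + tick + ': drift ' + drift + 'ms');
    if (tick >= 10) clearInterval(timer); // stop after ten samples
}, 1000);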

asked Oct 21 '25 by GJain

1 Answer

I'm probably more than a bit late to the party here, but this is how I solved this particular time-slipping problem just now, using a recursively called setTimeout() function instead of using setInterval().

var interval = 5000;
var adjustedInterval = interval;
var expectedCycleTime = 0;

function runAtInterval() {
    // get timestamp at the very start of the function call
    var now = Date.now();

    // log with a timestamp to show the actual interval
    console.log(new Date(now).toISOString().replace(/T/, ' ').replace(/Z/, '') + " runAtInterval()");

    // set the next expectedCycleTime and adjustedInterval
    if (expectedCycleTime === 0) {
        // first run: no drift to correct yet
        expectedCycleTime = now + interval;
    }
    else {
        // deduct this cycle's drift (positive or negative) from the base interval
        adjustedInterval = interval - (now - expectedCycleTime);
        expectedCycleTime += interval;
    }

    // function calls itself after a delay of adjustedInterval
    setTimeout(runAtInterval, adjustedInterval);
}

// kick off the loop
runAtInterval();

On each iteration, the function checks the actual execution time against the previously calculated expected time, then deducts the difference from 'interval' to produce 'adjustedInterval'. This difference may be positive or negative, and the results show that actual execution times tend to oscillate around the 'true' value by roughly +/- 5 ms.

Either way, if you've got a task that is executing once a minute, and you run it for an entire day, using this function you can expect that - for the entire day - every single hour will have had 60 iterations happen. You won't have that occasional hour where you only got 59 results because eventually an entire minute had slipped.
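
For the questioner's specific 15-minute case, there is also an alternative worth noting: rather than correcting accumulated drift, each cycle can simply recompute its delay from the system clock, so firing times stay pinned to the quarter-hour marks. A minimal sketch, assuming the cache object with its clear() method from the question:

// Fire on every quarter-hour boundary by recomputing the delay from the
// system clock on each cycle, so error can never accumulate.
var QUARTER_HOUR = 15 * 60 * 1000;

function scheduleNextClear() {
    // ms remaining until the next :00/:15/:30/:45 mark (UTC-aligned)
    var delay = QUARTER_HOUR - (Date.now() % QUARTER_HOUR);
    setTimeout(function () {
        cache.clear();       // 'cache' is assumed from the question
        scheduleNextClear(); // re-anchor to the wall clock every time
    }, delay);
}

scheduleNextClear();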

answered Oct 23 '25 by HomerPlata


