
What is the proper way to use clock_gettime()?

Tags: c, time.h

I was trying out this function in a C program, and it keeps printing the wrong time. This is my code at the moment:

#include <stdio.h>
#include <unistd.h>
#include <time.h>
#include <sys/resource.h>

int main( int argc, char **argv ){
    struct timespec start, finish;
    clock_gettime( CLOCK_REALTIME, &start );
    sleep( 1 );
    clock_gettime( CLOCK_REALTIME, &finish );
    printf( "%f\n", ((double) (finish.tv_nsec - start.tv_nsec))/((double) 100000) );
    return 0;
}

I'm not sure whether this is an anomaly caused by rounding errors in the conversion to double or whether I'm using clock_gettime() incorrectly, but I expected it to print 1 second and instead it prints 1.27 seconds.

asked Nov 17 '25 by Zen Hacker

1 Answer

You need to take into account the tv_sec member of the structure when calculating the time difference between two values returned by clock_gettime().

The tv_nsec member is the number of nanoseconds within the current second; in theory it ranges from 0 to 999,999,999. This allows an integer number of whole seconds to be stored in tv_sec and the fraction of a second to be stored in tv_nsec. Actual resolution is a separate issue: see clock_getres() for that. On a Mac, for instance, the resolution is microseconds, even though the values are expressed in nanoseconds.
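As an aside, if you want to see what resolution a given clock advertises, a minimal sketch using clock_getres(), which fills in a struct timespec just like clock_gettime(), could look like this:

#include <stdio.h>
#include <time.h>

int main(void)
{
    struct timespec res;

    /* Report the tick size advertised for CLOCK_REALTIME. */
    if (clock_getres(CLOCK_REALTIME, &res) == 0)
        printf("CLOCK_REALTIME resolution: %ld.%.9ld s\n", (long)res.tv_sec, res.tv_nsec);
    return 0;
}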

Consider using code like this:

#include <stdio.h>
#include <time.h>
#include <unistd.h>

enum { NS_PER_SECOND = 1000000000 };

void sub_timespec(struct timespec t1, struct timespec t2, struct timespec *td)
{
    td->tv_nsec = t2.tv_nsec - t1.tv_nsec;
    td->tv_sec  = t2.tv_sec - t1.tv_sec;
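    /* Normalize: borrow or carry a whole second so that tv_sec and tv_nsec
       end up with the same sign (tv_nsec stays within +/- NS_PER_SECOND). */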
    if (td->tv_sec > 0 && td->tv_nsec < 0)
    {
        td->tv_nsec += NS_PER_SECOND;
        td->tv_sec--;
    }
    else if (td->tv_sec < 0 && td->tv_nsec > 0)
    {
        td->tv_nsec -= NS_PER_SECOND;
        td->tv_sec++;
    }
}

int main(void)
{
    struct timespec start, finish, delta;
    clock_gettime(CLOCK_REALTIME, &start);
    sleep(1);
    clock_gettime(CLOCK_REALTIME, &finish);
    sub_timespec(start, finish, &delta);
    printf("%d.%.9ld\n", (int)delta.tv_sec, delta.tv_nsec);
    return 0;
}

When run (as cgt61), I get results like:

$ cgt61
1.004930000
$ cgt61
1.004625000
$ cgt61
1.003023000
$ cgt61
1.003343000
$

This was tested on a Mac; you can see that the final three digits are always zero, consistent with the microsecond resolution mentioned earlier. In a Linux VM (Ubuntu 18.04 on a Mac), I had to add #define _POSIX_C_SOURCE 200809L to the code because I compile with -std=c11 (with -std=gnu11 it would have been fine), and the output was:

$ ./cgt61
1.000589528
$
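For reference, the feature-test macro has to appear before the first #include when building with a strict standard option such as -std=c11; one minimal layout (the compile command below is just an example) is:

/* Define before any header so glibc exposes clock_gettime() and sleep()
   when compiling with -std=c11. */
#define _POSIX_C_SOURCE 200809L

#include <stdio.h>
#include <time.h>
#include <unistd.h>

/* ... rest of the program as shown above ... */

compiled with, for example:

$ gcc -std=c11 -o cgt61 cgt61.c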
answered Nov 18 '25 by Jonathan Leffler