Marcin 'Qrczak' Kowalczyk
qrczak at knm.org.pl
Wed Jan 26 13:34:44 EST 2005
Keean Schupke <k.schupke at imperial.ac.uk> writes:
> Yes, timing between points in time should be independent of leap
> seconds, so if a program takes 5 seconds to run it should always
> take 5 seconds, even if it runs across midnight.
How would you implement it, given that gettimeofday() and other Unix
calls which return the current time either gradually slow down near a
leap second (if NTP is used to synchronize clocks), or the clock is
not adjusted at all, and the next time it is set it will simply be
about one second later (if it is being set manually)?
If you assume that gettimeofday() returns UTC and you convert it to
TAI using a leap second table, more often than not you would actually
introduce a jump near a leap second yourself - compensating for an
extra UTC second which the system clock never observed in practice.
__("< Marcin Kowalczyk
\__/ qrczak at knm.org.pl