Grant Edwards
Is there a timer chip that is programmed to count in exactly
1us steps?
No, but the value returned by gettimeofday() is a long integer
that counts seconds along with a long integer that counts
microseconds. The resolution of the data seen by Python's time
module is 1us.
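A minimal sketch of how that second/microsecond pair surfaces
in Python, assuming a Unix build where time.time() is backed by
gettimeofday():

    import time

    # time.time() folds the two gettimeofday() integers
    # (the tv_sec and tv_usec fields) into a single float.
    t = time.time()
    seconds = int(t)                                  # tv_sec part
    microseconds = round((t - seconds) * 1_000_000)   # tv_usec part
    print(seconds, microseconds)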
The underlying hardware has a much finer resolution (as shown
by the clock_gettime() call), but the resolution of the system
call used by Python's time module on Unix is exactly 1us.
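For what it's worth, Python 3 exposes both interfaces, so the
difference is easy to see directly (the printed resolution below
is a typical Linux result, not a guarantee):

    import time

    # clock_getres() reports the underlying clock's resolution;
    # on a typical Linux box this prints 1e-09 (1 ns), far finer
    # than the 1us granularity of the gettimeofday() interface.
    print(time.clock_getres(time.CLOCK_REALTIME))
    print(time.clock_gettime(time.CLOCK_REALTIME))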
If this is trying to be platform independent, I think it has
to be faking it sometimes. E.g., I thought on Windows you
could sometimes get a time based on a Pentium time stamp
counter, which is read as a 64-bit value with an RDTSC
instruction and counts at the full CPU clock rate.
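For the curious, Python's time.perf_counter() is the documented
way to reach that high-resolution counter: on Windows it wraps
QueryPerformanceCounter, which on suitable hardware is itself
backed by the TSC that RDTSC reads. A quick sketch (the summed
range is just placeholder work to time):

    import time

    # perf_counter() returns fractional seconds from the
    # platform's best available counter; only differences
    # between two readings are meaningful.
    start = time.perf_counter()
    sum(range(1_000_000))          # some work to time
    elapsed = time.perf_counter() - start
    print(elapsed)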
I assume that's what the underlying Linux system call is doing
(I haven't looked). Then it just rounds/truncates to the
nearest microsecond (because that's what the BSD/SysV/POSIX API
specifies) when it returns the answer that Python sees.
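Python 3.7's time.time_ns() makes that truncation easy to
mimic: integer-divide the nanosecond count by 1000 and you get
the same whole-microsecond value the old API hands back (a
sketch, assuming a clock that actually ticks finer than 1us):

    import time

    ns = time.time_ns()   # full nanosecond count from the OS
    us = ns // 1000       # truncated to whole microseconds,
                          # like the gettimeofday()-era API
    print(ns, us)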