I've just installed an RTOS that runs on PCs called OnTime. I'm going
to do some experimenting, but don't you think that, given enough time,
the program will still continue to perform as it does on a
general-purpose OS? Try increasing the size of the loops and leave it
running overnight. Do you think you could do that for me and tell me
how it performs? I need to know whether it works or not.
A 24-bit counter running at 1 kHz will overflow after about 4.7 hours
(2^24 ms is roughly 16,777 seconds). In my case the maximum error you
can get is +/- 1 tick. I guess it was just luck that both runs produced
the same result. The +/- 1 tick error comes from the possibility of
sampling the counter mid-transition. Say, for example, you sample the
counter just as it is incrementing from 1000 to 1001. In that case you
have a 50% chance of reading either 1000 or 1001. But just like your
Unix experiment, this says nothing about time travelling and everything
about sampling theory.
I can actually construct a set-up that guarantees the same result on
every run, simply by using the same clock source to drive both the
counter and the CPU. In that case the CPU runs in sync with the clock
regardless of the accuracy of the clock source. Such a setup even
works if you keep varying the clock frequency, because the CPU
executes instructions synchronously with the clock.
Think of it this way. If the CPU needs to execute exactly 100
instructions for each iteration of the loop, and each instruction
executes in exactly 2 clock cycles, then each iteration will execute
in exactly 200 clock cycles. Now, the 'clock' we are talking about
here is the square wave used to drive the CPU. If we then use this
same square wave as the basis for the CPU to measure time, of course
the CPU will never disagree with its own time measurement, assuming
nothing else introduces delays or jitter into the instruction stream,
such as interrupts.
If, like my experiment above, we use two different clock sources, one
to drive the CPU and another to drive the counter, then what you are
measuring is not "time travel" but simply the relative accuracy of the
two square waveforms, which can indeed be seen visually if the two
square waves are fed into an oscilloscope. In this case an error can
occur if you happen to sample the counter right where the two square
waves cross a transition:
(view using fixed width font or the alignment will be all wrong)

clockA 000111000111000111
clockB 00001111000011110000
                   ^
                   |
       if you happen to sample here
       then you may get clockB's reading
       as either 0 or 1
When interrupts come into play, the reading may be delayed by as much
time as it takes for the interrupt service routine to complete. So if
you're going to use OnTime's RTOS, make sure you're not using
preemptive or time-sliced multitasking, and make sure you're not using
the real-time kernel. Use simple cooperative multitasking and turn off
all interrupts. Actually, for the best results, use DOS and a DOS
compiler like DJGPP.
Now finally, from a physical standpoint, what exactly *is* time
travelling? Your CPU? There is only one CPU, so what is it travelling
to or away from, itself? This experiment does not show the CPU time
travelling, but rather the software running on the CPU appearing to
"time travel". In which case you need to understand that software is
not physical at all, so all bets are off. Software is just like words
coming out of my mouth. If I say:
The quick brown fox jumps over the lazy dog.
and then later say:
The quick brown dog fox jumps over the lazy.
then did the word "dog" time travel in the second instance, since it
now appears before the word "fox"? Of course not. It is just how I
decided to utter the string of words, just like how a CPU decides
which instruction to execute on a modern PC. On a modern PC, groups of
instructions are scheduled preemptively, with higher-priority groups
able to interrupt lower-priority ones, and instructions themselves are
often executed out of order.
You can conduct the same experiment as your code using a human instead
of a CPU. Ask your friend to say "The quick brown fox jumps over the
lazy dog" and measure the time between the words "fox" and "dog". Each
run will give slightly different results, not because you measured
time inaccurately, and not because the word "dog" time travelled into
the past or future, but because your friend takes a different amount
of time to utter the sentence each run, with different lengths of
pauses between words and different things distracting him. This is
exactly what happens in a multitasking OS like Unix or Windows.