OK, but I write in C too; when I use a timer in C (SetTimer)...
The actual timer function, as defined by the GNU C library,
uses clock() to return the current time in milliseconds. However, I've
found that both it and the Windows API's GetTickCount() are
only accurate to about 16 milliseconds.
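
If you want to see that granularity yourself, here's a minimal sketch
(plain Win32 C) that spins until the tick counter advances and prints
the step size; on a default Windows setup it typically reports 15 or
16 ms:

#include <stdio.h>
#include <windows.h>

int main(void)
{
    /* Spin to a tick boundary, then measure the size of one tick step. */
    DWORD t0 = GetTickCount();
    while (GetTickCount() == t0) ;   /* wait for the counter to advance */
    DWORD t1 = GetTickCount();
    while (GetTickCount() == t1) ;   /* wait for the next advance */
    DWORD t2 = GetTickCount();
    printf("GetTickCount() step: %lu ms\n", (unsigned long)(t2 - t1));
    return 0;
}
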
When I need a high-accuracy timer
(on Windows only, but since you are using C#, you are probably running
Windows anyway), I usually use the Windows API's QueryPerformanceFrequency()
to get the counter frequency, then use QueryPerformanceCounter() (or maybe
rdtsc...) to get the current timestamp, and divide the latter by the
former. That works very well, except that you need to do 64-bit
arithmetic or convert the values to double using shift-add.
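
To make that concrete, here's a minimal sketch of the approach I mean;
the LARGE_INTEGER/QuadPart handling is where the 64-bit arithmetic
comes in (Sleep(1) is just a stand-in for whatever work you're timing):

#include <stdio.h>
#include <windows.h>

int main(void)
{
    LARGE_INTEGER freq, start, end;

    /* Counter ticks per second (fixed at boot, not necessarily CPU speed). */
    QueryPerformanceFrequency(&freq);

    QueryPerformanceCounter(&start);
    Sleep(1);                        /* stand-in for the work being timed */
    QueryPerformanceCounter(&end);

    /* 64-bit tick delta divided by ticks-per-second gives seconds. */
    double elapsed = (double)(end.QuadPart - start.QuadPart)
                     / (double)freq.QuadPart;
    printf("elapsed: %.6f s (%.3f ms)\n", elapsed, elapsed * 1000.0);
    return 0;
}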