(e-mail address removed) (Dan Pop) wrote in message:

> You need a 16-bit counter for microseconds, a 16-bit counter for
> milliseconds and a 32-bit counter for seconds. There are only a few bits
> wasted in the first two counters (which only use 10 bits out of 16).
>
> Dan
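
For reference, Dan's layout maps onto something like this (just a sketch;
the struct and field names are mine, not from his post):

/* Three counters as Dan describes: 16-bit microseconds and milliseconds
   (each only uses values 0..999, i.e. 10 of the 16 bits) plus a full
   32-bit seconds counter. */
struct soft_clock {
    unsigned short usecs;   /* 0..999, carries into msecs */
    unsigned short msecs;   /* 0..999, carries into secs  */
    unsigned long  secs;    /* 32-bit seconds counter     */
};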
After some soul searching, I've come up with this new clock
module (it only works on big-endian machines):
/* 4 millisecond increment, expressed in 1/65536-second units,
   rounded to the nearest integer */
#define GCLK_INC ((unsigned short)(4.0 * (65536.0 / 1000.0) + 0.5))
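That works out to (unsigned short)(4 * 65536 / 1000 + 0.5) =
(unsigned short)262.644 = 262, so each 4 ms tick actually advances
fractSecs by 262, i.e. about 3.998 ms.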
/* use this type in your code */
typedef unsigned long GCLK_T;
/* Do not use this type in your code. It's only used internally in gclk.c */
typedef union {
    struct {
        unsigned long  secs;
        unsigned short fractSecs;   /* 1/65536 second */
    } inc;
    struct {
        unsigned char unused;
        GCLK_T        ticks;        /* upper 3 bytes are seconds,
                                       lower byte is 1/256 second */
    } gclk;
} GCLK_LOW_LEVEL_T;
GCLK_LOW_LEVEL_T gzGClk;
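
Quick sanity check of the overlay (a sketch; it assumes a 32-bit big-endian
unsigned long and, importantly, that the compiler inserts no padding after
'unused', which many compilers need a pack pragma or attribute to guarantee):

#include <stdio.h>

static void gclk_check_overlay(void)
{
    gzGClk.inc.secs      = 0x00123456UL;  /* 24 bits of seconds in use */
    gzGClk.inc.fractSecs = 0xAB00;        /* top byte = 1/256 fraction */

    /* With the intended layout, ticks picks up the low 3 bytes of secs
       followed by the top byte of fractSecs: 0x123456AB. */
    printf("ticks = 0x%08lX\n", (unsigned long)gzGClk.gclk.ticks);
}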
/* Call this from the interrupt that fires every 4 ms: it adds GCLK_INC to
   the fractional-seconds counter and carries into secs when it wraps. */
#define GCLKIsr() do {                              \
        gzGClk.inc.fractSecs += GCLK_INC;           \
        if (gzGClk.inc.fractSecs < GCLK_INC) {      \
            ++gzGClk.inc.secs;                      \
        }                                           \
    } while (0)
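
The hookup is target-specific; a sketch, with a made-up handler name:

/* Hypothetical periodic timer handler.  On a real target this would be
   whatever interrupt vector fires every 4 ms. */
void timer_4ms_isr(void)
{
    GCLKIsr();   /* advance the global clock by one 4 ms tick */
    /* acknowledge/clear the timer interrupt here (hardware-specific) */
}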
And everybody compares against gzGClk.gclk.ticks for time
references. That gives 24-bit seconds plus an 8-bit fraction in
1/256-second steps. This should work much better now. No? And the time
measurement should be linear up to overflow, at 2^24 seconds (about 194 days).
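
The comparisons themselves could look like this (a sketch; GCLK_SECS and
timed_out are illustrative names, not part of gclk.c, and the wraparound
trick assumes GCLK_T is exactly 32 bits wide):

/* seconds -> ticks: one second is 256 ticks of 1/256 second */
#define GCLK_SECS(s)  ((GCLK_T)(s) << 8)

/* Nonzero once 'timeout_ticks' have elapsed since 'start'.  Unsigned
   subtraction keeps this correct across the 194-day wraparound as long
   as the interval itself is much shorter than that. */
int timed_out(GCLK_T start, GCLK_T timeout_ticks)
{
    return (GCLK_T)(gzGClk.gclk.ticks - start) >= timeout_ticks;
}

/* usage:
     GCLK_T start = gzGClk.gclk.ticks;
     ...
     if (timed_out(start, GCLK_SECS(5))) { handle the 5 second timeout }
*/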
Andy