I was hoping some of you might be able to help me understand the
following code, which defines a typedef used for a time query through
the WINAPI. I understand the basics of what's going on; I just don't
understand the format of the output it should produce. Here is the
code, and below it are some more comments:
typedef struct tagGTime32
{
    union
    {
        struct
        {
            unsigned char chSec100;
            unsigned char chSec;
            unsigned char chMin;
            unsigned char chHour;
        };
        unsigned long dwTime;
    };
} GTime32;
#pragma pack()
#define MAKEWTIME(h, m, s) ((unsigned long)(\
    ((unsigned char)(0)) << 0 |\
    ((unsigned char)(s)) << 8 |\
    ((unsigned char)(m)) << 16 |\
    ((unsigned char)(h)) << 24))
#ifdef __cplusplus
extern "C"
{
#endif//__cplusplus
int WINAPI GetGTime32Seconds(const GTime32 *gt);
int WINAPI DiffGTime32Seconds(const GTime32 *gt1, const GTime32 *gt2);
int WINAPI GetGTime32Minutes(const GTime32 *gt);
int WINAPI DiffGTime32Minutes(const GTime32 *gt1, const GTime32 *gt2);
#ifdef __cplusplus
}
#endif//__cplusplus
When I call a class that uses this typedef to store the time of a
particular event in a record (*pRcd), I get back a value that confuses
me. I'm printing the trace output as an integer ("%d", pRcd->gtime)
and get a 9-digit number. However, if it were time measured in
milliseconds, it should be at most 8 digits... which leads me to
believe I don't really understand what's going on. If anyone could
explain what I'm missing, I would be very grateful.
Regards,
Lorn