JohnF
Is there anything wrong with #define BYTES *8
to multiply by 8? It looks a little weird to me, but
the more obvious #define BYTES(x) ((x)*8) isn't what
I wanted to write. The idea was to express units of
measurement, e.g., int bits = so_many BYTES; rather than
int bits = BYTES(so_many);. I just wanted to read and write
it the first way, and my test program
#define BYTES *8

#include <stdio.h>
#include <stdlib.h>   /* for atoi() */

int main(int argc, char *argv[]) {
    int bytes = (argc < 2 ? 1 : atoi(argv[1]));
    int bits  = bytes BYTES;   /* expands to: bytes *8 */
    printf("%d bytes = %d bits\n", bytes, bits);
    return 0;
}
works fine (and compiles with no -pedantic warnings).
But that #define BYTES *8 still looks a little funky to me.
I realize 2+2 BYTES must be written (2+2) BYTES, since the
macro expands textually and * binds tighter than +, but is
there any other kind of "gotcha"?
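
For example, sketching the precedence issue with made-up
values (the << case is the same textual-expansion trap):

#define BYTES *8

int a = 2+2 BYTES;     /* expands to 2+2*8   == 18, not 32 */
int b = (2+2) BYTES;   /* expands to (2+2)*8 == 32         */
int c = 1 << 2 BYTES;  /* expands to 1 << 2*8 == 1<<16,
                          not (1<<2)*8 == 32               */

Any operator with lower precedence than * on either side of
BYTES will bind differently than the eye expects.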