Frank
This reading is starting to get really interesting. I'm so glad I
bought a currently valid reference.
A lot of little things pop up. On p. 222 is the following snippet:
size_t z = sizeof(j++);
size_t x = sizeof(int [n++]);
The authors claim that the former statement does not increment j,
while the latter does increment n. Why?
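Here's a little test I sketched to illustrate what I think they mean
(my code, not the book's; it assumes a C99 compiler with VLA support):

#include <stdio.h>

int main(void)
{
    int j = 0;
    int n = 5;

    /* Ordinary operand: sizeof does not evaluate it, so
       j++ never executes and j stays 0. */
    size_t z = sizeof(j++);

    /* Variable-length array type: the size can't be known at
       compile time, so the operand is evaluated and n++ does
       execute, leaving n at 6. */
    size_t x = sizeof(int [n++]);

    printf("z = %zu, j = %d\n", z, j);   /* j = 0 */
    printf("x = %zu, n = %d\n", x, n);   /* n = 6 */
    return 0;
}

If I've read it right, that's the whole trick: the array type's size
depends on n, so sizeof has to evaluate it at run time.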
The other one I can't find right now, so I'll try to reproduce it faithfully.
They claim that
char c;
while ((c = getchar()) != EOF)
....
is not the right way to do it. Instead they say to use
int c;
while ((c = getchar()) != EOF)
....
Supposedly, on most machines, the test condition would become 255 ==
-1. If that's right, then I've written a boatload of programs that
should never have exited the while loop. The rationale for this had
to do with the signedness of char.
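Here's my attempt to reconstruct the failure they're describing (my
own sketch, assuming EOF is -1 and 8-bit chars):

#include <stdio.h>

int main(void)
{
    /* Simulate storing getchar()'s EOF (-1) into a char. */
    char c = (char)EOF;

    /* If plain char is unsigned, c is 255 and this prints 0:
       255 != -1 is always true, so the while loop never exits.
       If plain char is signed, c is -1 and this prints 1. */
    printf("c == EOF: %d\n", c == EOF);

    /* But on signed-char machines an ordinary input byte of
       0xFF (binary data, Latin-1 text) also compares equal
       to EOF, ending the loop early. */
    char ff = (char)0xFF;
    printf("ff == EOF: %d\n", ff == EOF);

    return 0;
}

So, if I've got that right, the signed case ends early on a 0xFF byte
and the unsigned case never ends at all; either way the char version
misbehaves.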
Can someone say a few words about this?