pete said:
And I have no idea what you mean by that,
Me neither.
I'll consider the matter more carefully.
Arthur said:
However, re: Dan's reply: getchar() returns either a 'char'
value, cast to 'int', or it returns EOF, which is a negative 'int'
value unequal to any 'char' value. Right?
My #definitions above provide exactly enough numbers to do this
job: the range of 'char', which is signed, goes from -32767 to 32767,
and EOF is the 'int' value -32768. So if you were talking only about
the "EOF-and-getchar issue," you were wrong, AFAICT.
I think I'm really going to need the C&V (chapter and verse from the
standard) here, or you're going to have to show me a piece of code that
pokes large holes in my #definitions.
On my system the return value of putchar(EOF) is greater than -1.
I think it's supposed to be greater than -1,
and I don't think that it can be, on your system.
#include <stdio.h>

int main(void) {
    /* putchar() returns the character it wrote, as an unsigned char
       converted to int, on success, or EOF on a write error. */
    int r = putchar(EOF);

    if (r > -1)
        puts("putchar(EOF) is greater than -1 on my system.");
    return 0;
}
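A note on why pete expects a value greater than -1 (my reading, not part of
the original post): putchar(c) writes the value (unsigned char)c and, on
success, returns that converted value, which is never negative. On a typical
system with CHAR_BIT 8 and EOF defined as -1, putchar(EOF) therefore writes
the byte 0xFF and returns 255; under the quoted #definitions (UCHAR_MAX
65535, INT_MAX 32767) the converted value would be 32768, which does not
even fit in an int. A small sketch of just the conversion:

#include <stdio.h>

int main(void) {
    /* Shows the unsigned-char conversion that putchar() applies to its
       argument.  On a typical CHAR_BIT == 8, EOF == -1 system this
       prints 255. */
    unsigned char converted = (unsigned char)EOF;

    printf("(unsigned char)EOF == %d\n", (int)converted);
    return 0;
}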
Dan Pop said:
In <[email protected]>, "Arthur J. O'Dwyer" said:
Nor do I. And even though I at first thought it was technically
wrong because of padding bits, I now think that while it still may be
wrong, it's less wrong than I thought.
a) Plain char is unsigned. INT_MAX must be at least UCHAR_MAX so that
getchar() can return any plain char value, and INT_MIN must be less than
or equal to -32767. So the total number of values of 'int' must be at
least UCHAR_MAX+32768, which requires more bits than CHAR_BIT. Q.E.D.
b) Plain char is signed. The range of char, i.e., of signed char, must
be a subrange of the range of int. But is it possible we might have
the definitions just below?

Dan Pop replied:
The properties of plain char don't matter; it is unsigned char that
matters.

Arthur's quoted definitions continued:
#define CHAR_BIT 16
#define UCHAR_MAX 65535
#define SCHAR_MIN -32767 /* !!! */
#define SCHAR_MAX 32767
#define INT_MIN (-32767-1) [corrected]
#define INT_MAX 32767
#define EOF (-32767-1) [corrected]
Is anything wrong, from the C standpoint, with these definitions?
Dan Pop replied:
Yes, for a hosted implementation: int cannot represent the whole range
of unsigned char.
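To make Dan's objection concrete, here is a sketch of my own (the numbers
come from the hypothetical #defines quoted above, not from any real
<limits.h>): with UCHAR_MAX 65535 and INT_MAX 32767, an ordinary byte value
such as 40000 could not be returned by getchar() as a non-negative int at
all. The same constraint can be written as a compile-time check:

#include <limits.h>
#include <stdio.h>

/* Rough check of the constraint under discussion: on a hosted
   implementation, getchar() must be able to return every unsigned
   char value as a non-negative int distinct from EOF, so INT_MAX
   has to be at least UCHAR_MAX. */
#if INT_MAX < UCHAR_MAX
#error "int cannot represent the whole range of unsigned char"
#endif

int main(void) {
    printf("INT_MAX = %d, UCHAR_MAX = %u\n", INT_MAX, (unsigned)UCHAR_MAX);
    return 0;
}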
Would anything be wrong with intentionally under-using the potential range,
for example:
#define CHAR_BIT 1024
#define UCHAR_MAX 255
#define SCHAR_MIN -127
... etc ...
#define <under-used limits for int, long ...>
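For what it's worth, a rough sketch of why that particular combination looks
problematic (my reasoning, not quoted from the thread): unsigned char is not
allowed to have padding bits, so UCHAR_MAX has to be exactly 2^CHAR_BIT - 1,
and CHAR_BIT 1024 with UCHAR_MAX 255 would contradict that. The program
below counts the value bits of unsigned char at run time and compares the
count with CHAR_BIT:

#include <limits.h>
#include <stdio.h>

int main(void) {
    /* Count the value bits of unsigned char by shifting UCHAR_MAX
       down to zero; with no padding bits allowed in unsigned char,
       the count must equal CHAR_BIT. */
    unsigned long max = UCHAR_MAX;
    int value_bits = 0;

    while (max != 0) {
        value_bits++;
        max >>= 1;
    }
    printf("CHAR_BIT = %d, value bits in unsigned char = %d\n",
           CHAR_BIT, value_bits);
    return 0;
}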