I have encountered a very strange bug. I am using a variable defined
as:
char ch;
If I assign a character like 'Ö' (character code 153) with a statement like:
ch = 'Ö';
It gets the value -103. It's as if char is now a signed data type,
which it has never been before. I am developing in MS DevStudio 6.0.
Is there a setting somewhere that could have been changed? I have
never seen this before, and I have coded for about 10 years. I have
even run the very code I am having problems with now without this
problem (most recently just a few days ago). Could anything else I have
installed have messed this up? When researching the problem on the web, I
did find some references to a file called limits.h that deals with
this, and it does exist on my computer, but I have not added any
references to it in my code.
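
To illustrate what I am seeing, here is a minimal sketch (assuming char
defaults to signed on my setup and that 'Ö' is code 153 in my code page):

#include <stdio.h>

int main(void)
{
    char ch = 'Ö';   /* code 153 in my code page (assumption) */

    /* With an 8-bit signed char, 153 wraps around to 153 - 256 = -103. */
    printf("as char:          %d\n", ch);

    /* Reinterpreting the same byte as unsigned char gives back 153. */
    printf("as unsigned char: %d\n", (unsigned char)ch);

    return 0;
}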
Any help is much appreciated,
Thanks,
TK