Sorry, I'm getting tired of flames in my usual haunts. So, of course,
I feel compelled to solicit your flames here. Sometimes you just can't
win.
Keith Thompson wrote:
Why do you need ints shorter than a word? They're inefficient, and I
don't think anyone worries about memory usage anymore with simple
variables.
Sixteen-bit quantities are useful when you have lots of them. Some
programmers working outside of Microsoft are aware of this. And, hey,
you can't always rely on the optimizer.
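To put a rough number on that (my own illustration; it assumes the
usual 16-bit short and 32-bit int):

/* a million samples held in memory at once */
short samples16[1000000];  /* about 2 MB with a 16-bit short */
int   samples32[1000000];  /* about 4 MB with a 32-bit int - the same
                              data, twice the memory traffic */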
The simple way:
#define int long long
Yes, you can redefine fundamental types - I always like
#define char unsigned char
so that char arithmetic works.
That's bound to confuse anyone who studies your code and runs into
constructs that quietly violate the usual assumptions about plain
char.
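To make the confusion concrete, here is a small sketch (mine, purely
illustrative), built on a platform where plain char happens to be
signed:

#include <stdio.h>

#define char unsigned char  /* every later "char" is silently unsigned */

int main(void)
{
    char c = -1;            /* the reader sees plain char; c is really 255 */
    if (c < 0)              /* true without the macro (signed plain char),
                               never true with it */
        printf("negative\n");
    else
        printf("non-negative\n");
    return 0;
}

Whoever reads "char c" six months from now has no reason to suspect
that the first branch is dead code.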
I fell into the habit of defining my types explicitly in the top-level
header of whatever application I was writing: s16 for short int, u64
for unsigned long long, and so on. Generally, I tried to stay aware of
the size of a type as I was writing code, since I figure boundary
conditions bite harder than I want to be bitten. Assuring that I was
actually getting what I asked for is another matter; I haven't had the
chance to port much of my code to other architectures, so the most I
can claim is that I was preparing my software for that eventuality.
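Something along these lines is what I have in mind - a sketch, not my
actual header, and the widths assume a typical 32-bit or LP64 compiler,
so they would need checking on each new target:

/* types.h - fixed-width names for the top-level header */

typedef signed char          s8;
typedef unsigned char        u8;
typedef short int            s16;
typedef unsigned short int   u16;
typedef int                  s32;
typedef unsigned int         u32;
typedef long long            s64;
typedef unsigned long long   u64;

/* One crude way of assuring I get what I asked for: if a width is
   wrong on some target, the array size goes negative and the build
   breaks instead of the program. */
typedef char s16_is_2_bytes[(sizeof(s16) == 2) ? 1 : -1];
typedef char u64_is_8_bytes[(sizeof(u64) == 8) ? 1 : -1];

(C99's <stdint.h> covers the same ground with int16_t, uint64_t and
friends, for whatever that's worth.)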
There has been a lot of talk about the virtues of portability and
about abstracting language syntax away from hardware dependencies, and
I simply cannot fathom why programmers wish to remain entirely
insulated from the world their programs run in. I guess I'm just a
heretic, but as much as I like C, there are just some things that I
don't want the language doing with my code. I want to know what my
code looks like once it is compiled into assembly, I want my language
rigidly defined (operator overloading sucks), I want my types fixed
across all platforms, and, in general, I want greater control over the
language's features. printf() and all the rest should never have been
made part of the spec.
I think there should, in fact, be two specifications: one defining the
grammar and syntax, and another covering the features that pertain to
the interface with an actual system (e.g. POSIX). Since we're talking
about types, I suggest it would be sensible to abstract the concept of
a type into a generalization that belongs in the primary language
specification; the inherent properties of integer quantities should be
treated as necessary concepts and embodied in the grammar and syntax
themselves. And so on. As it stands, C takes too much for granted.
I realise that most people do not write software that absolutely must
work every time, all the time. However, I seem to have this curious
obsessive compulsion to assure that my software isn't going to turn
around and kill me at some indefinite point in the future. Perhaps I'm
weird.
At any rate, languages inferior to C (and don't get me wrong; I like C
a lot) proliferate because people simply don't seem to want durability.
Regards,
Steve