In said:
>> I do not know if dmr suggested that.
>> His own compiler used 32-bit longs on 32-bit hardware...
>
> If he did, I can have my own opinion, right?

Did anyone dispute your right to your own opinion? However, you expressed
it in terms of "we", so you should tell us who exactly is included in this
"we".

> And I did not discuss popularity.

Yet, popularity does matter when the implementor has to decide the sizes
of the types. He could make int's 1024 bits and long's 16384 bits, but
chances are that few people will want to use his implementation...
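
Anyone who doubts how much latitude the implementor has can simply ask
the implementation itself. A minimal C90 sketch (the standard only
guarantees minimum ranges: at least 16 bits for int, at least 32 for
long):

  #include <stdio.h>
  #include <limits.h>

  int main(void)
  {
    /* Only the minimum ranges are mandated by the standard;
       CHAR_BIT and the sizeof results are implementation-defined. */
    printf("bits per byte: %d\n", CHAR_BIT);
    printf("int:  %lu bits\n", (unsigned long)(sizeof(int) * CHAR_BIT));
    printf("long: %lu bits\n", (unsigned long)(sizeof(long) * CHAR_BIT));
    return 0;
  }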

> Is the byte 8 bits in all 64-bit systems?

I have yet to hear about one using another byte size. Even the
implementation for the original Cray processors had 8-bit bytes, despite
the lack of hardware support for them (the processor only supported 64-bit
word addressing).

> My last info about the Itanium, when it was being designed, was that it
> would have 16-bit bytes (which is not prohibited, as you know).

Your last information about Itanium is pure bullshit. I was programming
on the Itanium back when it was called Merced and it only existed in
software emulation, so I should know. Its instruction set makes
implementations with 8, 16, 32 or 64-bit bytes perfectly possible, but
its architecture is based on 8-bit byte addressing.
If you want a processor with 16-bit bytes, have a look at the TMS-320C25,
but don't expect to find any hosted implementation for it.
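
Code that genuinely depends on 8-bit bytes can at least refuse to compile
on such a target. A minimal sketch:

  #include <limits.h>

  #if CHAR_BIT != 8
  #error "This code assumes 8-bit bytes."
  #endif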

> Even in your case, long could have been 64 bits until 128-bit systems
> became popular.

I never said it *couldn't*; I was merely talking about the current
situation, where people needed 64-bit support on 32-bit implementations
that *already* had 32-bit long's. The people maintaining these
implementations would not consider changing the size of long an
acceptable solution (far too much existing code relied on long as a
32-bit type), so the *only* acceptable solution was introducing a new type.
I entirely agree that, if C were invented today, long would be the right
type for 64-bit integers. But C was invented over 30 years ago and its
history does influence its current definition.
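
That new type turned out to be long long, guaranteed at least 64 bits in
C99, with <stdint.h> and <inttypes.h> layered on top of it for
exact-width names. A minimal sketch, assuming a C99 implementation that
provides int64_t:

  #include <stdio.h>
  #include <inttypes.h>  /* also pulls in <stdint.h> */

  int main(void)
  {
    long long big = 1LL << 40;          /* doesn't fit in 32 bits */
    int64_t exact = INT64_C(1) << 40;   /* exact-width alternative */

    printf("big   = %lld\n", big);
    printf("exact = %" PRId64 "\n", exact);
    return 0;
  }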

> Yes, me too. I also bet that we will face a "fragmentation problem" in
> the future, that is, not all language features will be implemented by
> most (popular) implementations, as is the case with C99 now. Who can
> really rely on C99 code being portable? And 5 years have already
> passed...

C99 is far from being an industry standard and it is not yet clear whether
it will ever become one. Yet, it does exist and it does influence other
standards (the latest Unix specification is based on it), and the C++
standardisation process seems to be moving toward adopting its new
features. Whether you (or I) like it or not.
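
At least the fragmentation can be detected at compile time: an
implementation claiming C99 conformance must define __STDC_VERSION__ as
199901L, so code can test for it before relying on the new features. A
minimal sketch:

  #if defined(__STDC_VERSION__) && __STDC_VERSION__ >= 199901L
    /* C99 or later: long long, <stdint.h>, // comments, etc. */
  #else
    /* C90 only: stay within the older feature set. */
  #endif
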
Dan