Douglas A. Gwyn said:
It is only because external memory was cheaper than internal memory
that the problem showed up there first.
Surely by now you have encountered platforms with > 4GB RAM.
This trend was predictable, and we decided to deal with it
as soon as we saw it coming.
The issue of integer types and extended integer types has to be looked
at from all sides.
Certain programmers wrote code that relied on long being the widest
integer type in C. A very large number of other programmers wrote code
that relied on long being 32 bits on the platforms they were using.
As time went by, having the widest type restricted to 32 bits became
unacceptable to an ever-increasing number of programmers.
Changing long to more than 32 bits breaks an assumed invariant in a
great deal of code. Breaking the invariant that long is the widest type
breaks a different body of code.
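To make the two invariants concrete, here is a minimal sketch of my
own (not from the original discussion) showing code of each kind:

#include <stdio.h>
#include <stddef.h>

/* Kind 1: long is assumed to be exactly 32 bits.  This header is
 * written to disk verbatim; if long grows to 64 bits the on-disk
 * layout silently changes from 8 bytes to 16. */
struct file_header {
    long magic;
    long length;
};

void write_header(const struct file_header *h, FILE *fp)
{
    fwrite(h, sizeof *h, 1, fp);
}

/* Kind 2: long is assumed to be the widest integer type.  This is
 * the classic C89 idiom for printing a size_t; it silently truncates
 * on an implementation whose size_t is wider than unsigned long. */
void print_size(size_t n)
{
    printf("%lu\n", (unsigned long)n);
}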
Now the argument is that the latter group, who were writing code that
conformed to their understanding of C89 (note that I really do not want
to waste time debating whether their understanding was correct; it is a
dry and entirely pointless argument), had right on their side, whilst
those writing code where long was 32 bits should just accept that it
would not always be so.
But standards are about commerce (OK, some are about safety, but
language standards are of the former kind even if some would want it
otherwise). The commercial judgement was that it was more viable to
allow types that were wider than long.
The argument about long long is a complete red herring (actually I have
very strong reservations about any fundamental type having a multi-token
name, but that is a different issue). The fundamental issue is whether
the Standard should bless extended integer types wider than long to the
extent that they may be used via standard typedefs such as size_t and
off_t.
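To see what is at stake, here is a hedged sketch of mine (nothing the
Standard mandates): a C89-style compile-time check that fails to
translate on an implementation whose size_t is an extended type wider
than unsigned long.

#include <limits.h>
#include <stddef.h>

/* The array size evaluates to 1 if the maximum value of size_t,
 * (size_t)-1, fits in unsigned long, and to -1 (a constraint
 * violation, so a required diagnostic) otherwise. */
typedef char size_t_fits_in_ulong[(size_t)-1 <= ULONG_MAX ? 1 : -1];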
If such usage is an important issue for a large enough minority of
customers then implementors will enhance their market share by avoiding
use of types wider than long for the standard typedefs.
If the minority is insufficient to apply commercial pressure then sadly
my response is 'tough': live with it. It is not the responsibility of a
Standard to protect the interest of such a minority at the expense of
the majority. This is neither an academic nor a political issue but a
commercial one.
BTW there is absolutely nothing under international treaty that
requires any NB (national body) to adopt an ISO/IEC standard as its own.
Also note that the overwhelming majority of C programmers do not use
their compilers in strictly conforming mode. I am sure that those
companies that have speciality markets (such as support for 8051-based
hardware) will continue to produce tools that meet those markets' needs
(and, as such, ones that do not conform to any C Standard).
Yes, I want better standards, but I also want standards that are
respected and widely used. That means that standards do need to respond
to existing practice. There are some areas of C99 that would, IMO, have
been better left as TRs (much of the complex number support, for
instance), and there are some areas which could have used more work to
provide a uniform mechanism (I am thinking of such items as the
type-generic maths support), but given the limited human resources
available I think we did a pretty good job with C99.
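For what it is worth, the type-generic machinery I have in mind looks
like this in use (a minimal sketch of mine; <tgmath.h> and its dispatch
rules are as specified by C99):

#include <tgmath.h>
#include <stdio.h>

int main(void)
{
    float f = 0.25f;
    double d = 0.25;
    double complex z = 0.25 + 0.5 * I;  /* <tgmath.h> pulls in
                                           <complex.h>, hence I */

    /* The single macro sqrt expands to sqrtf, sqrt, or csqrt
     * depending on the type of its argument. */
    printf("%f\n", (double)sqrt(f));
    printf("%f\n", sqrt(d));
    printf("%f%+fi\n", creal(sqrt(z)), cimag(sqrt(z)));
    return 0;
}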
BTW IMHO a prerequisite for criticising features of C99 (as opposed to
raising defect reports) should be having read the WG14 Rationale
document, which explains many of the decisions. At least take time to
discover the why before objecting to the what.