Quentin Pope
By the standards of a desktop PC, these files are relatively tiny: about
3000 lines on lccwin32, and 5000 lines on mingw. By comparison, the
standard Windows headers (excluding optional headers!) are 18000 lines
and 26000 lines respectively. Plus any headers relevant to the
application...
C programs may run on some very small systems, but you would expect a
development host to have a reasonably respectable amount of computing
power.
It is true that the headers can contain a lot of long-winded stuff.
This, for example, is the longest line in the mingw headers:
    _CRTALIAS long __cdecl __MINGW_NOTHROW
    _wfindfirsti64 (const wchar_t* _v1, struct _wfinddatai64_t* _v2)
    { return (_wfindfirst32i64 (_v1, (struct _wfinddata32i64_t*)_v2)); }
It seems to be just the nature of the language that it has to partly
define itself through reams of these tedious definitions, written in an
inefficient text format too.
But even with the headers as they are now, and the highly-advanced
compilers that everyone is always on about, is there really no
alternative to reading in exactly the same headers for every single
compilation, over and over again?
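(There is one well-known mechanism for this: precompiled headers. A
sketch with gcc, assuming the usual includes have been gathered into a
single stdheaders.h; the file names here are illustrative:)

    gcc -x c-header stdheaders.h    # writes stdheaders.h.gch, once
    gcc -c prog.c                   # #include "stdheaders.h" now loads the
                                    # binary .gch instead of re-parsing text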
Only where thousands of very small source files are used, which itself
is highly inefficient.
But if it is a problem, then there is a very simple solution: replace
stdheaders.h with just the headers that are needed.
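A minimal sketch of that idea; which includes belong in the trimmed
file is hypothetical and depends on what the program actually uses:

    /* stdheaders.h, trimmed: pull in only what this program needs */
    #include <stdio.h>     /* printf, fopen */
    #include <string.h>    /* strlen, memcpy */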
Yes, for 1972! Or maybe for an 8-bit system with 64KB.
Sorry, but this is a complete fallacy.
I recently had cause to compile Firefox from source, and I only *just*
had enough RAM for the linking phase.
Computers may have become more powerful, but the bloat of software
packages has more than kept pace with it.
And why do you think it's OK to exclude people on an 8-bit system with
64KB anyway?