As expected, my posting elicited flames. (Also as expected, the
flames generated much heat, but little light.)
First, detractors willfully misconstrued my message. I wasn't
advocating that programmers ignore errors, or deliberately
refrain, as a general rule, from inspecting malloc()'s return
value for NULL (though I did mention, correctly, that that was
the *simplest* way to solve OP's problem). Rather I was trying
to encourage programmers to escape from following dogma mindlessly.
(Obviously checking malloc()'s return is benign compared with
much thoughtless adherence to dogma. I once saw a project
with page after page of purposeless defines like
#define SIMU_TOOLNUM_CODE QMIS_TOOLNUM_CODE
#define SIMU_TOOLTYPE_CODE QMIS_TOOLTYPE_CODE
#define SIMU_MACHTYPE_CODE QMIS_MACHTYPE_CODE
BTW, the punchline on this project, believe it or not, was
that the programming company asked for an additional $1 Million
when the customer wanted to port the project to a C compiler from
the Itsy Bitsy Mechanisms Corp. which couldn't handle long
macro names!)
Certainly the idea of "Checking for errors" sounds logical,
but do you really test for zero before *every* division?
Or, for a more absurd example, since fprintf() can fail, do you
always check its return code? That would be the reductio
ad absurdum of an insistence on checking malloc(), especially
given the frequent contexts where malloc() *won't* fail, or
where, failing, a core dump would be as good a diagnostic as any.
Let's see ... perhaps you malloc-checking pedants also want
to write:
    if (fprintf(stderr, "Greetings galaxy\n") < 0) {
        fprintf(stderr, "fprintf() failed\n");
    }
Or should that be something more like
    if ((cnt = fprintf(stderr, "Greetings galaxy\n")) < 0) {
        while (fprintf(stderr, "fprintf() failed\n") < 0) {
            fprintf(special_err, "fprintf() failed again\n");
        }
    } else if (cnt != strlen("Greetings galaxy\n")) {
        while (fprintf(stderr, "Unexpected strlen mismatch\n") < 0) {
            fprintf(special_err, "fprintf() unexpectedly failed\n");
        }
    }
Hey folks! Have a chuckle before you click on Flame-Reply
Michael said:
    James Dow Allen said:
        The dogma seems particularly silly in cases like [OP's] where
        referencing the invalid pointer achieves *precisely* what you
        want: a core dump, with visible call stack, etc.
    Maybe. Remember that dereferencing an invalid pointer invokes
    undefined behavior.
Well, thanks, Keith, for forgoing a sarcasm like
... the compiler is allowed to generate code that will
trash your hard disk, or send all your passwords to
Tasmania via carrier pigeon.
(FWIW, if the dereference-then-trash-disk option were in
widespread use I daresay everyone in this NG, including
those who religiously test malloc()'s return, would have
lost several disks by now.)
Now I realize some of you write code for computers that
run the Chattanooga Time Sharing System, on hardware which
uses Dolly Parton's phone number as the bit pattern for
NULL pointers, and which fry your power supply whenever
you dereference Miss Parton. Even worse, some of you
write code to run under MS Windows.
But I, and many other programmers of good taste, have the
luxury that 90% of our code will *never* run on systems
other than Unix. And, UIAM, *every* version of Unix that
uses hardware memory management will dump core whenever
an *application* writes to *(NULL).
Someone is thinking:
A good programmer should be able to write non-Unix
applications or operating systems, and even to write
device drivers for brain-dead OS'es.
Been there, done that, probably before many c.l.c denizens
were born. A good surgeon should be able to do
appendectomies without anesthetic, but only as a last
resort. My life is organized well enough that I shan't
have to resort to non-Unix OS'es.
You don't care, but I'll tell you anyway:
*Every* virtual memory Unix I can recall will
dump-core on writing to *any* address from (0) to (0 + X).
(X is *over Two Billion* in a typical environment.)
[OT] Not if process limits are set properly. In a well-administered
system, or even one where the user has a modicum of sense, the
process will be running with a reasonable data-size limit ...
Is this comment directed against visitations by a runaway
malloc() (i.e., the case where the programmer neglects to free()
unused memory, or to limit runaway table growth)? I hope
it doesn't sound like bragging, but it's very rare that my
programs grow memory uncontrollably. I'll give Michael the
benefit of the doubt and assume he's suggesting some other
purpose for setrlimit().
Throttling memory allocation is sometimes appropriate.
An example would be the hash table design I discussed in a
recent post in comp.programming:
http://groups.google.com/group/comp.programming/msg/b3a0b7c25680d2d1
where caching heuristics
change when a soft memory limit is reached.
(BTW, I *did* win the programming contest where this
caching heuristic came into use:
http://www.recmath.org/contest/PrimeSquares/standings.php )
Obviously a *hard* setrlimit() cannot be employed for throttling
except in a very simple program -- in the contest-winning example,
the cache may continue to grow after the *soft* limit is
reached, just at a lower rate. In fact, the proper approach
to memory throttling will often be the simplest: rely strictly
on *soft* limits (either as provided by Linux setrlimit()
or by simply tracking memory allocation) with any "need" for
a *hard* limit obviated by careful high-level design.
(During the contest I sometimes stopped one of two caching
processes in order to run Acrobat in another window without
thrashing: setrlimit() would have had no value there.)
Throttling caching parameters with a *soft* setrlimit() would
be a valid example where checking malloc()'s return code is
absolutely necessary, though I don't think that is what Michael
was suggesting. There are many cases where checking malloc()
is right, and of course it's never wrong except in Quixotic
examples like the *simplest* solution for OP's problem.
I never intended to suggest otherwise. (Even *I* almost
always check malloc()'s return, if only by using the cover
mustmalloc(), but I do it primarily because it "feels good" --
like proper arrangement of white space -- rather than based
on any superstition that it has an important effect.)
My real point was, *Don't Be So Dogmatic*. The poster who
sarcastically suggested that tires should be checked *and*
seatbelts fastened isn't *wrong* of course, but must not live
in the real world, because programs like IE_Explorer are
riddled with *unchecked* errors. A better analogy would
have been the driver so engrossed in polishing the door
handle that he forgets to check the tires or oil.
Not all code is delivered. Not all code will be ported to bizarre
hardware or operating systems. Often the *simplest* approach to
debugging is to let the machine core-dump: this gives more
information than you'll ever get with printf().
And, speaking of printf ... since "repetition is the soul
of Usenet", be aware that the dogma
    "malloc() might fail, so its return code must be checked"
leads to the reductio ad absurdum that every printf()
should be checked for error!
Still keeping my asbestos suit on,
James D. Allen