... I would like to know [why]
    double* p = malloc(8*sizeof(double));            ==> OK for some here,
because the implicit conversion works well enough, but why
    double* p = (double*) malloc(8*sizeof(double));  ==> not OK for some.
It is not so much a "not OK" as "causes more problems than it solves".
Consider, for instance, a (rather weak) analogy.  Suppose that for
some reason you have an urgent need to travel 200 miles (or 300 km
in Europe) in about an hour.  You can:
- drive your car at 200 mph (300 kph), or
- take a 500-mph plane, giving you about 30 minutes to get on
and off the plane at the two ends of the travel.
There are tradeoffs here, and neither solution is absolutely perfect
-- but one is statistically quite a bit more risky than the other.
Perhaps driving at 200 mph kills 17% of the people who do it, while
flying at 500 mph kills 0.0034% of people who do it.
In *my* experience -- which dates back to 1981, when I first
started using C -- it is "casting malloc" that is like "driving 200
mph". *Many* more programs that have a cast have been discovered
to be broken, yet produced no compile-time diagnostics at all.
Specifically, I have seen at least dozens, if not hundreds (I never
kept a count), of cases of code that failed at runtime due to casts
hiding a missing "#include <stdlib.h>" or other declaration of
malloc(). In the compilers we used in the 1980s (including VAX
PCC, Plauger's own Whitesmiths C, VAX/VMS C, and a number of C
compilers for IBM PCs), malloc() had type:
char *malloc();
and casts were required for all situations except "assigning the
result to a variable of type `char *'". Without the cast, you got
a diagnostic, contrary to what Dennis Ritchie's earliest C compilers
did. (Apparently, in the Bad Old Days of:
int i 3; /* initialize i to 3 */
and
printf(2, "error\n"); /* write to file descriptor 2 */
no diagnostics occurred for:
int malloc(); char *p; ... p = malloc(N);
I never used these compilers myself -- by 1981, it was apparently
clear to C compiler writers that this was a bad idea.)
Accordingly, in "K&R-1 C" -- i.e., C as described by the original
White Book and implemented by compilers like PCC and VAX/VMS C --
one usually had to cast malloc().  If one then failed to declare
malloc(), it silently acquired the "implicit int" declaration.  C
being C, writing:
int malloc(); /* implicit */
double *p = (double *)malloc(N * sizeof *p);
normally compiled warning-free. On many machines, it even worked --
but on some machines it failed. If you declared malloc correctly,
either by hand or by including some header file (stdlib.h did not
exist yet and the name and even existence of this header varied),
the code would work on those other machines too.
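The failure on "some machines" was typically pointer truncation: the
implicitly-declared, int-returning malloc() squeezed its result through
an int-sized slot before the cast "restored" the pointer type.  Here is
a minimal sketch that simulates that round-trip explicitly on a modern
machine (truncate_ptr is my own illustrative name, not anything from
the era):

```c
#include <stdint.h>

/* Simulate what an implicit "int malloc()" declaration did to the
 * returned pointer: the value passed through an int-sized slot,
 * dropping the high bits on machines where int is narrower than a
 * pointer.  (truncate_ptr is a hypothetical helper, for illustration
 * only.)
 */
void *truncate_ptr(void *p)
{
    unsigned int as_int = (unsigned int)(uintptr_t)p; /* int-sized slot */
    return (void *)(uintptr_t)as_int;                 /* cast "back"    */
}
```

On a machine where int and pointers were the same width, the value
survives the round-trip and the code "even worked"; where pointers are
wider (as on typical 64-bit systems today), any allocation above the
int range comes back mangled.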
In December 1989, the ANSI X3J11 committee finalized the original
ANSI C standard; and in this C, the type of malloc() had changed
from:
char *malloc();
to:
void *malloc(size_t);
Moreover, "void *" (which had been a useless type that even caused
some compilers to core dump!) now existed, and had a special
assignment-conversion property. Because of this property, it was
now possible to remove the casts from old code, and create new code
without casts:
#include <stdlib.h>
...
double *p = malloc(N * sizeof *p);
Such code immediately drew a "diagnostic" -- always a "warning" in
every compiler I used in the 1990s -- if and only if you had
forgotten the "#include" (or a programmer-supplied declaration,
but if you had a Standard C compiler, including the standard header
was the obvious best choice).
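Written out in full, the cast-free style looks like this (make_doubles
is my own name for the sketch, not anything from the thread):

```c
#include <stdlib.h>

/* Allocate an array of n doubles, C89-style: no cast on malloc(),
 * so a missing <stdlib.h> draws a compiler warning instead of being
 * silenced.  Returns NULL on allocation failure. */
double *make_doubles(size_t n)
{
    double *p = malloc(n * sizeof *p);  /* void * converts implicitly */
    if (p != NULL) {
        size_t i;
        for (i = 0; i < n; i++)         /* zero the array */
            p[i] = 0.0;
    }
    return p;
}
```

Note also the "sizeof *p" idiom: if p's type ever changes, the
allocation size follows automatically, with nothing to keep in sync.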
In particular, this meant that programs could now be written such
that a VERY COMMON MISTAKE was found at compile time. And -- as
I said earlier -- I subsequently observed what I believe to be
thousands of potential runtime failures, and at least dozens of actual
runtime failures, prevented in code in which the (existing) casts
were *removed* and the warning diagnostic fixed by adding the
missing "#include" line.
Had those casts been left in, or had new code been written using
those casts, I believe much more code would have eventually failed
during porting or testing or (most expensively) after distribution.
This leads to what *I* believe is an inescapable conclusion:
IF ONE WRITES IN ANSI C89 (or ISO C90), casting malloc
is not necessary, and adds HUGE amounts of risk.
There are some who argue (see other postings in this thread,
particularly those by Tak-Shing Chan) that, even in strict C89,
there are possible situations in which casting malloc also removes
some risk.  But in the last decade, while I have personally
observed huge numbers of "failure to include <stdlib.h>" errors,
including dozens of cases in which real, in-use code DID fail
because of this combined with an unnecessary cast, I have never --
NOT ONCE -- seen a situation in which real, in-use code failed in
a way that would have been diagnosed at compile time had a cast
been used. In other words, we might restate the above as:
In C89, casting the result of malloc() has proven over time to
add significant risk of error, and has a theoretical chance of
reducing error. Statistically, code that does cast malloc()
contains many more errors than code that does not.
The only sensible conclusion to the above is that one should not
cast malloc()'s return value in this case -- only some sort of
external constraint (such as "I am not always using C89") could
change this.
Now, in C99, that huge risk -- the failure to declare malloc()
using "#include <stdlib.h>" -- has changed from a frequently-undiagnosed
error to a required-diagnostic error. In other languages that
vaguely resemble C, it has always required a diagnostic. Thus, in
these non-C89 languages, the biggest risk in casting malloc() has
been removed. So we can add:
IF ONE WRITES ONLY IN C99 (or some other non-C language),
casting the result of malloc() adds little if any risk of
error.
Of course, no one yet has a decade of experience in "time-tested"
or "proven" C99 risk-assessment -- the new standard has been in
existence for less than four years, and few compilers implement it
even today. (Whether you believe my own "time-tested, proven" C89
risk-assessment above is up to you, of course.)
In any case, each programmer (or person directing programmers and
thus setting standards for them) must make his own risk/reward
analysis. The risks of casting malloc() in C89 are, I think,
well-established and significant. The rewards are, I think,
nonexistent in practice -- the theoretical situations in which it
helps never occur in real code. The risks in C99 are far smaller;
and if you have additional constraints, or believe you will never
use a C89 compiler that lacks a "warn on implicit int" diagnostic,
your own risk may be smaller than that in "generic C89". In this
case, whether to cast becomes purely a style issue:
While there are many *wrong* styles, there is no single
*right* style.
My own style will continue to be "do not cast", because the casts
are still unnecessary (in C -- I am not writing C++ here and when
I write C++ I almost never use malloc()), and I find that removing
all unnecessary source code produces programs that are smaller,
faster, more comprehensible, and/or more maintainable. But just
as there are English-language styles in which one does not "omit
needless words" (see Strunk&White), there are such code styles too.