I almost prefaced my message with a disclaimer
that SPEED WAS **NOT** MY CONCERN, but felt that a
simple speed-up case made for a clear example.
The disclaimer seemed unnecessary -- I mention the point
when I mention speed.
But the only one to make a joke about the speed-up
was also the only one who grasped that speed
performance was not my primary concern!
Aside: Did you suspect that, although any speedup was peripheral
to my main point, I actually did the implied experiment,
using glibc's qsort source, with the element count, element
size, and comparison all made compile-time constants? The element was
struct { int a, key, c, d; }
Speedup was 25%.
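For the curious, the specialization can be sketched like this. The insertion sort below merely stands in for glibc's qsort source (which is too long to quote), and all names are hypothetical; the point is that the element type, size, and comparison become compile-time constants instead of runtime parameters:

```c
#include <assert.h>
#include <stddef.h>
#include <stdlib.h>

struct elem { int a, key, c, d; };

/* Generic path: qsort calls through a function pointer and
   works with a runtime element size. */
static int cmp_key(const void *x, const void *y)
{
    const struct elem *p = x, *q = y;
    return (p->key > q->key) - (p->key < q->key);
}

/* Specialized path: type, size, and comparison are fixed at
   compile time, so the compiler can inline the compare and
   copy whole structs.  (A plain insertion sort stands in here
   for glibc's qsort source.) */
static void sort_elems(struct elem *base, size_t n)
{
    for (size_t i = 1; i < n; i++) {
        struct elem tmp = base[i];
        size_t j = i;
        while (j > 0 && base[j - 1].key > tmp.key) {
            base[j] = base[j - 1];
            j--;
        }
        base[j] = tmp;
    }
}
```

Both paths produce the same ordering; the specialized one just gives the optimizer everything it needs to avoid the indirect calls and runtime-sized copies.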
One example is hash-table management. If space-efficiency
is important, low-level details essential to the table
handling will vary, while higher-level functionality
(collision handling, resizing) may be invariant.
If this isn't clear, note that a space-efficient table
may pack key AND payload into 2 or 3 bytes, while
off-the-shelf hash tables often dictate 12 bytes PLUS key.
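To make the size difference concrete, here is a rough sketch; the layouts are hypothetical, and exact sizes depend on compiler and platform:

```c
#include <assert.h>
#include <stdint.h>

/* Space-efficient entry: a small key and payload packed into
   one 32-bit word (hypothetical 20-bit key, 12-bit value). */
struct packed_entry {
    uint32_t key : 20;
    uint32_t val : 12;
};

/* Typical off-the-shelf separate-chaining node: a next pointer
   and a cached hash before you even reach the key and payload. */
struct chain_node {
    struct chain_node *next;   /* 8 bytes on a 64-bit machine */
    uint32_t hash;
    uint32_t key;
    uint32_t val;
};
```

A table of a million packed entries fits in a few megabytes; the chained version costs several times that before counting allocator overhead per node.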
Can't comment, as there's a great lack of specificity
about the designs of these competing hash tables. Separate
chaining, coalesced chaining, open-addressed, ... The phrase
"hash table" covers a *wide* variety of data structures with
disparate characteristics.
Besides, the possible space efficiency seems insufficient
to justify "CAN'T BE USED AT ALL;" it's entirely akin to the
time savings you already don't care about.
I'm assuming the user is an expert programmer.
s/expert/infallible/, or so it seems. Speaking for myself,
I have a hard time imagining the mind behind that level of
perfection -- except to think that such a mind wouldn't bother
with crutches like pre-written libraries, but would write all he
needed anew each time, omitting the generalities and such that
are inapplicable to the immediate case at hand. See also
http://www.pbm.com/~lindahl/mel.html
Please note that there are two quite distinct approaches:
(1) common source is in header, often in the form of
#define'd code fragments, and invoked by application
via macros.
(2) common source is in .c file, application-specific
stuff is presented to .c via user-specific .h defines.
The .c is compiled (with its user-specific header) and
then behaves just as an ordinary .c/.o.
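Approach (2) might look like this in miniature. The pieces that would live in separate files (hypothetical names vec_user.h and vec_generic.c) are shown in one listing; in a real project the common .c is compiled once per configuration and linked like any other object file:

```c
/* --- vec_user.h: the user-specific defines --- */
#define ELEM_TYPE   double
#define VEC_NAME(x) dbl_##x

/* --- vec_generic.c: the common source, written once --- */
#include <assert.h>
#include <stddef.h>

/* Expands to: double dbl_sum(const double *v, size_t n) */
ELEM_TYPE VEC_NAME(sum)(const ELEM_TYPE *v, size_t n)
{
    ELEM_TYPE s = 0;
    for (size_t i = 0; i < n; i++)
        s += v[i];
    return s;
}
```

Approach (1) would instead put the body of sum into a macro in a header and expand it at each use site.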
I'm thinking of (2) specifically, but Eric's comments
apply mostly to (1).
Sorry; I fail to see any difference between the two "quite
distinct" approaches. Both employ the preprocessor to make
case-specific alterations to a batch of source code; it's just
a matter of the packaging, not of the essence.
James, I'm not trying to make fun of you. The technique of
using the preprocessor to specialize general code to a particular
use is fairly well-known -- yet the plain fact is that it's not
widely and routinely used. Considering the efficiencies it may
offer, one wonders why not? And my answers (or speculations) are
about the added difficulties the approach brings, and about the
choices experienced developers make. Occasionally, very occasionally,
the advantages are appealing enough that someone chooses to go this
route. Mostly, though, they don't -- so you're left with a choice
between "They're all dimwits" and "They've considered other factors."
It's possible that the former is the case, but "Everyone's out of
step except Johnny" is a difficult position to sustain ...