I can only recommend this macro (assuming compilers 1 and 2 put it in std
and 3 and 4 don't):
#if defined(COMPILER_ONE) || defined(COMPILER_TWO)
Yuck - That's fragile, and will break as soon as Compiler Two stops putting
non-standard functions in std.
#else
#error "Compiler unknown"
Even worse! Since when is an unknown compiler an error?
A better approach - which, contrary to the "no way" assertion above, is in
common use in thousands of applications - is to use autoconf.
In a nutshell, you write a shell script that tries to compile and run a tiny
test app that uses the function you're testing for. If the test app builds
OK and produces the expected output, the script exits with a zero status;
otherwise, non-zero.
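Such a test might look something like this - a sketch only, with hypothetical
file names; a real configure script wraps the same idea in a lot more
portability plumbing:

```shell
# Probe for memicmp() by trying to compile, link, and run a tiny test app.
# A failed compile or link is exactly the signal we want: the function is
# absent on this platform.
cat > conftest.c <<'EOF'
#include <string.h>
int main(void) { return memicmp("abc", "ABC", 3) == 0 ? 0 : 1; }
EOF
if cc conftest.c -o conftest 2>/dev/null && ./conftest; then
    result=yes
else
    result=no
fi
rm -f conftest conftest.c
echo "checking for memicmp... $result"
```

On a platform whose library lacks memicmp(), the compile or link step fails,
the error output is discarded, and the script reports "no".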
Autoconf produces a "configure" script - you've probably run many of them
without ever worrying about how they were produced. Among other things, this
script takes a config.h.in file as input, runs a series of tests as
described above, and produces a config.h header with each macro either
defined or commented out, to reflect the results of the tests.
For instance, your config.h.in might contain:
#define HAVE_MEMICMP
This would either be copied as-is to config.h, or commented out, depending on
the result of the check for memicmp().
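That substitution step can be sketched like so - hypothetical file names
again, and real Autoconf does this through its own template processing, but
the effect is the same:

```shell
# Rewrite config.h.in into config.h based on the result of the memicmp test.
result=no    # pretend the compile-and-run test reported "not found"
printf '#define HAVE_MEMICMP\n' > config.h.in
if [ "$result" = yes ]; then
    # Test passed: the line is copied to config.h as-is.
    cp config.h.in config.h
else
    # Test failed: the line is commented out instead.
    sed 's|^#define HAVE_MEMICMP|/* #undef HAVE_MEMICMP */|' config.h.in > config.h
fi
cat config.h
```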
The autoconf approach is *far* less fragile than checking for specific OS and/
or compiler versions, and stands a much better chance of success on an unknown
OS and/or compiler.
Have a look here for more:
A previous poster said:
> BTW, there are no standard functions 'memicmp' and 'stricmp'
All the more reason to check for them at configure time. Instead of being
dependent on non-standard functions, your code can take advantage of them
when they're available, or fall back on portable, standards-based
alternatives when they're not.
sherm--