No one has posted a counterexample, which strongly suggests that
there isn't one.
I've used embedded systems where memory was ordinary RAM from location 0 upwards.
These were embedded systems designed with RAM at the bottom of the
memory map and ROM at the top because that was the simplest way to do
it. In such circumstances it would definitely make sense for NULL to be
something other than all bits zero, so as not to waste a usable location.
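To make that concrete, here is a minimal sketch in ordinary standard C of
the difference between the null pointer constant you write in source and
the bit pattern the implementation actually stores:

#include <stdio.h>
#include <string.h>

int main(void)
{
    int *p = 0;              /* the constant 0 in source always yields a
                                null pointer, whatever bit pattern the
                                implementation uses to represent it    */
    int *q;
    memset(&q, 0, sizeof q); /* forces an all-bits-zero representation */

    printf("%d\n", p == NULL);  /* always prints 1 */

    /* q == NULL is NOT guaranteed; on a machine like the one described
       above, where address 0 is ordinary RAM, it could well be false. */
    return 0;
}

On most desktop implementations the two happen to coincide, which is why
the difference is so easy to overlook.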
Seeing real examples also helps understanding: if a newbie thinks that
something is likely which he will in fact never encounter, that causes confusion.
Humans are not natural lawyers who can learn a rule divorced from
practical examples.
To be a good programmer you have to learn to understand and work with
abstractions, and that is all NULL, the null pointer constant and null
pointers need to be to you. They are all just ways of representing a
guaranteed invalid pointer value.
You also have to learn to accept that some things which are incorrect
will work 99.9% of the time, but the time they fail is almost certainly
going to be the worst possible time for you.
A lot of code is like that. For instance IFF files have 4-byte ASCII
identifiers. It is natural to write if(!strcmp(tag, "HEAD")), but of
course this will break on non-ASCII machines.
No, you write something like:
if(!strcmp(tag, IFF_HEAD_TAG))
with a #define in an appropriate header.
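For example (the header name is mine, and tag is assumed to hold the four
identifier bytes already copied out of the chunk header and terminated
with '\0'):

/* iff_tags.h - hypothetical header collecting the chunk identifiers */
#define IFF_HEAD_TAG "HEAD"

/* in the reader */
#include <string.h>
#include "iff_tags.h"

int is_head_chunk(const char *tag)
{
    return !strcmp(tag, IFF_HEAD_TAG);
}

That keeps the spelling of each tag in exactly one place.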
It is rarely much of a problem, since in most environments you can assume
ASCII. If you put the ASCII codes in instead, the tag would become unreadable.
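If portability to a non-ASCII machine ever did matter, the header is the
one place you would change, at the cost of readability; something like:

#define IFF_HEAD_TAG "\x48\x45\x41\x44"  /* the ASCII codes for "HEAD" */

works everywhere, but no longer tells the reader what it spells.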
If you look at the Chinese character sets you might find this is not
true. Also, if you look at, for example, the character sets as used by
Germans you will find that not all characters counted as letters are
caught by:
if ((ch >= 'a' && ch <= 'z') || (ch >= 'A' && ch <= 'Z'))
since you have various accented letters.
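The usual fix is to let the library do the classification. A small sketch,
assuming an ISO 8859-1 German locale is installed under the name used
below (locale names vary between systems):

#include <ctype.h>
#include <locale.h>
#include <stdio.h>

int main(void)
{
    unsigned char ch = 0xE4;    /* 'a' with umlaut in ISO 8859-1 */

    if ((ch >= 'a' && ch <= 'z') || (ch >= 'A' && ch <= 'Z'))
        puts("range test says letter");
    else
        puts("range test says not a letter");   /* this branch runs */

    /* assumed locale name; setlocale() returns NULL if it is absent */
    if (setlocale(LC_CTYPE, "de_DE.ISO8859-1") != NULL && isalpha(ch))
        puts("isalpha() says letter");

    return 0;
}

isalpha() still works one byte at a time, so it does not help with
multi-byte Chinese encodings; for those you need the wide-character
functions such as iswalpha().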