Because it's idiomatic, and most of the time, code follows that idiom.
Idioms don't have to have any reason other than "that's how it's been done
before". It's a communications tool; given a general pattern that it's the
varying part on the left, and the invariant part on the right, that's what
I expect whenever I see a comparison operator.
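To make that concrete, here's a throwaway sketch of my own (the names i,
max, and buf are arbitrary); both loops do exactly the same thing, and the
only difference is which way a reader has to scan the condition:

#include <stdio.h>

int main(void)
{
    int buf[] = { 1, 2, 3, 4, 5 };
    int max = 5;

    for (int i = 0; i < max; i++)    /* idiomatic: "while i is less than max" */
        printf("%d\n", buf[i]);

    for (int i = 0; max > i; i++)    /* same test, written against the grain */
        printf("%d\n", buf[i]);

    return 0;
}
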
Because that's how the majority of code has been written. Why is that? I
don't know. It's probably some combination of the pronunciation ("while
i is less than max" is more idiomatic than "while max is greater than or
equal to i") and the first few C books using it.
Why? Consider an ordinary reversal loop:
/* swap the elements at f and h, moving the two indices toward each other */
while (f < h)
{
    e = a[f];
    a[f++] = a[h];
    a[h--] = e;
}
Which is the constant now? Should it be f < h, or h >= f? (Strictly, they're
not equivalent, but in this case either will do.)
Indeed, in that case, either will do.
But in many cases, there's a clear preference, and even if you don't share
it, you will understand most code better and/or more quickly if you keep
that pattern in mind.
K&R. I don't think you'll find a single test in there which goes the
other way.
Again, it's an idiom. It doesn't need a reason beyond the observation that
people tend to follow it. There's no objective reason for most social
norms, or linguistic conventions, but once we have them, it's useful to
use them to communicate -- both to be aware that other people may be using
them, and to use them ourselves to make communication easier.
Even though it may not seem like much, in a complicated loop or set of
nested loops, having all the conditions follow a consistent idiom makes
it much easier to follow and comprehend the code. I'm not sure it would
have mattered which idiom was picked -- at this point, though, I've
seen thousands of loops with "p != NULL" as a condition, and extremely
few with "NULL != p", and similarly, thousands of "i < limit" and very
few "limit >= i", so when I see a condition, I read it that way first,
and only try something else if that works badly.
90% of the time, the heuristic is right, so I stick with it, and I
encourage other people to use it, because it's a very valuable tool.
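As a sketch of what I mean (my own made-up example; struct row and
sum_rows aren't from anywhere in particular), here's the shape of code
where keeping every condition variable-first pays off:

#include <stddef.h>

struct row {
    struct row *next;
    int        *values;
    size_t      count;
};

long sum_rows(struct row *p)
{
    long total = 0;

    while (p != NULL) {                        /* rather than "NULL != p" */
        for (size_t i = 0; i < p->count; i++)  /* rather than "p->count > i" */
            total += p->values[i];
        p = p->next;
    }
    return total;
}
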
It's the same reason I advocate "char *x" rather than "char* x" or
"char * x" or "char\n*\nx" as a declaration -- it's a convention and it
seems to generally help me and other readers understand the code. Maybe
it's not helpful for everyone, but I simply haven't seen it cause any
problems in living memory.
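For the record, all three spellings declare exactly the same thing; the
convention is purely about how the tokens read. A quick sketch (the names
are mine):

/* All of these declare a pointer to char; only the spacing differs. */
char *x1;
char* x2;
char * x3;

/* A side note of my own, beyond the convention argument: in a
 * multi-variable declaration the '*' binds to each declarator, so here
 * a is a char * but b is a plain char -- the "char *x" grouping reads
 * the way the compiler parses it. */
char *a, b;
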
-s