typedef unsigned int size_t;
size_t
When should a variable be declared size_t? When it's associated with memory usage?
At any other time, should the variable be declared an unsigned int?
It's not a question of style, right?
It's partly a question of style.
My own view is that size_t should never have been introduced. It causes far more problems than it
solves.
The original idea was that it would hold the size in bytes of an object in memory. A typical 32-bit machine has a 4GB address space, but a signed int tops out just under 2GB. So if you want an object of over 2GB in size, you can't pass its size as an int to malloc(), which was the interface in old C.
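To put numbers on that, here is a minimal C99 sketch (the figures in the comments assume a typical 32-bit target; the %zu format is C99):

#include <limits.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* on a typical 32-bit system INT_MAX is 2147483647, just under 2GB,
       while size_t can count all the way up to about 4GB */
    printf("INT_MAX  = %d\n", INT_MAX);
    printf("SIZE_MAX = %zu\n", (size_t)-1);

    /* a ~3GB request can't be written as a positive int, but it is a
       perfectly ordinary size_t; the allocation itself will usually
       fail on a real 32-bit machine, which is fine for the demo */
    void *p = malloc((size_t)3000000000u);
    if (p == NULL)
        puts("allocation failed");
    free(p);
    return 0;
}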
But unfortunately the ANSI committee also used size_t for counts of objects in memory. If you have a string of over 2GB, an int won't hold the length. qsort() also takes two size_t arguments.
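For reference, the standard interfaces really do use size_t for those counts: strlen() returns one, and qsort() takes the element count and the element size as two of them. A small runnable illustration (C99 for the %zu format):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static int cmp_int(const void *a, const void *b)
{
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

int main(void)
{
    int v[] = { 3, 1, 2 };

    /* both the element count and the element size are size_t */
    qsort(v, sizeof v / sizeof v[0], sizeof v[0], cmp_int);

    /* strlen() likewise returns a size_t, not an int */
    size_t len = strlen("hello");

    printf("%d %d %d, len = %zu\n", v[0], v[1], v[2], len);
    return 0;
}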
But if your count of objects in memory is a size_t, then your index variable, which goes from 0 to N-1, must also be a size_t. That's where the problems start.
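To see how the index gets dragged along, here is a small sketch; the warning behaviour assumes a typical compiler invocation such as gcc -Wall -Wextra, which turns on -Wsign-compare:

#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *s = "abc";
    size_t n = strlen(s);            /* the count comes back as a size_t */

    for (int i = 0; i < n; i++)      /* typically warns: comparison between
                                        signed and unsigned integers */
        putchar(s[i]);
    putchar('\n');

    for (size_t i = 0; i < n; i++)   /* quiet, but now the index is a size_t too */
        putchar(s[i]);
    putchar('\n');
    return 0;
}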
Firstly, if sizes in bytes, counts of objects, index variables, and intermediate variables used in calculating indices are all size_t, then that's practically all the integers in a typical C program. So plain int fades away; it's no longer useful. Except that it's more intuitive to write "int i" when you want an integer than "size_t i" when i doesn't actually hold a size. So in fact code that uses size_t is littered with conversions from int to size_t.

The other problem is that size_t is unsigned, so you have to be careful with code like
for (i = 0; i < N - 1; i++)
where N is a count of items that might legitimately be zero. If i and N are ints, N - 1 is -1, the condition is false, and the loop body won't execute, which is probably the intention. If they are size_t, N - 1 wraps around to a huge positive value, and we get either a crash or a very long delay, depending on whether i indexes into memory or not.
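Here is that scenario as a minimal runnable program, with the count coming from strlen() of an empty string; the early break is only there so the demonstration doesn't actually take the very long delay it is illustrating:

#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *s = "";              /* empty string, so the count is zero */

    int    n_int  = (int)strlen(s);
    size_t n_size = strlen(s);

    /* signed: 0 < -1 is false, so the body never runs */
    for (int i = 0; i < n_int - 1; i++)
        printf("int loop ran at i = %d\n", i);

    /* unsigned: subtracting 1 from a size_t of 0 wraps to SIZE_MAX */
    printf("n_size - 1 = %zu\n", n_size - 1);

    /* so this loop would try to run SIZE_MAX times; the break keeps the
       demo short (if i were used to index into memory it would crash
       long before finishing) */
    for (size_t i = 0; i < n_size - 1; i++) {
        printf("size_t loop entered at i = %zu\n", i);
        break;
    }
    return 0;
}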
My own view is, don't use size_t at all. Just pass ints to the standard library functions and pretend it
was never invented. You're much more likely to get a size_t bug than to have to deal with N > 2G.
But of course I'm advocating writing code which, strictly, is incorrect. So it's hardly the ideal answer.
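For what it's worth, the style being advocated looks roughly like this; just a sketch, and the 2GB ceiling is exactly the "strictly incorrect" part:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    const char *msg = "hello, world";

    /* treat the length as an int and pretend size_t was never invented;
       fine as long as nothing ever approaches 2GB */
    int len = (int)strlen(msg);

    char *copy = malloc(len + 1);    /* the int converts to size_t implicitly */
    if (copy == NULL)
        return 1;
    memcpy(copy, msg, len + 1);

    for (int i = 0; i < len; i++)    /* plain int index, no wraparound traps */
        putchar(copy[i]);
    putchar('\n');

    free(copy);
    return 0;
}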
There isn't an ideal answer. The committee has imposed on us something that perhaps makes sense in the small embedded world, and certainly makes sense in a non-human way of thinking, but is just a danger to actual programmers writing scalable algorithms.