Usually from contamination with C. If a language has curly braces, or its
implementation is based on a curly-braced language, the odds are that its
arrays are 0-based.
Higher-level languages have less reason to start counting from zero, which
is less intuitive. Most counting in real life is 1-based.
1. Indexing isn't necessarily counting!
[ ] [ ] [ ]
^ ^ ^ ^
0 1 2 3
The array index is the left edge of the box; the count is the right edge.
The index answers: what is the displacement? How many elements come
before this one?
2. Counting is not one-based. It is zero-based. To count items, you must
start with zero:
count = 0
while (uncounted items remain) {
    check off next item
    count++
}
If there are no items, the count is zero. With the indexing diagram,
again, counting works like this:
step 0: initialize
[ ] [ ]
^
0
step 1: first box is counted
[ ] [ ]
^ ^
0 1
step 2: second box is counted
[ ] [ ]
^ ^ ^
0 1 2
C certainly uses one-based counting for array length: an array
which contains only element [0] has length 1.
I would be surprised if Lisp couldn't support N-based arrays if it wanted
to; if not then it would be the only thing it couldn't do.
Indeed, ANSI Common Lisp has displaced arrays: array objects which virtually
reference the data in other arrays, with displacement.
Those were not there in the beginning; just zero-based arrays.
Lua and Ada are 1-based. The latter also supports N-based ranges.
What's stupid is having the 1st, 2nd, 3rd and 4th elements of an array
indexed as 0, 1, 2 and 3.
Not at all. Index 0 indicates that the array is empty before we push
the first element there.
These concepts can coexist in a language. Lisp:
(elt '(a b c) 0) -> a
(first '(a b c)) -> a
The symbols first, second, ... are never subject to scaling or
displacement, and correspond to natural language concepts.
Note that clocks measure the day from 00:00 to 23:59, not from 01:01
to 24:60. People generally do not have a problem with this.
Also, countdown timers go to zero. If you cook something with your
microwave for 27 seconds, it starts at 27, and counts down to zero,
once per second.
When year 2000 rolled around, numerous people around the world
thought that it's the start of the new millennium and celebrated.
Those pointing out that it actually runs from 2001 to 3000 were ridiculed
as dweebs and party poopers.
"Ordinary people" can, and do, regard zero based systems as natural,
while at the same time regarding one based counting as natural also,
depending on context.
If the choice is *only* between 0 and 1, then 0 is more versatile. But it's
not hard to allow both or any. Both 0 and 1 bases have their uses; 1-based I
think is more useful as the default base.
So "useful" and "versatile" are opposites, of sorts.
But imagine you had a language feature that looked like this:
x = ( n | a, b, c, ... | z);
which selects one of a, b, c etc. depending on n (with z being a default
value); i.e., it selects the nth value.
Should n be 1-based (so n=1 selects a), or 0-based (n=1 selects b); which is
more natural?
Zero all the way, without a question.
For instance, suppose I want to regard that list as pairs: I want to select
either (a, b) or (c, d) based on n. It's easy: just take elements 2*n
and 2*n+1. If n is 1-based, I have to do algebra: 2*(n-1) and 2*(n-1)+1,
which simplify to 2n-2 and 2n-1.
Indexing multi-dimensionally gets even more awkward.