Is this some kind of troll? Possibly, however...
My own experience with programming makes me feel that counting arrays
from 0 is the natural thing. If you are handling an array of 5 items
starting at x, going from x+0 to x+4 is more concise than going from
x+1-1 to x+5-1. There are other examples in which I have found it
convenient.
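
To make that concrete, a minimal Ruby sketch (the array name a and its
contents are just an illustration):

    a = %w[v w x y z]                 # 5 items

    # 0-based: the offsets run from 0 to 4
    (0..4).each { |i| puts a[i] }

    # 1-based: every offset drags a "- 1" correction along
    (1..5).each { |i| puts a[i - 1] }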
So is mine, but here my Ada background eventually comes in handy:
we are used to declaring our own boundaries.
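
For what it's worth, here is a toy sketch of what I mean, written in
Ruby; the class name and the offset trick are my own invention, not
anything Ada or Ruby gives you out of the box:

    # A wrapper that lets you declare your own bounds, roughly as Ada does.
    class BoundedArray
      def initialize(first, last)
        @first = first
        @store = Array.new(last - first + 1)
      end

      def [](index)
        @store[index - @first]
      end

      def []=(index, value)
        @store[index - @first] = value
      end
    end

    week = BoundedArray.new(1, 7)     # indices 1..7, declared by me
    week[1] = "Monday"
    week[7] = "Sunday"
    puts week[1]                      # => Monday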
I believe it is all habit, but being used to 0 makes life easier.
We could also argue about the values #times is yielding, right?
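In case anyone has not checked lately, #times yields 0 up to n-1:

    5.times { |i| print i, " " }      # prints: 0 1 2 3 4
    puts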
Chinese count birthdays starting with the child's first year as year
1. As a Westerner, do you consider it natural to describe a child as
2 years old on its "first" birthday?
You see, it is all habit, and it shows some intellectual
short-sightedness to dismiss ideas foreign to our paradigm -- that's
why I am doing it all the time.
It may be counterintuitive, but in the mathematical world it is quite
natural. Or maybe "common" would be a better word.
In Austria, where I went to school, 0 is not a positive number. In
France I am constantly confronted with the convention that a whole
number is either positive or negative, with 0 defined as part of the
former. It drives me crazy, but it makes lots of texts much shorter --
they never seem to use "natural number" for the positives including 0.
But maybe some learned francophone mathematician can mail me off-list;
I am only working with engineers, natural as an engineer, you know.
Cheers
Robert