user923005 said:
Can those same languages create objects with a size too large to be
held in an integer?
Consider this Java code:
byte[] foo = new byte[N];
N must be of type int. In Java, an int is a 32-bit signed value.
Therefore, you can't create a byte array with more than 2^31-1
elements.
Now consider this Java code:
short[] foo = new short[N];
Presumably, this could work on a 64-bit JVM, where N = 2^31-1.
The size of the resulting object, in bytes, is then larger than the
maximum value a Java int can hold.
Full disclosure: I do not have access to a system capable of testing
this. These conclusions are based on my understanding of the Java
language.
If 'yes', then those languages are defective. If 'no', then returning
an integer is correct.
A pointless observation. All programming languages are defective in
at least one way or another. ALL of them.
My point stands: Somehow, other programming languages get by just fine
returning an int when asked for the length of a string.
I can create a language with a single type. Somehow, I think it will
be less effective than C for programming tasks.
You may decide a programming language with only signed integer types
is less effective than C for programming tasks if you like; however,
that doesn't diminish the success or usefulness of those other
languages.
Nor is that the only thing that should be considered when choosing a
programming language.
The way to minimize the pain of writing 100% portable code is to write
it correctly, according to the language standard. For instance, that
would include using size_t for object sizes. Now, pre-ANSI C did not
have size_t. So that code will require effort to repair.
Writing 100% portable C code is extremely non-trivial and when taken
to an extreme can interfere with the progress of a project.
I understand why size_t was invented, but I suspect a more pragmatic
approach may have been superior, such as returning int from strlen()
instead of size_t.