On 12/05/14 14:30, Malcolm McLean wrote:
Therefore, /if/ you can afford the resources (typically memory), then it
is almost always better to use fixed sizes that are determined at
compile time. A statically allocated array of fixed size is better in
almost every way than a dynamically malloc'ed array, except that you
cannot (easily) reuse the same memory for other purposes, and you need
to know the sizes at compile time.
This is the reason why high reliability and safety-critical systems
usually ban any sort of dynamic memory. And that is precisely the sort
of system that requires /real/ rigorous development methods, and /real/
good design. When you start using "dynamic this" and "general that",
you end up with a system that cannot be analysed and characterised
properly, and consequently has much greater risks for reliability.
The Turing machine running out of tape is a fundamental theoretical problem,
I agree.
You're mixing up "dynamic memory" with "non-fixed-size arrays". That's because C
doesn't enforce a rigorous distinction between an array and a buffer. C
programmers tend to say
char line[1024];
fgets(line, 1024, fp);
"Ok, I've got an array of 1024 bytes".
They don't. They've got a buffer of 1024 bytes, holding an array of however many
characters fgets happened to read, plus the nul terminator.
You are inventing a distinction between what /you/ call a buffer, and
what /you/ call an array. I think it is likely that our opinions are
not as far apart as first seems regarding dynamic and fixed sizes in
different types of program - it is just that you have invented your own
ideas about what "fixed-size array" and "non-fixed-size array" mean.
So let's get this anchored in the /real/ world - the one in which we
program in C, as defined by the C standards, using terminology common in
the standards as well as literature about the C language.
char line[1024];
This defines a fixed-size array of 1024 characters. It can be used to
store anything the user likes, such as strings, characters, or any other
data. It /always/ has a fixed length of 1024 characters, and it is
/always/ an array. It might not always contain 1024 characters worth of
useful or valid data, but that does not affect the size of the array.
fgets(line, 1024, fp);
This reads up to 1023 characters from "fp" into "line" (fgets stores at
most size-1 characters, plus the terminating nul). We commonly say
"line" is a "buffer" here - that is one use of an array.
After the "fgets", line is /still/ a fixed-length array of 1024
characters. Only a certain number of characters in it are actually
valid - one can say "there are 'X' characters in buffer 'line'". But
the data type and size of line have never changed - it is always a
fixed-length array of size 1024.
Dynamic memory and algorithms which scale in N are obviously linked ideas, but
they're not quite the same thing. If we call strlen() on line, we'd expect strlen to
scale gracefully to any line length. That's obvious; few people would make the mistake
of hardcoding strlen() to a limited buffer size of 1024. But when it's less obvious,
people do often write dependencies into code that they shouldn't.
No, people who are serious programmers expect dependencies and
limitations on all aspects of the system and the code. We are not
programming for Turing machines - we understand that things are limited.
I certainly do /not/ expect strlen() to scale gracefully to any line
length - I expect it to be limited by the target, and /usually/ I expect
it to handle lines of any /practical/ length for the target and
application in question. But I don't expect it to "scale gracefully" to
strings longer than 2^31 on a 32-bit target. I don't expect it to be
happy with strings longer than 2^15 on a 16-bit target - even if the
target has more than 64K memory. On some targets, I don't expect it to
work at all on strings in flash, because I know that on some targets,
flash and ram have different memory spaces. For some types of
programming, I don't expect it to work at all because I cannot be sure
the target string is properly terminated. And on some targets, I don't
expect it to work for long strings because I have limitations on how
much time a function is allowed to take. And on some targets, I might
well make my own strlen() function hardcoded to a limit of 1024 in order
to minimise the damage if it were called on an unterminated string,
because I know that no valid string would be longer than 1024 bytes.
You are making all sorts of unwarranted assumptions because you don't
understand the type of programming that is done in C, and you think your
own little niche of experience covers everything.
Of course, I don't disagree that people often write dependencies when
they should not, or have dependencies or limitations that are not
obvious and not documented, and people often fail to use static
assertions and static compile-time checks when they could catch
conflicts with these limitations.