I've been utilising C for lots of small and a few medium-sized personal
projects over the course of the past decade, and I've realised lately
just how little progress it's made since then. I've increasingly been
using scripting languages (especially Python and Bourne shell) which
offer the same speed and yet are far simpler and safer to use. I can
no longer understand why anyone would willingly use C to program
anything but the lowest of the low-level stuff. Even system utilities,
text editors and the like could be trivially written with no loss of
functionality or efficiency in Python. Anyway, here are my reasons. I'd
be interested to hear some intelligent advantages (not
rationalisations) for using C.
Python and Perl are not as fast as compiled C. ... and they never will
be.
No string type
--------------
You find that a problem? Personally, I don't have many problems
working with strings. When I need some sort of functionality I write a
function and use it. Not exactly challenging.
Functions for insignificant operations
--------------------------------------
Oh no, you have to include a header. END OF THE WORLD!
And Perl doesn't have "require"???
The encouragement of buffer overflows
-------------------------------------
This is exactly like saying kitchen knives encourage stabbings.
Functions which encourage buffer overflows
------------------------------------------
They're unsafe, but they don't encourage jack squat. Any developer worth
their beans wouldn't use these functions, and then they won't get linked
into your application [at compile or run time]. The only downside is
that they are available at all.
You see, even if you're not writing any memory you can still access
memory you're not supposed to. C can't be bothered to keep track of the
ends of strings; the end of a string is indicated by a null '\0'
character. All fine, right? Well, some functions in your C library,
such as strlen(), will just run off the end of a 'string' if it
doesn't have a null in it. What if you're using a binary string?
Careless programming this may be, but we all make mistakes and so the
language authors have to take some responsibility for being so
intolerant.
It gives you control over the machine. Unorthodox things like that
were used for VGA programming back in the 90s [negative addressing
anyone?]. Just because you're too lazy to actually bounds check your
code doesn't mean others are.
No builtin boolean type
-----------------------
If you don't believe me, just watch:
$ cat > test.c
int main(void)
{
bool b;
return 0;
}
$ gcc -ansi -pedantic -Wall -W test.c
test.c: In function 'main':
test.c:3: 'bool' undeclared (first use in this function)
Not until the 1999 ISO C standard were we finally able to use 'bool' as
a data type. But guess what? It's implemented as a macro and one
actually has to include a header file to be able to use it!
So what? int == register size, so it makes sense to use an int for
flags. If you're tight on memory, use a bitfield, but that usually
ends up costing more code than it saves in RAM.
High-level or low-level?
------------------------
On the one hand, we have the fact that there is no string type and
little automatic memory management, implying a low-level language. On
the other hand, we have a mass of library functions, a preprocessor and
a plethora of other things which imply a high-level language. C tries
to be both, and as a result spreads itself too thinly.
Generally any significant program requires support libraries. E.g. a
networking lib, a threading lib, a crypto lib, a ... the standard C
library only provides the core essentials to get off the ground [I/O,
string ops, heap ops and a few things like bsearch and qsort]
The great thing about this is that when C is lacking a genuinely useful
feature, such as reasonably strong data typing, the excuse "C's a
low-level language" can always be used, functioning as a perfect
'reason' for C to remain unhelpfully and fatally sparse.
No, the answer "just write one and use it" comes to mind. I mean, I
don't write an AES routine every time I need a cipher. I wrote
[well...ported] one long ago and I just use it whenever I want. Why is
that so hard?
The original intention for C was for it to be a portable assembly
language for writing UNIX. Unfortunately, from its very inception C
has
Wrong. The original intent for C was a portable language low-level
enough to map fairly directly onto CPU instructions. That's why you
don't see string ops, classes, OOP, etc...
Integer overflow without warning
--------------------------------
Self explanatory. One minute you have a fifteen digit number, then try
to double or triple it and - boom - its value is suddenly
-234891234890892 or something similar. Stupid, stupid, stupid. How hard
would it have been to give a warning or overflow error or even just
reset the variable to zero?
It's possible to do this at run time; the resulting code will just be
much slower. If you know you're going to need big numbers, just use a
bignum lib like GMP or LibTomMath.
This is widely known as bad practice. Most competent developers
acknowledge that silently ignoring an error is a bad attitude to have;
this is especially true for such a commonly used language as C.
Overflows aren't always errors, e.g. a rotate:
x = ((x << 3) | (x >> 29)) & 0xFFFFFFFF;
Again this is about control. If I couldn't do this I would have to write
x = (((x & something) << 3) | (x >> 29));
but if the rotate count is dynamic [say RC5] then the "something" has
to be calculated on the fly [read: slow].
Shut up. I write code that compiles in dozens of compilers on many
dozens of platforms. At most I have small issues from time to time,
but they're rare. I personally maintain over 100k lines of code,
documentation, demos, etc. in my LibTom series, and most of this code
is used in production environments ranging from routers to desktops to
banking to console gaming, on processors ranging from ARM, PPC, SPARC
and MIPS to x86, with compilers such as MSVC, Borland, Metrowerks and
flavours of GCC.
It's about the "middle" road. I don't use C99 features I can avoid and
I don't use platform specifics [e.g. pragma or others]. The resulting
code "just works".
C is unable to adapt to new conditions for the sake of "backward
compatibility", throwing away the opportunity to get rid of stupid,
utterly useless and downright dangerous functions for a nonexistent
goal. And yet C is growing new tentacles and unnecessary features
because of idiots who think adding seven new functions to their C
library will make life easier. It does not.
You don't have to call gets(), you know?
<snip>
This post is just way too long. You're whining about things that just
sound like "oh I can't develop proper software and everyone else is to
blame".
C isn't the answer for everything. Anyone who says otherwise is a
liar. However, C does have very REAL uses and they're not just "low
level drivers" or such.
Tom