I had a paper some years ago on why C is a horrible language *to teach
with*
http://www.the-magus.in/Publications/chor.pdf
Nice paper! I have a few quibbles with it, but overall I think it is very
good.
I believe people did not get then (and still don't) that bad for
- beginner education (CS101)
- intermediate -- compilers, OS, DBMS etc
- professional software engineering
are all almost completely unrelated
I do not believe that they are "almost" unrelated. I think that, in
general, there is a very high correlation between languages which are
easy to use correctly and languages which are easy to learn. (Not the
other way around -- languages which are easy to learn may only be so
because you can't do much with them.)
If your aim is to challenge yourself, as the hacker ethos often leads to,
then C is an excellent language to learn because both *learning* and
*using* the language are a challenge. If you just want to write a program
which works correctly with as little fuss as possible, you surely
wouldn't choose C unless it was the only language you knew.
There are many reasons why languages fail to become popular among hackers
(PHP and Flash are too déclassé, being used by *cough spit* web
developers; Java is for people who wear suits and ties; Forth is, well,
Forth is just too weird even for hackers who like Lisp, Scheme, or
Haskell). But the popular old-school hacker languages like Perl, Lisp and
C have three things in common:
- they grew organically, and so have little in the way of design
constraints (apart from the most fundamental, which the average
programmer doesn't even recognise as a constraint -- see the Blub
paradox);
- they are powerful and can do (nearly) anything, with sufficient hard
work;
- and they are challenging to use.
That last one is, I believe, the key. Nothing will get hackers and
programmers sneering at a language as "a toy" or "not a real language"
faster than making programming too easy, particularly if there are
performance or functionality costs to such ease of use.
I would really like to see good quality statistics about bugs per program
written in different languages. I expect that, for all we like to make
fun of COBOL, it probably has fewer bugs per unit-of-useful-work-done than
the equivalent written in C.
Of course, this is very hard to measure: different languages require
different amounts of code to get something useful done. Different
languages get used for different things -- there are no operating system
kernels written in COBOL, although there are plenty of business apps
written in C. There are vast differences in software methodologies. But I
think that people intuitively grasp that if something is hard to learn,
as C is, chances are very good that it is equally hard to use even for
experts. Why do you think that C programs often have so many bugs and
vulnerabilities, such as buffer overflows and the like? It's not just
down to lousy coders.
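Just to be concrete about the sort of thing I mean, here's a minimal
made-up fragment (not from the paper or anyone's real code). The
commented-out line compiles without a murmur, and the "careful" version
shows how much bookkeeping the language leaves to the programmer:

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char buf[8];
        const char *input = "longer than eight bytes";

        /* strcpy(buf, input);
         * The line above compiles cleanly, yet silently writes past the
         * end of buf -- a textbook buffer overflow and undefined
         * behaviour. Nothing in the language stops you. */

        /* The careful version: the programmer, not the language, must
         * remember the bound and restore the terminating NUL. */
        strncpy(buf, input, sizeof buf - 1);
        buf[sizeof buf - 1] = '\0';

        printf("%s\n", buf);   /* prints "longer " (truncated to 7 chars) */
        return 0;
    }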
In your paper, you quote Meyer:
I knew that the bulk of the student's time was spent fighting
tricky pointer arithmetic, chasing memory allocation bugs,
trying to figure out whether an argument was a structure or a
pointer, making sure the number of asterisks was right, and so
on.
It's not just students who have to do all these things. They may become
easier with experience, but "someone who chases memory allocation bugs"
is practically the definition of a C programmer.
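As a made-up illustration of the "structure or pointer, how many
asterisks" business Meyer describes (the struct and function names here
are invented for the example, along with the manual allocation
bookkeeping that goes with it):

    #include <stdio.h>
    #include <stdlib.h>

    struct point { int x, y; };

    /* Takes a copy of the structure: changes are invisible to the caller. */
    static void nudge_by_value(struct point p) { p.x += 1; }

    /* Takes a pointer: now the caller's structure really changes, but
     * every access needs the right number of dereferences. */
    static void nudge_by_pointer(struct point *p) { p->x += 1; }

    int main(void)
    {
        struct point *p = malloc(sizeof *p);   /* forget free() and you leak */
        if (p == NULL)
            return 1;

        p->x = 0;
        p->y = 0;

        nudge_by_value(*p);    /* pass the structure itself: no effect */
        nudge_by_pointer(p);   /* pass its address: x becomes 1 */

        printf("x = %d\n", p->x);   /* prints "x = 1" */

        free(p);   /* the "chasing memory allocation bugs" part of the job */
        return 0;
    }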
Good C programmers are good in spite of C, not because of it.
Languages like C, which lack clean design and semantics, mean that the
programmer has to memorise a lot of special cases in order to become
expert. People who would otherwise make excellent programmers except for
their difficulty memorising special cases will make poor C coders -- they
will always be fighting the language. Or they will learn just a small
subset of the language, and always use that. If the compiler is efficient
enough, as C compilers typically are, you'll never know from the
performance that it was written poorly or non-idiomatically.
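To pick one invented illustration of those special cases (mine, not from
the paper): an array parameter silently decays to a pointer, so the very
same sizeof expression answers two different questions depending on
where you write it.

    #include <stdio.h>

    /* An "array" parameter is really a pointer: the declared size is
     * ignored by the compiler. */
    static size_t length_inside(int a[10])
    {
        return sizeof a / sizeof a[0];   /* sizeof(int *) / sizeof(int), not 10 */
    }

    int main(void)
    {
        int a[10] = {0};

        /* Here sizeof sees the whole array and the answer is 10 ... */
        printf("at the call site: %zu\n", sizeof a / sizeof a[0]);

        /* ... but inside the function the array has decayed to a pointer,
         * so the "same" expression gives 2 on a typical 64-bit machine. */
        printf("inside the callee: %zu\n", length_inside(a));

        return 0;
    }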