Experiences/guidance on teaching Python as a first programming language


Chris Angelico

I suspect that your manual skills are rather better than mine. One of my
favourite expressions, perhaps because I only ever heard my dad use it, is
"like watching a cow handle a shotgun".

Heh. In D&D terms, I think that would be a DEX of 5 or 6. The average
for humans is 10-11, the average for adventurers is 12-13.

ChrisA
 

Gene Heskett

I suspect that your manual skills are rather better than mine. One of
my favourite expressions, perhaps because I only ever heard my dad use
it, is "like watching a cow handle a shotgun".

I'll plead to using a jig, and figure I have a good fit when I have to
drive it together with a deadblow hammer.

Cheers, Gene
--
"There are four boxes to be used in defense of liberty:
soap, ballot, jury, and ammo. Please use in that order."
-Ed Howdershelt (Author)
Genes Web page <http://geneslinuxbox.net:6309/gene>

IBM's original motto:
Cogito ergo vendo; vendo ergo sum.
A pen in the hand of this president is far more
dangerous than 200 million guns in the hands of
law-abiding citizens.
 

rusi

Yes. In esr's essay on becoming a hacker[1] he says:
"""There is perhaps a more general point here. If a language does too
much for you, it may be simultaneously a good tool for production and
a bad one for learning."""

There is this principle by Buchberger called the "Black-box White-box
principle". Unfortunately I can only find mathematicians talking about it
http://www.math.rutgers.edu/~zeilberg/Opinion65.html
and no CS-ists/programmers.

e.g.
To teach OS, Minix is better than Linux.
To use, Linux is better.
FreeBSD may be a good middle point. It may also be a bad middle point --
practically too hard to use or study. Which is why, in practice, separating
teaching tools from professional ones is better than thrashing about using
the same for both.

Definitely true, though I think it has exceptions.

Yeah. As esr says, Python is an exception.
And even here, as it progresses, it becomes more professional and less educational.
 

Steven D'Aprano

I had a paper some years ago on why C is a horrible language *to teach
with* http://www.the-magus.in/Publications/chor.pdf


Nice paper! I have a few quibbles with it, but overall I think it is very
good.

I believe people did not get then (and still dont) that bad for
- beginner education (CS101)
- intermediate -- compilers, OS, DBMS etc
- professional software engineering

are all almost completely unrelated

I do not believe that they are "almost" unrelated. I think that, in
general, there is a very high correlation between languages which are
easy to use correctly and languages which are easy to learn. (Not the
other way around -- languages which are easy to learn may only be so
because you can't do much with them.)

If your aim is to challenge yourself, as the hacker ethos often leads to,
then C is an excellent language to learn because both *learning* and
*using* the language is a challenge. If you just want to write a program
which works correctly with as little fuss as possible, you surely
wouldn't choose C unless it was the only language you knew.

There are many reasons why languages fail to become popular among hackers
(PHP and Flash are too déclassé, being used by *cough spit* web
developers; Java is for people who wear suits and ties; Forth is, well
Forth is just too weird even for hackers who like Lisp, Scheme, or
Haskell). But the popular old-school hacker languages like Perl, Lisp and
C have three things in common:

- they grew organically, and so have little in the way of design
constraints (apart from the most fundamental, which the average
programmer doesn't even recognise as a constraint -- see the Blub
paradox);

- they are powerful and can do (nearly) anything, with sufficient hard
work;

- and they are challenging to use.

That last one is, I believe, the key. Nothing will get hackers and
programmers sneering at a language as being "a toy" or "not a real
language" faster than if it makes programming too easy, particularly if
there are performance or functionality costs to such ease of use.

I would really like to see good quality statistics about bugs per program
written in different languages. I expect that, for all we like to make
fun of COBOL, it probably has fewer bugs per unit-of-useful-work-done than
the equivalent written in C.

Of course, this is very hard to measure: different languages require
different amounts of code to get something useful done. Different
languages get used for different things -- there are no operating system
kernels written in COBOL, although there are plenty of business apps
written in C. There are vast differences in software methodologies. But I
think that people intuitively grasp that if something is hard to learn,
as C is, chances are very good that it is equally hard to use even for
experts. Why do you think that C programs often have so many bugs and
vulnerabilities, such as buffer overflows and the like? It's not just
down to lousy coders.
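
To make that concrete, here is a minimal hand-rolled sketch (not taken
from any real code base) of the sort of thing a C compiler accepts
without complaint:

#include <stdio.h>
#include <string.h>

/* Copies caller-supplied input into a fixed buffer with no bounds
   check: anything longer than 15 characters silently writes past the
   end of buf, and the compiler is perfectly happy with it. */
static void greet(const char *name)
{
    char buf[16];
    strcpy(buf, name);                        /* no length check at all     */
    printf("Hello, %s\n", buf);
}

/* The careful version has to carry the bound around itself. */
static void greet_safely(const char *name)
{
    char buf[16];
    snprintf(buf, sizeof buf, "%s", name);    /* truncates, never overflows */
    printf("Hello, %s\n", buf);
}

int main(void)
{
    greet("short");                           /* fine, by luck of the input */
    greet_safely("a deliberately over-long string that would have overflowed");
    return 0;
}

Nothing in the language pushes you towards the second version; you just
have to know.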

In your paper, you quote Meyer:

I knew that the bulk of the student's time was spent fighting
tricky pointer arithmetic, chasing memory allocation bugs,
trying to figure out whether an argument was a structure or a
pointer, making sure the number of asterisks was right, and so
on...

It's not just students who have to do all these things. They may become
easier with experience, but "someone who chases memory allocation bugs"
is practically the definition of a C programmer.

Good C programmers are good in spite of C, not because of it.

Languages like C, which lack clean design and semantics, mean the
programmer has to memorise a lot of special cases in order to become
expert. People who would otherwise make excellent programmers except for
their difficulty memorising special cases will make poor C coders -- they
will always be fighting the language. Or they will learn just a small
subset of the language, and always use that. If the compiler is efficient
enough, as C compilers typically are, you'll never know from the
performance that it was written poorly or non-idiomatically.
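
To make "special cases" concrete, here is a small sketch (invented for
illustration, nothing more) of the kind of rule that has to be memorised
rather than deduced:

#include <stdio.h>

int main(void)
{
    int *p, q = 3;        /* special case: p is a pointer, q is a plain int */
    const int *a = &q;    /* pointer to const int: *a may not be assigned   */
    int *const b = &q;    /* const pointer to int: b may not be reseated    */

    p = &q;
    *b = 4;               /* legal: the pointee itself is not const         */
    printf("%d %d %d\n", *p, *a, *b);   /* prints: 4 4 4                    */
    return 0;
}

None of it is hard once memorised; the point is that it has to be
memorised rather than worked out from first principles.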
 

Rick Johnson

Of course, this is very hard to measure: different languages require
different amounts of code to get something useful done. Different
languages get used for different things -- there are no operating system
kernels written in COBOL, although there are plenty of business apps
written in C. There are vast differences in software methodologies. But I
think that people intuitively grasp that if something is hard to learn,
as C is, chances are very good that it is equally hard to use even for
experts. Why do you think that C programs often have so many bugs and
vulnerabilities, such as buffer overflows and the like? It's not just
down to lousy coders.

I have a real life example of such horrendous design flaws
involving a highway intersection. A while back, can't
remember when, a friend of mine was involved in an accident
that was his fault. This surprised me because I consider
this person to be a very cautious driver.

After checking records, I was amazed to find a high
occurrence of traffic incidents that mirror the exact
conditions of my friend's accident, and in almost every
incident, the person at fault runs through the red light.

This seemed odd: how could so many people be making
the same mistake? The sheer number of signal violations
would exclude malevolent intentions.

So being the curious chap I am, I investigated further. I
myself traveled the course my friend took the day of the
fateful accident.

The course involves starting from a traffic signal on one
side of the freeway, following a long left turn lane under
the freeway, and then emerging on the other side to cross the
opposing feeder road -- it is at this point that the
accidents happen with great frequency!

There are two distinct design flaws contributing:

1. The bridge itself is obscuring the view of the
second signal. The second signal is not visible until
the motorists are very close -- much too close in my
opinion!

But I feel #2 is the *real* contributing factor!

2. The second signal and the first signal are not
synchronized, creating an inconsistency between both
signals. For example, sometimes you can catch both
lights green, but sometimes, the second light will
change unexpectedly whilst you're navigating the long
left turn, THUS requiring all traffic to stop under the
freeway before crossing the opposing feeder road.

...and this poor design is resulting
in injuries on a regular basis!!!

The problem is, sometimes people don't stop. Sometimes they
simply "assume" that the light will be green because
stopping under a bridge does not "feel" normal. Of course
they must accept the blame for not being more alert;
however, at what point does the fallibility of humans excuse
poor interface design?

Humans are by nature INCAPABLE of maintaining perfect
alertness, and driving requires more processing than the
human mind can possibly muster. Your mind is constantly
attempting to "predict the future" outcome of current
events, and it is this unconscious mechanism that, when
overloaded, will force the less acute modality of intuition
to propagate up and take full control.

It is for that very reason that we must design interfaces
with the fallibility of human operators in mind -- at least
until we can remove the human from the equation.

We MUST strive to achieve the highest level of
intuitiveness whilst eliminating any and all inconsistencies
from the system.
 

Oscar Benjamin

Thanks for sharing your experiences, Wolfgang. I think many of my
students have a similar experience after learning C and it is
interesting to hear it from your perspective many years later.

I was also taught C as an undergrad but, having already learned Java, C
and C++ before arriving at university, I found the C course very easy,
so my own experience is not representative. Many of the other students
at that time found the course too hard and just cheated on all the
assignments (I remember one student offering to fix/finish anyone's
assignment in exchange for a bottle of cider!).
I had a paper some years ago on why C is a horrible language *to teach with*
http://www.the-magus.in/Publications/chor.pdf

Thanks for this Rusi, I just read it and it describes very well what I
think about our own C course. My choice quote from the beginning would
be "When the irrelevant becomes significant, the essentials become
obscured and incomprehensible."

(BTW is there any reason that the document is repeated twice in the same pdf?)

As a case in point one of my tutees asked for help with his C
assignment last week. I looked at his code and it was a complete mess.
I explained roughly what it should look like and he explained that he
had had so much trouble figuring out how to get the compiler to pass a
pair of strings into a function that he had given up and used global
variables instead. He's just not ready yet to get an intuitive
understanding of where to put the asterisks in order to make it work -
and as you point out in that paper the rules for where the asterisks
go are hopelessly inconsistent.
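
For the record, a minimal sketch of roughly what he was being asked to
write (the names are invented; this is not his actual assignment):

#include <stdio.h>

/* Each string arrives as a pointer to its first character. */
static void print_pair(const char *first, const char *second)
{
    printf("%s %s\n", first, second);
}

int main(void)
{
    char a[] = "hello";
    char b[] = "world";
    print_pair(a, b);     /* the arrays decay to char * at the call site */
    return 0;
}

Three lines of actual content, but only once array decay and asterisk
placement have stopped being mysterious.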

A couple of weeks before, another of my tutees brought their
assignment which was about dynamic memory allocation (~7 weeks into
her first programming course). She had just written something like
char *x = (char*)malloc(31*sizeof(char));
for a global x at the top of the file. So the message about dynamic
memory allocation was entirely lost in the details of C: "dynamic
memory allocation means using malloc".
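
For contrast, a hand-rolled sketch (again invented, not the actual
assignment) of what run-time allocation is meant to demonstrate: a size
that isn't known until run time, a checked result, and a matching free:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    char name[256];
    if (scanf("%255s", name) != 1)
        return 1;

    size_t n = strlen(name) + 1;   /* size only known at run time        */
    char *copy = malloc(n);        /* no cast needed in C                */
    if (copy == NULL)              /* malloc can fail                    */
        return 1;

    memcpy(copy, name, n);
    printf("copied: %s\n", copy);
    free(copy);                    /* every malloc needs a matching free */
    return 0;
}

The fixed-size global at the top of the file sidesteps every one of
those points.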

These types of problems are compounded by the fact that the current C
course uses automated marking so a program that produces the correct
output gets full marks even if it is terribly written and the student
entirely misses the point - another thing about this course that
definitely needs to change.
I believe people did not get then (and still dont) that bad for
- beginner education (CS101)
- intermediate -- compilers, OS, DBMS etc
- professional software engineering

are all almost completely unrelated

Agreed.


Oscar
 

Steven D'Aprano

These types of problems are compounded by the fact that the current C
course uses automated marking so a program that produces the correct
output gets full marks even if it is terribly written and the student
entirely misses the point

This suggests that even the lecturers can't read C, and so have got one
of their post-grad students to write an automated tester so they don't
have to.

Only-half-joking-ly y'rs,
 

Chris Angelico

I was also taught C as an undergrad but having already learned Java, C
and C++ before arriving at University I found the C course very easy
so my own experience is not representative. Many of the other students
at that time found the course too hard and just cheated on all the
assignments (I remember one students offering to fix/finish anyone's
assignment in exchange for a bottle of cider!).

Student cheats on assignment and gets, in effect, a fraudulent
certification. (Piece of paper claims competence, competence doesn't
exist.) Graduating student shows certification to employer. Employer
hires ex-student, because employer doesn't know good code from bad
(hence hiring someone). Ex-student writes a pile of junk, then leaves
for a better opportunity. Real programmer is hired, or seconded from
another project, to fix a few small issues in ex-student's code.
Lunatic asylum gains another patient.

It's all too common. I'd like to tell people that they're only
cheating themselves, but the trouble is, they're cheating other people
a lot more.

ChrisA
 

Neil Cerutti

I would really like to see good quality statistics about bugs
per program written in different languages. I expect that, for
all we like to make fun of COBOL, it probably has fewer bugs per
unit-of-useful-work-done than the equivalent written in C.

I can't think of a reference, but I seem to recall that
bugs-per-line-of-code is nearly constant; it is not language
dependent. So, unscientifically, the more work you can get done
in a line of code, then the fewer bugs you'll have per amount of
work done.
 

Wolfgang Keller

It's not just the abysmally appalling, hideously horrifying syntax.
I've never heard C syntax reviled quite so intensely. What syntax do
you like, out of curiosity?

Pascal, Python, if written by someone who uses semantic identifiers and
avoids using C(++)/Java-isms. I've seen Eiffel as well (without
understanding it) and it didn't look ridiculous to me.

In short, syntax that contains the strict minimum of "special"
characters (delimiting lists etc. with brackets is OK to me), and
almost exclusively human-readable words. Although, if you push it to the
extreme: AppleScript is nice to read, but much less nice to write,
imho... :-/

C, C++, Java, Javascript, PHP, Perl etc., however, are just
unspeakable <expletives>.

<rant>

BTW; Yes, I do *hate* those C(++)-isms (or Java-isms) that have started
to sneak into Python in the past ~10 years. Using e.g. == for
comparisons is just braindead. Use := for assignments instead, because
that's mathematical syntax. And that "@" for decorators is, well, who
proposed it? I'd like to cut off all his fingers with a bolt cutter.
The same for people who use augmented assignments, "syntax shortcuts"
or abbrvtd idtfrs. Ship them all to Fukushima, one way, no return
ticket. Learn to touch-type, get an editor with decent syntax
completion, or just stop wreaking havoc on the world economy with your
laziness. Code is read a hundred times more often than it is typed.

</rant>

Sincerely,

Wolfgang
 

Steven D'Aprano

Well, there was that little Y2K thing...

Oh come on, how were people in the 1990s supposed to predict that they
would be followed by the year 2000???

That's a good point, but that wasn't a language issue, it was a program
design issue. Back in the 70s and 80s, when saving two digits per date
field seemed to be a sensible thing to do, people simply didn't imagine
that their programs would still be used in the year 1999[1]. That's not
the same sort of bug as (say) C buffer overflows, or SQL code injection
attacks. It's not like the COBOL language defined dates as having only
two digits.
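
A toy sketch, invented purely for illustration, of how the two-digit
convention bites:

#include <stdio.h>

int main(void)
{
    int birth_yy = 65;    /* stored as "65", meaning 1965                  */
    int today_yy = 99;    /* 1999                                          */
    printf("age in 1999: %d\n", today_yy - birth_yy);   /* 34, as expected */

    today_yy = 0;         /* 2000, stored as "00"                          */
    printf("age in 2000: %d\n", today_yy - birth_yy);   /* -65, oops       */
    return 0;
}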




[1] What gets me is that even in the year 1999, there were still
programmers writing code that assumed two-digit years. I have it on good
authority from somebody working as an external consultant for a bank in
1999 that he spent most of 1998 and 1999 fixing *brand new code* written
by the bank's own staff. You'd think that having lived through that
experience would have shaken his belief that private enterprise does
everything better, and the bigger the corporation the better they do it,
but apparently not. Go figure.
 

Mark Lawrence

Well, there was that little Y2K thing...

Oh come on, how were people in the 1990s supposed to predict that they
would be followed by the year 2000???

That's a good point, but that wasn't a language issue, it was a program
design issue. Back in the 70s and 80s, when saving two digits per date
field seemed to be a sensible thing to do, people simply didn't imagine
that their programs would still be used in the year 1999[1]. That's not
the same sort of bug as (say) C buffer overflows, or SQL code injection
attacks. It's not like the COBOL language defined dates as having only
two digits.




[1] What gets me is that even in the year 1999, there were still
programmers writing code that assumed two-digit years. I have it on good
authority from somebody working as an external consultant for a bank in
1999 that he spent most of 1998 and 1999 fixing *brand new code* written
by the bank's own staff. You'd think that having lived through that
experience would have shaken his belief that private enterprise does
everything better, and the bigger the corporation the better they do it,
but apparently not. Go figure.

I was in charge of the team at work that had to make all code Y2K
compliant. I discovered the one bug that to my knowledge slipped
through the net. Four years later back at the same place on contract I
fixed the fix!!!
 

Larry Martell

I was in charge of the team at work that had to make all code Y2K compliant.
I discovered the one bug that to my knowledge slipped through the net. Four
years later back at the same place on contract I fixed the fix!!!

From around 1997 till 2000 all I did was fix Y2K bugs. I'm pretty sure
I got them all. For one client I fixed well over 200. After the new
year came and nothing broke, the owner of the company said "You made
such a big deal about this Y2K stuff, and it turned out not to be a
problem at all."
 

rusi

Hahaha -- very funny and serious. I've actually experienced being
kicked out of a job for writing decent working code and not making a big
deal of it.

This comes back to the start of the thread -- what do we teach students?
Should we teach how to write the best possible code, as effortlessly
as possible? Or should we also teach how to make a fuss, how to pretend
to (over)work while actually (under)delivering?

In a Utopia this would not be a question at all.
But we don't live in Utopia...

[And there are languages WAY better than C... C++ for example]
 

Grant Edwards

The problem with the C class wasn't that it was "hard". I had passed my
Pascal class, which taught nearly exactly the same issues with
"straight A"s before (without ever having written any source code ever
before). And by standard cognitive testing standards, I'm not exactly
considered to be an idiot.

I agree that C is an awful pedagogical language. When I was in
university, the first language for Computer Science or Computer
Engineering students was Pascal. After that, there were classes that
surveyed Prolog, SNOBOL, LISP, Modula, APL, FORTRAN, COBOL, etc. If
you were an "other" engineering/science major, you learned FORTRAN
first (and last). I think there may also have been some business
types who were taught BASIC.

C wasn't taught at all. When I graduated and started doing real-time
embedded firmware, the choices were generally C or Pascal. The first
projects I did were in Pascal, but I learned C because the development
host was a PDP-11 running Unix and I needed to write some small
(non-embedded) utilities. Today, all my embedded work is in C. Pascal
fell out of style for some reason, but (with a few extensions) it was
a fine language for embedded work as well.

I've always thought C was a great language for low-level, bare-metal,
embedded stuff -- but teaching it to first or second year computer
science students is just insane. C has a certain minimalist
orthogonality that I have always found pleasing. [People who smile
wistfully when they think about the PDP-11 instruction word layouts
probably know what I mean.]

But, exposure to C should wait until you have a firm grasp of basic
algorithms and data structures and are proficient in assembly language
for a couple different architectures. Ideally, you should also have
written at least one functioning compiler before learning C as well.
 

Oscar Benjamin

The problem with the C class wasn't that it was "hard". I had passed my
Pascal class, which taught nearly exactly the same issues with
"straight A"s before (without ever having written any source code ever
before). And by standard cognitive testing standards, I'm not exactly
considered to be an idiot.

Please don't misunderstand me: I'm certainly not saying that you're an
idiot. Also I'm sure many of the students on my course would have
fared better on a course that was using e.g. Python instead of C.

Well actually come to think of it some of the other students were
pretty stupid. The lecturer had explained that they were using a
plagiarism detector so if you copy-paste code from someone else they
could catch you out for cheating. A few people took that literally and
thought that it could detect copy-pasting (in plain text files!). The
rumour went round that it would be okay if you printed out the code
and then typed it back in. For some reason they didn't bother running
the plagiarism detector until about 6 weeks into the course, by which
time ~20% of submissions were exact duplicates of at least one other
(according to the lecturer who announced that all such students would
get zero marks for those assignments).


Oscar
 

Larry Martell

I did that, but my fee was a case of beer.
I agree that C is an awful pedagogical language. When I was in
university, the first language for Computer Science or Computer
Engineering students was Pascal. After that, there were classes that
surveyed Prolog, SNOBOL, LISP, Modula, APL, FORTRAN, COBOL, etc. If
you were an "other" engineering/science major, you learned FORTRAN
first (and last). I think there may also have been some business
types who were taught BASIC.

C wasn't taught at all.

It wasn't for me either when I went to college in the late 1970's.
Pascal first, then FORTRAN, then IBM 360 assembler. That was all the
formal language training I had. (I had taught myself BASIC in high
school.)
 
