tough-to-explain Python


kj

Simon Forman said:
Frankly, I'm of the impression that it's a mistake not to start
teaching programming with /the bit/ and work your way up from there.
I'm not kidding. I wrote a (draft) article about this: "Computer
Curriculum" http://docs.google.com/View?id=dgwr777r_31g4572gp4
I really think the only good way to teach computers and programming is
to start with a bit, and build up from there. "Ontology recapitulates
phylogeny"


I happen to be very receptive to this point of view. I had the
benefit of that sort of training (one of the first computer courses
I took started, believe it or not, with Turing machines, through
coding in machine language, and compiler theory, and all the way
up to dabbling with Unix!), and I suspect that the reason it is
sometimes difficult for me to explain even relatively simple-looking
things to others is that I have this background that I unconsciously,
and incorrectly, take for granted in others... There is this
persistent idea "out there" that programming is a very accessible
skill, like cooking or gardening, anyone can do it, and even profit
from it, monetarily or otherwise, etc., and to some extent I am
actively contributing to this perception by teaching this course
to non-programmers (experimental biologists to be more precise),
but maybe this idea is not entirely true... Maybe, to get past
the most amateurish level, one has to, one way or another, come
face-to-face with bits, compilers, algorithms, and all the rest
that real computer scientists learn about in their formal training...

kj
 

Steven D'Aprano

Python has adopted the latter of the three for operator / and the
second one for operator //.

Up until a few years ago, / would perform integer division if both
arguments were ints or longs, and // didn't even exist. Even in Python
2.6, you still need to do

from __future__ import division

to get the behaviour you describe.
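A quick interactive sketch of the difference (the results shown are what
Python 2.6 prints; in Python 3 true division is already the default and
the __future__ import is a no-op):

    >>> 7 / 2                # classic / on Python 2.x: integer division
    3
    >>> 7 // 2               # floor division, spelled explicitly
    3
    >>> from __future__ import division
    >>> 7 / 2                # after the import, / is true division
    3.5
    >>> 7 // 2               # // still floors
    3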

I wonder if it was considered to just return
a fraction from that operation.

Because of his experience with ABC, Guido dislikes using fractions for
standard arithmetic. He was resistant to even allowing the rational
module to be added to the standard library.

The problem with fractions is that, unless you're very careful, repeated
arithmetic operations on numbers end up giving you fractions like

498655847957/569892397664

instead of the 7/8 you were expecting. Not only is that ugly, but it
becomes slower and slower as the denominator and numerator become larger.

At least, that was Guido's experience.
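For what it's worth, the fractions module that eventually landed in 2.6
makes the effect easy to see (an after-the-fact illustration, not
anything ABC-specific): let one inexact value into the computation and
the exact ratios balloon.

    >>> from fractions import Fraction
    >>> Fraction(7, 8) * Fraction(1, 10)           # exact all the way through
    Fraction(7, 80)
    >>> Fraction(7, 8) * Fraction.from_float(0.1)  # one binary float sneaks in
    Fraction(25220157913274779, 288230376151711744)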
 

Hans Nowak

Ulrich said:
Python has adopted the latter of the three for operator / and the second
one for operator //. I wonder if it was considered to just return a
fraction from that operation.

http://python-history.blogspot.com/2009/03/problem-with-integer-division.html

"For example, in ABC, when you divided two integers, the result was an exact
rational number representing the result. In Python however, integer division
truncated the result to an integer.

In my experience, rational numbers didn't pan out as ABC's designers had hoped.
A typical experience would be to write a simple program for some business
application (say, doing one’s taxes), and find that it was running much slower
than expected. After some debugging, the cause would be that internally the
program was using rational numbers with thousands of digits of precision to
represent values that would be truncated to two or three digits of precision
upon printing. This could be easily fixed by starting an addition with an
inexact zero, but this was often non-intuitive and hard to debug for beginners."
 

Steven D'Aprano

In <5f0a2722-45eb-468c-b6b2-b7bb80ae5f19@q11g2000yqi.googlegroups.com>
Simon Forman said:
Frankly, I'm of the impression that it's a mistake not to start teaching
programming with /the bit/ and work your way up from there. I'm not
kidding. I wrote a (draft) article about this: "Computer Curriculum"
http://docs.google.com/View?id=dgwr777r_31g4572gp4
I really think the only good way to teach computers and programming is
to start with a bit, and build up from there. "Ontology recapitulates
phylogeny"


I happen to be very receptive to this point of view. [...]
There is this persistent idea "out there" that
programming is a very accessible skill, like cooking or gardening,
anyone can do it, and even profit from it, monetarily or otherwise,
etc., and to some extent I am actively contributing to this perception
by teaching this course to non-programmers (experimental biologists to
be more precise), but maybe this idea is not entirely true...

There is some evidence that 30-60% of people simply cannot learn to
program, no matter how you teach them:

http://www.codinghorror.com/blog/archives/000635.html
http://www.cs.mdx.ac.uk/research/PhDArea/saeed/

I'm sympathetic to the idea, but not entirely convinced. Perhaps the
problem isn't with the students, but with the teachers, and the
languages:

http://www.csse.monash.edu.au/~damian/papers/PDF/SevenDeadlySins.pdf

(My money is that it's a little of both.)

Maybe, to
get past the most amateurish level, one has to, one way or another, come
face-to-face with bits, compilers, algorithms, and all the rest that
real computer scientists learn about in their formal training...

The "No True Scotsman" fallacy.

There's nothing amateurish about building software applications that
work, with well-designed interfaces and a minimum of bugs, even if you've
never heard of Turing Machines.
 

Paul Moore

2009/7/8 kj said:
There is this
persistent idea "out there" that programming is a very accessible
skill, like cooking or gardening, anyone can do it, and even profit
from it, monetarily or otherwise, etc., and to some extent I am
actively contributing to this perception by teaching this course
to non-programmers (experimental biologists to be more precise),
but maybe this idea is not entirely true...  Maybe, to get past
the most amateurish level, one has to, one way or another, come
face-to-face with bits, compilers, algorithms, and all the rest
that real computer scientists learn about in their formal training...

Look at it another way. Experimental biologists don't want to program,
they want to use computers to do experimental biology. It's a tool,
and they (quite reasonably) don't *care* about robustness,
portability, etc. Or even about programming, to be honest.

In the context of the original question, it's entirely reasonable (in
my view) to tell this audience "if the code does something weird you
don't understand, either ignore it and find another way or dig into
the manuals and experiment if you care". They'd very quickly find a =
a + b as a less confusing alternative to a += b. (As has been pointed
out earlier, to some extent a += b is quite an advanced construct -
after all, it's essentially an optimisation of a = a + b).
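Not always a *pure* optimisation, mind: for a mutable object such as a
list, a += b modifies the object in place, while a = a + b rebinds the
name to a brand-new object, and the difference shows whenever another
name refers to the same thing. A small illustrative session:

    >>> a = [1, 2]
    >>> b = a                # b and a name the same list
    >>> a = a + [3]          # builds a new list and rebinds a; b untouched
    >>> b
    [1, 2]
    >>> a = [1, 2]
    >>> b = a
    >>> a += [3]             # in-place for lists: b sees the change too
    >>> b
    [1, 2, 3]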

Biologists don't expect me to understand their discipline before I can
plant seeds in my garden, after all. (And when I do plant seeds, I
usually get far more surprising results than I could get from a += b
:))

Paul.
 

nn

Extremely helpful.  Thanks!  (I learned more from it than someone
who will teach the stuff would care to admit...)

And it confirmed Paul Graham's often-made assertion that all of
programming language design is still catching up to Lisp...

kj

One possibility is to avoid teaching mutable datatypes at the
beginning (i.e. no lists, dicts, and sets), then you would have
something more lispish. You would have to use tuples instead of lists
for everything. That avoids the gotchas of mutable types at the cost
of efficiency.
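A rough sketch of what that tuples-only style looks like in practice
(purely illustrative): every "change" builds a new object, so the
aliasing surprises of mutable types simply cannot arise.

    >>> point = (2, 3)
    >>> point + (5,)                   # "adding" always builds a new tuple
    (2, 3, 5)
    >>> values = (1, 4, 9)
    >>> tuple(v + 1 for v in values)   # transform by constructing anew
    (2, 5, 10)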

Python is not a purist language but rather walks a fine line between
highly flexible abstractions and a simple, fast implementation.
 

Dave Angel

kj said:
I happen to be very receptive to this point of view. I had the
benefit of that sort of training (one of the first computer courses
I took started, believe it or not, with Turing machines, through
coding in machine language, and compiler theory, and all the way
up to dabbling with Unix!), and I suspect that the reason it is
sometimes difficult for me to explain even relatively simple-looking
things to others is that I have this background that I unconsciously,
and incorrectly, take for granted in others... There is this
persistent idea "out there" that programming is a very accessible
skill, like cooking or gardening, anyone can do it, and even profit
from it, monetarily or otherwise, etc., and to some extent I am
actively contributing to this perception by teaching this course
to non-programmers (experimental biologists to be more precise),
but maybe this idea is not entirely true... Maybe, to get past
the most amateurish level, one has to, one way or another, come
face-to-face with bits, compilers, algorithms, and all the rest
that real computer scientists learn about in their formal training...

kj
And I learned to program Analog Computers, along with APL, in an early
course. In my case assembly language came later, but by then we had
covered the electronics behind transistors, and Karnaugh maps, logic
diagrams, and so on. By the time I graduated, I had five six-level
languages and two assembly languages under my belt.

..DaveA
 

Simon Forman

(I wanted to reply to a few messages in one post so I quoted them all
below. Let me know if this is bad etiquette.)

I happen to be very receptive to this point of view. I had the
benefit of that sort of training (one of the first computer courses
I took started, believe it or not, with Turing machines, through
coding in machine language, and compiler theory, and all the way
up to dabbling with Unix!), and I suspect that the reason it is
sometimes difficult for me to explain even relatively simple-looking
things to others is that I have this background that I unconsciously,
and incorrectly, take for granted in others... There is this

Yes! Once the concepts become so familiar that you call them
"intuitive" it seems to be very difficult to remember what they were
like before.

Something like "a = b" becomes "obvious" only after you've
internalized the preceding concepts.

persistent idea "out there" that programming is a very accessible
skill, like cooking or gardening, anyone can do it, and even profit
from it, monetarily or otherwise, etc., and to some extent I am

Programming is not like any other human activity. I've been reading
some of Prof. Dijkstra's EWDs in the last few days. In one [1] he
says, "automatic computers embody not only one radical novelty but two
of them", to wit: First, the huge scales that must be understood,
"from a bit to a few hundred megabytes, from a microsecond to a half
an hour of computing"; and second, "that the automatic computer is our
first large-scale digital device" which our until-now overwhelmingly
analog experience does not prepare us to deal with well.

He talks about how "when all is said and done, the only thing
computers can do for us is to manipulate symbols and produce results
of such manipulations" and he emphasises the "uninterpreted" nature of
mechanical symbol manipulation, i.e. that the machine is doing it
mindlessly.

Dijkstra[1]: "It is true that the student that has never manipulated
uninterpreted formulae quickly realizes that he is confronted with
something totally unlike anything he has ever seen before. But
fortunately, the rules of manipulation are in this case so few and
simple that very soon thereafter he makes the exciting discovery that
he is beginning to master the use of a tool that, in all its
simplicity, gives him a power that far surpasses his wildest
dreams." [1]

actively contributing to this perception by teaching this course
to non-programmers (experimental biologists to be more precise),

Experimental biologists? Well that's probably harmless. Mostly
harmless.
but maybe this idea is not entirely true... Maybe, to get past
the most amateurish level, one has to, one way or another, come
face-to-face with bits, compilers, algorithms, and all the rest
that real computer scientists learn about in their formal training...

kj

If you're never exposed to that constellation of concepts that
underpins "mechanical symbol manipulation" you are adrift in a sea
("c", ha ha) of abstractions.

However, if you /are/ exposed to the "so few and simple" rules of
manipulation the gates (no pun intended) to the kingdom are thrown
wide.


I happen to be very receptive to this point of view. [...]
There is this persistent idea "out there" that
programming is a very accessible skill, like cooking or gardening,
anyone can do it, and even profit from it, monetarily or otherwise,
etc., and to some extent I am actively contributing to this perception
by teaching this course to non-programmers (experimental biologists to
be more precise), but maybe this idea is not entirely true...

There is some evidence that 30-60% of people simply cannot learn to
program, no matter how you teach them:

http://www.codinghorror.com/blog/archives/000635.html
http://www.cs.mdx.ac.uk/research/PhDArea/saeed/

Thank you! That's exactly the paper that prompted me to write the
article I mentioned. (Now I don't have to go find the link myself.
Win!)

I don't buy it: I believe strongly that any normal person can learn to
program, to manipulate symbols to create formulae that guide the
machine in its uninterpreted symbol manipulation.

I find it significant that in the paper [2] they say, "Formal logical
proofs, and therefore programs – formal logical proofs that particular
computations are possible, expressed in a formal system called a
programming language – are utterly meaningless. To write a computer
program you have to come to terms with this, to accept that whatever
you might want the program to mean, the machine will blindly follow
its meaningless rules and come to some meaningless conclusion. In the
test the consistent group showed a pre-acceptance of this fact: they
are capable of seeing mathematical calculation problems in terms of
rules, and can follow those rules wheresoever they may lead. The
inconsistent group, on the other hand, looks for meaning where it is
not."

In other words the people who don't understand computers, don't
understand computers.

I think that "first hump" people can become "second hump" people but
that it requires teaching them the foundations first, not confronting
them with such incredible novelties as "a = b" and saying in effect,
"here you go buddy, sink or swim."

Quoting Dijkstra again [1]: "Before we part, I would like to invite
you to consider the following way of doing justice to computing's
radical novelty in an introductory programming course.

"On the one hand, we teach what looks like the predicate calculus, but
we do it very differently from the philosophers. In order to train the
novice programmer in the manipulation of uninterpreted formulae, we
teach it more as boolean algebra, familiarizing the student with all
algebraic properties of the logical connectives. To further sever the
links to intuition, we rename the values {true, false} of the boolean
domain as {black, white}.

"On the other hand, we teach a simple, clean, imperative programming
language, with a skip and a multiple assignment as basic statements,
with a block structure for local variables, the semicolon as operator
for statement composition, a nice alternative construct, a nice
repetition and, if so desired, a procedure call. To this we add a
minimum of data types, say booleans, integers, characters and strings.
The essential thing is that, for whatever we introduce, the
corresponding semantics is defined by the proof rules that go with
it."

Imagine my surprise: he's saying (with immensely greater brilliance
and eloquence) much what I said in my little article. The major
difference from what he's outlined is that I think the students should
implement the imperative programming language themselves in Forth, but
the gist is the same.

I'm sympathetic to the idea, but not entirely convinced. Perhaps the
problem isn't with the students, but with the teachers, and the
languages:

http://www.csse.monash.edu.au/~damian/papers/PDF/SevenDeadlySins.pdf

(My money is that it's a little of both.)

Hmm, that paper contains some good insights IMO, but I think they're
still missing the big picture, so to speak.

Really I suspect it's a case of "Programming languages considered
harmful."

The core abstractions of [mechanical] computation are just not that
complicated. You can teach them to anybody in about a half an hour,
drunk. I have.

After that, if they're interested, there is a smooth easy path to
"higher" abstractions: parsing, compiling, tree traversal and
transformation. (It is said that possession is 9/10s of the law, in
the same vein I would claim parsing is 9/10s of computer programming.)

I am beginning to suspect that concrete static (in the sense of
"standard" language specifications) languages are part of the
problem. Everyone gets so caught up in programming via languages that
you get, well, people trying to teach "Computer Programming" as if it
were only necessary to grok a language, rather than grokking /symbol
manipulation/ itself.

(Did you read that last paragraph and think, "Well how the heck else
are you supposed to program a computer if not in a computer
language?"? If so, well, that is kind of my point.)

The "No True Scotsman" fallacy.

There's nothing amateurish about building software applications that
work, with well-designed interfaces and a minimum of bugs, even if you've
never heard of Turing Machines.

I beg to differ. I recall a conversation with a co-worker who had
"learned" to program using PHP. Another co-worker and I were trying
to convince him that there was a good reason to differentiate between
hash tables and arrays. He didn't even know that they were different
"things".

I remember telling him, "between the metal and the desktop there is
nothing but layers of abstraction. We use different names precisely
because of different behaviours."

He made "well-designed interfaces", but "amateurish" is about the
nicest thing I would have called him.

As for "a minimum of bugs"... sigh. The "minimum of bugs" is zero, if
you derive your "uninterpreted formulae" /correctly/. Deriving
provably correct "programs" should be what computer science and
computer education are all about (not "java vocational training" as
Alan Kay once decried.)

Again with Dijkstra[3]: "The prime paradigma of the pragmatic designer
is known as "poor man's induction", i.e. he believes in his design as
long as "it works", i.e. until faced with evidence to the contrary.
(He will then "fix the design".) The scientific designer, however,
believes in his design because he understands why it will work under
all circumstances. The transition from pragmatic to scientific design
would indeed be a drastic change within the computer industry."

"Obviously no errors" is the goal to strive for, and I am comfortable
calling anyone an amateur who prefers "no obvious errors." (Actually
that's a little harsh on the amateurs, "ama" meaning love, "amateur"
is one who does something for love of it.)


Very cool.

kj

Hey, thank you!


Look at it another way. Experimental biologists don't want to program,
they want to use computers to do experimental biology. It's a tool,
and they (quite reasonably) don't *care* about robustness,
portability, etc. Or even about programming, to be honest.

I'd say it's just the opposite: to "use computers to do experimental
biology" they want to instruct that machine to manipulate their
(meaningful to them but meaningless to it) symbols in useful ways.
This is nothing more or less than programming.

The fact that they need to learn all sorts of details of a programming
language to do that is NOT because they can't grok programming. It's
because computer scientists have put too many layers of abstraction on
top of the "pure" symbol manipulation and then forgotten what they
have done.

I have a very nice book "Introduction to Programming and Problem
Solving with Pascal" that I picked up for $0.50 at a used bookstore
not long ago. It says, right on page 201, in the chapter on "Running,
Debugging, and Testing Programs":

"One of the nice features of programming in a high-level language like
Pascal is that it can be done with almost a total lack of
understanding of what a computer is and how it actually operates.
[...] There is no reason why someone who wants to write a computer
program should have to understand the electronic circuitry of a
computer, any more than someone learning to drive a car should have to
understand how the internal combustion engine works."

I think that's exactly wrong. What you're doing with computers doesn't
change from the bit to the high-level language. It's all symbol
manipulation according to the same set of rules, all the way up. The
elaboration becomes involved as you go up but the process does not
change qualitatively.

In the context of the original question, it's entirely reasonable (in
my view) to tell this audience "if the code does something weird you
don't understand, either ignore it and find another way or dig into
the manuals and experiment if you care". They'd very quickly find a =
a + b as a less confusing alternative to a += b. (As has been pointed
out earlier, to some extent a += b is quite an advanced construct -
after all, it's essentially an optimisation of a = a + b).

On that point I completely agree. The language is getting in the way
of the programming.
Biologists don't expect me to understand their discipline before I can
plant seeds in my garden, after all. (And when I do plant seeds, I
usually get far more surprising results than I could get from a += b
:))

Paul.

The discipline of programming is different than biology. It's
incredibly simple yet profound if it's taught as what it is, i.e.
automatic symbol manipulation.

No scientist is a stranger to logic and reasoned argument. They
shouldn't be strangers to telling their mechanical brains what to
"reason" about. Learning to program should be /easy/ for someone who
basically already gets it.

Wow, long post.

(Oh, and, er, it's ONTOGENY not Ontology that recapitulates phylogeny.
Heh. My bad.)

[1] "On the cruelty of really teaching computing science"
http://www.cs.utexas.edu/~EWD/transcriptions/EWD10xx/EWD1036.html


[2] "The camel has two humps"
http://www.cs.mdx.ac.uk/research/PhDArea/saeed/paper1.pdf


[3] "Can computing science save the computer industry?"
http://www.cs.utexas.edu/users/EWD/transcriptions/EWD09xx/EWD920.html
 

Dave Angel

greg said:
five six-level languages
     ^^^

Are they languages that you have to edit using vi? :)

Back then I didn't need glasses. That was of course intended to be
"six high-level languages".
 

Benjamin Kaplan

Everyone gets so caught up in programming via languages that
you get, well, people trying to teach "Computer Programming" as if it
were only necessary to grok a language, rather than grokking /symbol
manipulation/ itself.

+1 QOTW.

I'm a CS major and this is exactly what I see happening in the intro
to CS courses. In class, the "Intro to Programming" students are
shown how to write Java code. Then they try their homework and are
completely lost because they were never taught how to think like a
programmer so they don't understand how to approach the question.
 

Ethan Furman

Steven said:
There is some evidence that 30-60% of people simply cannot learn to
program, no matter how you teach them:

http://www.codinghorror.com/blog/archives/000635.html
http://www.cs.mdx.ac.uk/research/PhDArea/saeed/

I'm sympathetic to the idea, but not entirely convinced. Perhaps the
problem isn't with the students, but with the teachers, and the
languages:

http://www.csse.monash.edu.au/~damian/papers/PDF/SevenDeadlySins.pdf

(My money is that it's a little of both.)

Very interesting articles. Not surprising (to me, at least) -- everybody
has their strengths and weaknesses. I will never be, and could never
have been, any kind of sports person; I don't have the right
personality to be a good people person; I don't have the knack for
writing stories. Not everyone can do anything, or even many things.

~Ethan~
 

Steven D'Aprano

The core abstractions of [mechanical] computation are just not that
complicated. You can teach them to anybody in about a half an hour,
drunk. I have.

That's *easy*. Anyone can teach the most complicated and abstract
principles of any topic at all drunk. What's hard is doing it sober.

http://stackoverflow.com/questions/63668/confessions-of-your-worst-wtf-moment-what-not-to-do/350267#350267

or http://tinyurl.com/lur784


You'll excuse my skepticism about all these claims about how anyone can
program, how easy it is to teach the fundamentals of Turing Machines and
functional programming to anybody at all. Prove it. Where are your peer-
reviewed studies demonstrating such successes on randomly chosen people,
not self-selected motivated, higher-than-average intelligence students?

In the real world, human beings are prone to serious cognitive biases and
errors. Our brains are buggy. Take any list of reasoning fallacies, and
you'll find the majority of people have difficulty avoiding them. The first
struggle is to get them to even accept that they *are* fallacies, but
even once they have intellectually accepted it, getting them to live up
to that knowledge in practice is much, much harder.

In fact I'd go further and say that *every single person*, no matter how
intelligent and well-educated, will fall for cognitive biases and
fallacious reasoning on a regular basis.

http://en.wikipedia.org/wiki/Cognitive_bias
http://en.wikipedia.org/wiki/Fallacy
 

Steven D'Aprano

Programming is not like any other human activity.

In practice? In principle? Programming in principle is not the same as it
is performed in practice.

But in either case, programming requires both the logical reasoning of
mathematics and the creativity of the arts. Funnily enough,
mathematicians will tell you that mathematics requires the same, and so
will the best artists. I think mathematicians, engineers, artists, even
great chefs, will pour scorn on your claim that programming is not like
any other human activity.


[...]
He talks about how "when all is said and done, the only thing computers
can do for us is to manipulate symbols and produce results of such
manipulations" and he emphasises the "uninterpreted" nature of
mechanical symbol manipulation, i.e. that the machine is doing it
mindlessly.

"Manipulate symbols" is so abstract as to be pointless. By that
reasoning, I can build a "computer" consisting of a box open at the top.
I represent a symbol by an object (say, a helium-filled balloon, or a
stone), instead of a pattern of bits. I manipulate the symbol by holding
the object over the box and letting go. If it flies up into the sky, that
represents the symbol "Love is War", if it falls into the box, it
represents the symbol "Strength is Blue", and if it just floats there, it
represents "Cheddar Cheese". This is a deterministic, analog computer
which manipulates symbols. Great.

And utterly, utterly useless. So what is my computer lacking that real
computers have? When you have answered that question, you'll see why
Dijkstra's claim is under-specified.


Dijkstra[1]: "It is true that the student that has never manipulated
uninterpreted formulae quickly realizes that he is confronted with
something totally unlike anything he has ever seen before. But
fortunately, the rules of manipulation are in this case so few and
simple that very soon thereafter he makes the exciting discovery that he
is beginning to master the use of a tool that, in all its simplicity,
gives him a power that far surpasses his wildest dreams." [1]

What is an uninterpreted formula? If you don't interpret it, how can you
distinguish it from random noise?

If you're never exposed to that constellation of concepts that underpins
"mechanical symbol manipulation" you are adrift in a sea ("c", ha ha) of
abstractions.

However, if you /are/ exposed to the "so few and simple" rules of
manipulation the gates (no pun intended) to the kingdom are thrown wide.


What nonsense. The keys to the kingdom are the abstractions.

Here's an exercise for you, if you dare. It's not that difficult to
remove the abstraction from integer addition, to explain it in terms of
bit manipulation. If you prefer, you can write a Turing Machine instead.
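Just to show how small that exercise is, here is one way to do it in
Python: a software ripple-carry adder built from nothing but AND, XOR
and a left shift, for non-negative integers (the function name is mine,
purely illustrative):

    def add_bits(a, b):
        """Add two non-negative integers using only bitwise operations."""
        while b:
            carry = (a & b) << 1    # bit positions where both inputs are 1
            a = a ^ b               # sum of each bit pair, ignoring carries
            b = carry               # feed the carries back in
        return a

    assert add_bits(19, 23) == 42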

Now try doing the same thing for Quicksort. Don't worry, we'll wait.

Once you've done that, let's see you implement a practical, scalable, non-
toy webserver as a Turing Machine.

No, it's not the fundamental operations that open the doors to the
kingdom. It's the abstractions. The history of computing is to gain more
and more power by leveraging better and better abstractions.


On Jul 8, 9:10 am, Steven D'Aprano <st...@REMOVE-THIS-
cybersource.com.au> wrote: ....
I don't buy it:


I'm not entirely convinced myself. It's plausible -- not everyone has got
what it takes to be musically talented, or a great athlete, or a skilled
surgeon, or a charismatic leader. Why should programming be the one high-
level intellectual skill that everybody is good at? Seems mighty
implausible to me.

But I'm not sure that the distribution of skill is *fundamentally* bi-
modal. Maybe it is, maybe it isn't.

I believe strongly that any normal person can learn to
program, to manipulate symbols to create formulae that guide the machine
in its uninterpreted symbol manipulation.

Most people don't manipulate abstract symbols *at all* -- even something
as simple as:

"if x is to z as half y is to twice z, then x is the same as ____ y"
(fill in the blank)

will perplex most people.
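(For the record: read as a proportion, x : z = (y/2) : (2z), so
x/z = y/(4z) and x = y/4; the blank is "one quarter of".)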

I think that "first hump" people can become "second hump" people but
that it requires teaching them the foundations first, not confronting
them with such incredible novelties as "a = b" and saying in effect,
"here you go buddy, sink or swim."

The weakness of the paper is not that "a=b" is a novelty, but that it's
dangerously familiar. All their students will have already seen "a=b" in
the context of many years of mathematics. But that's not what it means in
this context. So the risk is, they will be interpreting the symbols "a=b"
in terms of mathematical equality, and getting confused.
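The clash is easy to demonstrate (a tiny illustrative session, not
something from the paper):

    >>> a = 3
    >>> b = 4
    >>> a = b          # assignment: copy b's *current* value into a
    >>> b = 99         # a later rebinding of b does not touch a
    >>> a
    4

In mathematics "a = b" is a timeless statement that two things are
equal; here it is an action carried out at a particular moment.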

Quoting Dijkstra again [1]: "Before we part, I would like to invite you
to consider the following way of doing justice to computing's radical
novelty in an introductory programming course.

"On the one hand, we teach what looks like the predicate calculus, but
we do it very differently from the philosophers. In order to train the
novice programmer in the manipulation of uninterpreted formulae, we
teach it more as boolean algebra, familiarizing the student with all
algebraic properties of the logical connectives. To further sever the
links to intuition, we rename the values {true, false} of the boolean
domain as {black, white}.

This is supposed to be a good thing?

Oh my. It must be nice in that ivory tower, never having to deal with
anyone who hasn't already been winnowed by years of study and self-
selection before taking on a comp sci course.

I don't think Dijkstra would have taken that suggestion seriously if he
had to teach school kids instead of college adults.


....
(Did you read that last paragraph and think, "Well how the heck else are
you supposed to program a computer if not in a computer language?"? If
so, well, that is kind of my point.)

No. What I thought was, how the hell are you supposed to manipulate
symbols without a language to describe what sort of manipulations you can
do?

I beg to differ. I recall a conversation with a co-worker who had
"learned" to program using PHP. Another co-worker and I were trying to
convince him that there was a good reason to differentiate between hash
tables and arrays. He didn't even know that they were different
"things".

I shall manfully resist the urge to make sarcastic comments about
ignorant PHP developers, and instead refer you to my earlier reply to you
about cognitive biases. For extra points, can you identify all the
fallacious reasoning in that paragraph of yours?

I remember telling him, "between the metal and the desktop there is
nothing but layers of abstraction. We use different names precisely
because of different behaviours."

But those abstractions just get in the way, according to you. He should
be programming in the bare metal, doing pure symbol manipulation without
language.

He made "well-designed interfaces", but "amateurish" is about the nicest
thing I would have called him.

Well, perhaps this is a failure of his profession, in that there isn't
enough specialisation. If he is skilled with making interfaces, and
unskilled with programming the backend, then he should work where his
strength lies, instead of being insulted by somebody who has an entirely
unjustified opinion about what "real programming" is.

If programming is symbol manipulation, then you should remember that the
user interface is also symbol manipulation, and it is a MUCH harder
problem than databases, sorting, searching, and all the other problems
you learn about in academia. The user interface has to communicate over a
rich but noisy channel using multiple under-specified protocols to a
couple of pounds of meat which processes information using buggy
heuristics evolved over millions of years to find the ripe fruit, avoid
being eaten, and have sex. If you think getting XML was hard, that's
*nothing* compared to user interfaces.

The fact that even bad UIs work at all is a credit to the heuristics,
bugs and all, in the meat.

As for "a minimum of bugs"... sigh. The "minimum of bugs" is zero, if
you derive your "uninterpreted formulae" /correctly/.

And the secret of getting rich on the stock market is to buy low, sell
high. Very true, and very pointless. How do you derive the formula
correctly?

Do you have an algorithm to do so? If so, how do you know that algorithm
is correct?

Deriving provably
correct "programs" should be what computer science and computer
education are all about

That's impossible. Not just impractical, or hard, or subject to hardware
failures, but impossible.

I really shouldn't need to raise this to anyone with pretensions of being
a Computer Scientist and not just a programmer, but... try proving an
arbitrary program will *halt*.

If you can't do that, then what makes you think you can prove all useful,
necessary programs are correct?
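The classic diagonal argument behind that challenge fits in a few lines
of Python; halts() below is purely hypothetical, which is exactly the
point:

    def halts(func):
        """Hypothetical oracle: True iff calling func() eventually halts."""
        raise NotImplementedError   # assume, for contradiction, it existed

    def troublemaker():
        if halts(troublemaker):     # if the oracle says we halt...
            while True:             # ...then loop forever;
                pass                # otherwise fall through and halt.

    # Whichever answer halts(troublemaker) gives, it is wrong, so no such
    # general oracle can be written.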


Again with Dijkstra[3]: "The prime paradigma of the pragmatic designer
is known as "poor man's induction", i.e. he believes in his design as
long as "it works", i.e. until faced with evidence to the contrary. (He
will then "fix the design".)

Yes, that is the only way it can be.

The scientific designer, however, believes
in his design because he understands why it will work under all
circumstances.

Really? Under *all* circumstances?

Memory failures? Stray cosmic radiation flipping bits? Noise in the power
supply?

What's that you say? "That's cheating, the software designer can't be
expected to deal with the actual reality of computers!!! We work in an
abstract realm where computers never run out of memory, swapping never
occurs, and the floating point unit is always correct!"

Fair enough. I don't want to make it too hard for you.

Okay, so let's compare the mere "pragmatic designer" with his provisional
expectation that his code is correct, and Dijkstra's "scientific
designer". He understands his code is correct. Wonderful!

Er, *how exactly* does he reach that understanding?

He reasons about the logic of the program. Great! I can do that. See,
here's a proof that binary search is correct. How easy this is.

Now try it for some non-trivial piece of code. Say, the Linux kernel.
This may take a while...

What guarantee do we have that the scientific designer's reasoning was
correct? I mean, sure he is convinced he has reasoned correctly, but he
would, wouldn't he? If he was unsure, he'd keep reasoning (or ask for
help), until he was sure. But being sure doesn't make him correct. It
just makes him sure.

So how do you verify the verification?

And speaking of binary search:

I was shocked to learn that the binary search program that Bentley PROVED
CORRECT and SUBSEQUENTLY TESTED [emphasis added] in Chapter 5 of
"Programming Pearls" contains a bug. Once I tell you what the it is, you
will understand why it escaped detection for two decades.
[end quote]

http://northernplanets.blogspot.com/2006/07/nearly-all-binary-searches-are-broken.html

or http://tinyurl.com/nco6yv

Perhaps the "scientific designer" should be a little less arrogant, and
take note of the lesson of real science: all knowledge is provisional and
subject to correction and falsification.

Which really makes the "scientific designer" just a slightly more clever
and effective "pragmatic designer", doesn't it?

The transition from pragmatic to scientific design would
indeed be a drastic change within the computer industry."

Given Dijkstra's definition of "scientific design", it certainly would.
It would be as drastic a change as the discovery of Immovable Objects and
Irresistible Forces would be to engineering, or the invention of a
Universal Solvent for chemistry, or any other impossibility.

"Obviously no errors" is the goal to strive for, and I am comfortable
calling anyone an amateur who prefers "no obvious errors."

It's not a matter of preferring no obvious errors, so much as understanding the
limitations of reality. You simply can't do better than no obvious errors
(although of course you can do worse) -- the only question is what you
call "obvious".
 

J. Cliff Dyer

If programming is symbol manipulation, then you should remember that the
user interface is also symbol manipulation, and it is a MUCH harder
problem than databases, sorting, searching, and all the other problems
you learn about in academia. The user interface has to communicate over a
rich but noisy channel using multiple under-specified protocols to a
couple of pounds of meat which processes information using buggy
heuristics evolved over millions of years to find the ripe fruit, avoid
being eaten, and have sex. If you think getting XML was hard, that's
*nothing* compared to user interfaces.

+1 QOTW! QOTM, even!
 

Terry Reedy

Steven said:
And speaking of binary search:

I was shocked to learn that the binary search program that Bentley PROVED
CORRECT and SUBSEQUENTLY TESTED [emphasis added] in Chapter 5 of
"Programming Pearls" contains a bug. Once I tell you what the it is, you
will understand why it escaped detection for two decades.
[end quote]

http://northernplanets.blogspot.com/2006/07/nearly-all-binary-searches-are-broken.html

or http://tinyurl.com/nco6yv

The actual report is at
http://googleresearch.blogspot.com/2006/06/extra-extra-read-all-about-it-nearly.html

The is the so called bug:
"In Programming Pearls Bentley says that the analogous line "sets m to
the average of l and u, truncated down to the nearest integer." On the
face of it, this assertion might appear correct, but it fails for large
values of the int variables low and high. Specifically, it fails if the
sum of low and high is greater than the maximum positive int value
(2^31 - 1). The sum overflows to a negative value, and the value stays
negative when divided by two. In C this causes an array index out of
bounds with unpredictable results. In Java, it throws
ArrayIndexOutOfBoundsException."

The is *not* a bug is Bentley program. It is a bug in bad, buggy, insane
integer arithmetic implementations. low + high should either return the
right answer, as it will in Python, or raise an overflow error.

tjr
 

Steven D'Aprano

Steven said:
And speaking of binary search:

I was shocked to learn that the binary search program that Bentley
PROVED CORRECT and SUBSEQUENTLY TESTED [emphasis added] in Chapter 5 of
"Programming Pearls" contains a bug. Once I tell you what the it is,
you will understand why it escaped detection for two decades. [end
quote]

http://northernplanets.blogspot.com/2006/07/nearly-all-binary-searches-are-broken.html

or http://tinyurl.com/nco6yv

The actual report is at
http://googleresearch.blogspot.com/2006/06/extra-extra-read-all-about-it-nearly.html

The is the so called bug:
"In Programming Pearls Bentley says that the analogous line "sets m to
the average of l and u, truncated down to the nearest integer." On the
face of it, this assertion might appear correct, but it fails for large
values of the int variables low and high. Specifically, it fails if the
sum of low and high is greater than the maximum positive int value
(2^31 - 1). The sum overflows to a negative value, and the value stays
negative when divided by two. In C this causes an array index out of
bounds with unpredictable results. In Java, it throws
ArrayIndexOutOfBoundsException."

The is *not* a bug is Bentley program.

Wow. That's an impressive set of typos :)

It is a bug in bad, buggy, insane
integer arithmetic implementations. low + high should either return the
right answer, as it will in Python, or raise an overflow error.

Is binary search *supposed* to raise an overflow error if given more than
2**30 elements? If not, then you're solving a different problem: it's
just like binary search, but it only works up to a maximum number of
elements. Now perhaps that's a legitimate problem to solve, but unless
you make the change in behaviour clear up front, your code has failed to
live up to specifications and therefore is buggy.

Is there something special about OverflowError that is "better" than
ArrayIndexOutOfBoundsException?

I don't have "Programming Pearls", and unfortunately the section on
binary search is not online:

http://netlib.bell-labs.com/cm/cs/pearls/sketch04.html

but there's no reason to imagine that the book will assume -- or even
state as a requirement for binary search -- that addition will never
overflow. Far from it: it seems to me that the author is very aware of
real world constraints, and in the early 1980s, BigInt types were not
something you took as a given.

Of course *binary search* as an algorithm is not broken. The bug was that
"Programming Pearls" neglected to take into account overflow when you
have more than 2**30 items. One way to take that into account is to
insist on using a language like Python where addition can never overflow.
But that's not the code presented in "Programming Pearls".
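For reference, one of the repairs given in that report is simply never
to form low + high when computing the midpoint. A sketch in Python,
where the guard only matters for fixed-width integer types (C, Java,
and friends), since plain Python ints cannot overflow:

    def binary_search(items, target):
        """Return an index of target in sorted items, or -1 if absent."""
        low, high = 0, len(items) - 1
        while low <= high:
            mid = low + (high - low) // 2   # never computes low + high,
                                            # so it cannot overflow
            if items[mid] == target:
                return mid
            elif items[mid] < target:
                low = mid + 1
            else:
                high = mid - 1
        return -1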
 

Hendrik van Rooyen

Steven D'Aprano said:
In practice? In principle? Programming in principle is not the same as it
is performed in practice.

But in either case, programming requires both the logical reasoning of
mathematics and the creativity of the arts. Funnily enough,

I do not buy this arty creativity stuff - or are you talking about
making a website look pretty?

mathematicians will tell you that mathematics requires the same, and so
will the best artists. I think mathematicians, engineers, artists, even
great chefs, will pour scorn on your claim that programming is not like
any other human activity.

So a chef is now an authority on programming?

Programming is actually kind of different - almost everything else is
just done, at the time that you do it.

Programming is creating stuff that is completely useless until it is
fed into something that uses it, to do something else, in conjunction
with the thing it is fed into, at a later time.

This is a highly significant difference, IMHO.

[...]
He talks about how "when all is said and done, the only thing computers
can do for us is to manipulate symbols and produce results of such
manipulations" and he emphasises the "uninterpreted" nature of
mechanical symbol manipulation, i.e. that the machine is doing it
mindlessly.

"Manipulate symbols" is so abstract as to be pointless. By that
reasoning, I can build a "computer" consisting of a box open at the top.
I represent a symbol by an object (say, a helium-filled balloon, or a
stone), instead of a pattern of bits. I manipulate the symbol by holding
the object over the box and letting go. If it flies up into the sky, that
represents the symbol "Love is War", if it falls into the box, it
represents the symbol "Strength is Blue", and if it just floats there, it
represents "Cheddar Cheese". This is a deterministic, analog computer
which manipulates symbols. Great.

And utterly, utterly useless. So what is my computer lacking that real
computers have? When you have answered that question, you'll see why
Dijkstra's claim is under-specified.

So if computers do not manipulate symbols, what is it that they do?
They sure cannot think,
or drink,
or reason,
or almost any verb you can think of.

"Manipulating symbols" is actually an elegant definition.
Try coming up with a better one and you will see.

And calling an abstraction pointless kind of contradicts
what you say later...

8<------- camel humps and other stuff -------------------

- Hendrik
 

Steven D'Aprano

I do not buy this arty creativity stuff. - or are you talking about
making a website look pretty?

I must admit, it never crossed my mind that anyone here would claim that
there was no creativity involved in programming, that it was all a
mindless, algorithmic process capable of being done by a simple
mechanical device.

This is certainly the accusation made against *bad* programmers -- that
they can't actually solve new, unique problems, but just apply recipes
they learned without any insight or intelligence. The sort of people who
program so poorly that "a trained monkey could do what they do".

Do you really think that applies to good programmers too? If so, then a
good code generator should be able to replace any programmer. Is that
what you believe?


So a chef is now an authority on programming?

Did I say that?

Chefs are authorities on OTHER HUMAN ACTIVITIES.


Programming is actually kind of different - almost everything else is
just done, at the time that you do it.

Programming is creating stuff that is completely useless until it is fed
into something that uses it, to do something else, in conjunction with
the thing it is fed into, at a later time.

Somebody should teach Hendrik that human beings have been creating TOOLS
for hundreds of thousands of years. People have been creating tools to
build tools for thousands of years. Software is just more of the same.

Even *soup stock* fits the same profile as what Hendrik claims is almost
unique to programming. On its own, soup stock is totally useless. But you
make it, now, so you can feed it into something else later on.

Or instant coffee.

No, Hendrik, if that's the best you can do, it's not very good. It is
rather sad, but also hilarious, that the most different thing you have
noticed about software is that it's just like instant coffee.

This is a highly significant difference, IMHO.
[...]
He talks about how "when all is said and done, the only thing
computers can do for us is to manipulate symbols and produce results
of such manipulations" and he emphasises the "uninterpreted" nature of
mechanical symbol manipulation, i.e. that the machine is doing it
mindlessly.

"Manipulate symbols" is so abstract as to be pointless. By that
reasoning, I can build a "computer" consisting of a box open at the top.
I represent a symbol by an object (say, a helium-filled balloon, or a
stone), instead of a pattern of bits. I manipulate the symbol by holding
the object over the box and letting go. If it flies up into the sky,
that represents the symbol "Love is War", if it falls into the box, it
represents the symbol "Strength is Blue", and if it just floats there,
it represents "Cheddar Cheese". This is a deterministic, analog computer
which manipulates symbols. Great.

And utterly, utterly useless. So what is my computer lacking that real
computers have? When you have answered that question, you'll see why
Dijkstra's claim is under-specified.
So if computers do not manipulate symbols, what is it that they do?

Did I say they don't manipulate symbols?

They
sure cannot think,
or drink,
or reason,

They can't reason? Then what are they doing when they manipulate symbols?

Yet again, it didn't even cross my mind that somebody would make this
claim. My entire point is that it's not enough to just "manipulate
symbols", you have to manipulate symbols the correct way, following laws
of logic, so that the computer can *mindlessly* reason.

or almost any verb you can think of.

If you're going to take that argument, then I'll remind you that there
are no symbols inside a computer. There are only bits. And in fact, there
aren't even any bits -- there are only analog voltages, and analog
magnetic fields.

"Manipulating symbols" is actually an elegant definition. Try coming up
with a better one and you will see.

As I said above, it's not enough to just manipulate symbols. Here's a set
of rules to manipulate symbols:

X => 0

or in English, "Any symbol becomes the zero symbol".

That's symbol manipulation. Utterly useless. This is why it's not enough
to just manipulate symbols, you have to manipulate them THE RIGHT WAY.
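For what it's worth, that entire "rewrite system" is a couple of lines
of Python, which is rather the point:

    def rewrite(symbols):
        # The whole rule set from above: every symbol becomes "0".
        return ["0" for _ in symbols]

    assert rewrite(["X", "Y", "love"]) == ["0", "0", "0"]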
 
