* James Kanze:
To write correct programs, you have to understand what's going
on before the code is written. You can't use a debugger on code
that hasn't been written yet.
Hm, I think you have too competent people around you... ;-)
Fact is, most students (unless they've been taught by someone a bit
unconventional like me) have only the vaguest grasp of what various
program constructs do, of what's really going on, because many and
perhaps most of them are taught by the monkey-see-monkey-do principle:
patterns they have seen are repeated and slightly adapted to new
circumstances, without any real understanding.
In the first year of computer science, students benefit a lot from
seeing an execution position actually jump around in a loop, seeing the
effects on variables, seeing that there is a call stack, and so on.
That also holds for graduates starting in a company, because they're
not likely to have experienced that. I guess that you would never dream
of placing a 1 MB object in a local variable unless you had a very good
reason for doing so and knew the exact circumstances under which that
code would be executed; a lot of students and fresh employees do such
things unthinkingly, because they haven't yet experienced any problem
with it.
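For concreteness, a toy example of the kind of thing I mean (my own
sketch, not from any actual course material): a small loop to watch, a
call stack to inspect, and a 1 MB local variable whose cost only
becomes apparent when you watch the stack frames pile up in a debugger.

  #include <iostream>

  // Each call puts a 1 MB buffer on the stack. Default stack sizes are
  // small (roughly 1 MB on Windows, about 8 MB on typical Linux
  // setups), so even a couple of nested calls like this can die with a
  // stack overflow.
  void useBigLocal( int depth )
  {
      unsigned char buffer[1024*1024];        // 1 MB local variable
      for( int i = 0; i < 16; ++i )           // a loop to single-step
      {
          buffer[i] = static_cast<unsigned char>( i + depth );
      }
      std::cout << "depth " << depth
                << ", buffer at " << static_cast<void*>( buffer ) << "\n";
      if( depth > 0 ) { useBigLocal( depth - 1 ); }   // grow the call stack
  }

  int main()
  {
      useBigLocal( 2 );    // may already overflow a 1 MB default stack
  }

On paper it looks harmless; single-stepping it and watching the buffer
addresses march downwards a megabyte at a time is what makes the point
stick.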
Debuggers can be useful for understanding poorly written and
poorly documented existing code.
That's most code in existence...
Or rather, :-(.
But that's not normally what
students are concerned with. They're supposed to be learning
how to write good code; by banning the debugger, you remove the
crutch which allows them to get by without understanding what
they are doing.
On the contrary, a debugger allows them to understand what they're
doing, at a much deeper and more concrete level than book lernin'.
Actually, I meant by students. Professionals use them mostly
for legacy code, or code from some outside source, which isn't
correctly documented or cleanly written. They can be of
enormous help there. The professionals I know don't use them at
all for the code they write themselves. (In many of the places
I've worked, there hasn't even been a debugger available. And
nobody missed them.)
Again, I think you have too competent people around you... ;-)
Lucky you.
If you have to check and figure out documentation, a debugger
can be a valuable asset. Even gdb. But I think you'd agree
that if the documentation requires checking and figuring out,
that's a problem in itself.
Oh yes, it is a problem.
Unfortunately that's generally the case with Microsoft's documentation.
Microsoft is a big company that makes a lot of libraries that people use
in their real-world programs.
Then there's a ditto company called Sun.
Not to mention one called IBM (although IBM is generally very very
methodical, that doesn't help when the method is all at the wrong level
of abstraction).
And for contemporary teaching I guess Google is unavoidable.
Although I have next to no experience using Google's public libraries
(they're mostly JavaScript), it is telling that almost everything is
designated "beta", so I suspect that if I sat down and made the
forensic map application I thought of yesterday, I would be spending
quite some time using various debugging techniques to teach myself the
ins and outs of those libraries, and to find out what's reality versus
what's in the docs.
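With a native library the exercise typically looks something like this
(a hypothetical gdb session; the commands are standard gdb, but the
program and function names are made up):

  $ g++ -g -O0 mapapp.cpp -o mapapp      # debug info, no optimization
  $ gdb ./mapapp
  (gdb) break SomeVendorCall             # stop where the docs go vague
  (gdb) run
  (gdb) backtrace                        # how did we really get here?
  (gdb) info args                        # what was actually passed in?
  (gdb) step                             # what does the code really do with it?
  (gdb) finish                           # ...and what does it actually return?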
We certainly don't want to teach
students that they should write documentation that needs
checking and figuring out, or even that they can get away with
it. It's precisely because this is the major use of a debugger
that I would generally ban its use by students.
Ah, well, the days are past when students wrote only small programs
that relied on nothing but Pascal's built-in Read and Write. But even
then debuggers were a great boon to understanding.
Cheers, & hth.,
- Alf