What is different with Python?

Andrea Griffini

I think new CS students have more than enough to learn with their
*first* language without having to discover the trials and tribulations
of memory management (or those other things that Python hides so well).

I'm not sure that postponing learning what memory
is, what a pointer is and other "bare metal"
problems is a good idea. Those concepts are not
"more complex" at all, they're just more *concrete*
than the abstract concept of "variable".
The human mind works best moving from the concrete
to the abstract; we first learn counting, and only
later we learn rings (or even set theory).
Unless you think a programmer can live happily
without understanding concrete issues, IMO it is
best to learn concrete facts first, and
only later abstractions.
I think that for a programmer skipping the
understanding of the implementation is just
impossible: if you don't understand how a
computer works you're going to write pretty
silly programs. Note that I'm not saying that
one should understand every possible implementation
down to the bit (that's of course nonsense), but
there should be no room for "magic" in a computer
for a professional programmer.

Also concrete->abstract shows a clear path; starting
in the middle and looking both up (to higher
abstractions) and down (to the implementation
details) is IMO much more confusing.

Andrea
 
Roy Smith

Andrea Griffini said:
I think that for a programmer skipping the
understanding of the implementation is just
impossible: if you don't understand how a
computer works you're going to write pretty
silly programs. Note that I'm not saying that
one should understand every possible implementation
down to the bit (that's of course nonsense), but
there should be no room for "magic" in a computer
for a professional programmer.

How far down do you have to go? What makes bytes of memory, data busses,
and CPUs the right level of abstraction?

Why shouldn't first-year CS students study "how a computer works" at the
level of individual logic gates? After all, if you don't know how gates
work, things like address bus decoders, ALUs, register files, and the like
are all just magic (which you claim there is no room for).

Digging down a little deeper, a NAND gate is magic if you don't know how a
transistor works or can't do basic circuit analysis. And transistors are
magic until you dig down to the truly magical stuff that's going on with
charge carriers and electric fields inside a semiconductor junction.
That's about where my brain starts to hurt, but it's also where the quantum
mechanics are just getting warmed up.

Also concrete->abstract shows a clear path; starting
in the middle and looking both up (to higher
abstractions) and down (to the implementation
details) is IMO much more confusing.

At some point, you need to draw a line in the sand (so to speak) and say,
"I understand everything down to *here* and can do cool stuff with that
knowledge. Below that, I'm willing to take on faith". I suspect you would
agree that's true, even if we don't agree just where the line should be
drawn. You seem to feel that the level of abstraction exposed by a
language like C is the right level. I'm not convinced you need to go that
far down. I'm certainly not convinced you need to start there.
 
Mike Meyer

Andrea Griffini said:
Also concrete->abstract shows a clear path; starting
in the middle and looking both up (to higher
abstractions) and down (to the implementation
details) is IMO much more confusing.

So you're arguing that a CS major should start by learning electronics
fundamentals, how gates work, and how to design hardware(*)? Because
that's what the concrete level *really* is. Start anywhere above that,
and you wind up needing to look both ways.

Admittedly, at some level the details simply stop mattering. But where
that level is depends on what level you're working on. Writing Python,
I really don't need to understand the behavior of hardware
gates. Writing horizontal microcode, I'm totally f*cked if I don't
understand the behavior of hardware gates.

In short, you're going to start in the middle. You can avoid looking
down if you avoid certain classes of problems - but not everyone will
be able to do that. Since you can only protect some of the students
from this extra confusion, is it really justified to confuse them all
by introducing what are really extraneous details early on?

You've stated your opinion. Personally, I agree with Abelson, Sussman
and Sussman, whose text "The Structure and Interpretation of Computer
Programs" was the standard text at one of the premiere engineering
schools in the world, and is widely regarded as a classic in the
field: they decided to start with the abstract, and deal with concrete
issues - like assignment(!) later.

<mike

*) "My favorite programming langauge is solder." - Bob Pease
 
Peter Hansen

Mike said:
So you're arguing that a CS major should start by learning electronics
fundamentals, how gates work, and how to design hardware(*)?

No, Andrea means you need to learn physics, starting perhaps with basic
quantum mechanics and perhaps with some chemistry thrown in (since you
can't really understand semiconductors without understanding how they're
built, right?). Oh, and manufacturing. And a fundamental understanding
of scanning electron microscopes (for inspection) would be helpful as
well. I think probably a Ph.D. level training in mathematics might be a
good start also, since after all this is the foundation of much of
computing. A while later comes the electronics, and then memory management.

Things like while loops and if statements, and *how to actually write a
program* are, of course, only the eventual outcome of all that good
grounding in "the basics" that you need first.

<big wink>

-Peter
 
Peter Hansen

Andrea said:
I'm not sure that postponing learning what memory
is, what a pointer is and other "bare metal"
problems is a good idea. ...
I think that for a programmer skipping the
understanding of the implementation is just
impossible: if you don't understand how a
computer works you're going to write pretty
silly programs.

I'm curious how you learned to program. What path worked for you, and
do you think it was a wrong approach, or the right one?

In my case, I started with BASIC. Good old BASIC, with no memory
management to worry about, no pointers, no "concrete" details, just FOR
loops and variables and lots of PRINT statements.

A while (some months) later I stumbled across some assembly language and
-- typing it into the computer like a monkey, with no idea what I was
dealing with -- began learning about some of the more concrete aspects
of computers.

This worked very well in my case, and I strongly doubt I would have
stayed interested in an approach that started with talk of memory
addressing, bits and bytes, registers and opcodes and such.

I won't say that I'm certain about any of this, but I have a very strong
suspicion that the *best* first step in learning programming is a
program very much like the following, which I'm pretty sure was mine:

10 FOR A=1 TO 10: PRINT"Peter is great!": END

And no, I don't recall enough BASIC syntax to be sure that's even
correct, but I'm sure you get my point. In one line I learned
(implicitly at first) about variables, control structures and iteration,
output, and probably a few other things.

More importantly by far, *I made the computer do something*. This
should be everyone's first step in a programming course, and it doesn't
take the slightest understanding of what you call "concrete" things...
(though I'd call these things very concrete, and memory management
"esoteric" or something).

If I had been stuck in a course that made me learn about memory
management before I could write a program, I'm pretty sure I'd be doing
something fascinating like selling jeans in a Levis store...

-Peter
 
George Sakkis

Mike Meyer said:
So you're arguing that a CS major should start by learning electronics
fundamentals, how gates work, and how to design hardware(*)? Because
that's what the concrete level *really* is. Start anywhere above that,
and you wind up needing to look both ways.

This may sound like a rhetorical question, but in fact as an Informatics
undergrad I had to take courses in electronics, logic design, signals
and systems, and other courses that are fairly obscure as far as CS is
concerned (http://www2.di.uoa.gr/en/lessons.php). Although these are
certainly useful if one is interested in hardware, architecture, realtime
and embedded systems, etc., I hardly find them relevant (let alone
necessary) for most CS/IT careers. Separation of concerns works pretty
well for most practical purposes.

George
 
Andrea Griffini

How far down do you have to go? What makes bytes of memory, data busses,
and CPUs the right level of abstraction?

They're things that can be IMO genuinely accepted
as "obvious". Even "counting" is not the lowest
level in mathematics... there is the mathematical
philosophy direction. From "counting" you can go
"up" in the construction direction (rationals,
reals, functions, continuity and the whole
analysis area) building on the counting concept,
or you can go "down", asking yourself what
counting really means, what you mean by a
"proof", what a "set" really is.
However "counting" is naturally considered
obvious by our minds, and you can go through your
whole life without the need to look at lower
levels and without getting bitten too badly by
that simplification.

Also, lower than memory and the data bus there is
of course more stuff (in our universe it looks
like there is *always* more stuff no matter
where you look :) ), but I would say that's
more about electronics than computer science.

Why shouldn't first-year CS students study "how a computer works" at the
level of individual logic gates? After all, if you don't know how gates
work, things like address bus decoders, ALUs, register files, and the like
are all just magic (which you claim there is no room for).

It's magic if I'm curious but you can't answer
my questions. It's magic if I have to memorize
because I'm not *allowed* to understand.
It's not magic if I can (and naturally do) just
ignore it because I can accept it. It's not
magic if I don't have questions because it's
"obvious" enough for me.

At some point, you need to draw a line in the sand (so to speak) and say,
"I understand everything down to *here* and can do cool stuff with that
knowledge. Below that, I'm willing to take on faith". I suspect you would
agree that's true, even if we don't agree just where the line should be
drawn. You seem to feel that the level of abstraction exposed by a
language like C is the right level. I'm not convinced you need to go that
far down. I'm certainly not convinced you need to start there.

I think that if you don't understand memory,
addresses and allocation and deallocation, or
(roughly) how a hard disk works and what's
the difference between hard disks and RAM, then
you're going to be a horrible programmer.

There's no way you will remember what is O(n),
what is O(1) and what is O(log(n)) among containers
unless you roughly understand how they work.
If those are magic formulas you'll just forget
them and you'll end up writing code that is
thousands of times slower than necessary.
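
A minimal Python sketch of that difference (an
illustration, not code from this thread; it assumes
CPython's usual list and set implementations and a
reasonably recent Python 3):

    import timeit

    data = list(range(100000))
    data_set = set(data)

    # A list membership test scans the elements one by one: roughly O(n).
    print(timeit.timeit("99999 in data", globals=globals(), number=100))

    # A set membership test goes through a hash table: roughly O(1) on average.
    print(timeit.timeit("99999 in data_set", globals=globals(), number=100))

If you roughly know what a list and a hash table look
like in memory, you don't have to memorize which one
is which.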

If you don't understand *why* "C" needs malloc
then you'll forget about allocating objects.

Andrea
 
Andrea Griffini

So you're arguing that a CS major should start by learning electronics
fundamentals, how gates work, and how to design hardware(*)? Because
that's what the concrete level *really* is. Start anywhere above that,
and you wind up needing to look both ways.

Not really. Long ago I drew a line that starts at
software. I think you can be a reasonable programmer
even without knowing how to design hardware.
I do not think you can be a reasonable programmer if
you have never seen assembler.

Admittedly, at some level the details simply stop mattering. But where
that level is depends on what level you're working on. Writing Python,
I really don't need to understand the behavior of hardware
gates. Writing horizontal microcode, I'm totally f*cked if I don't
understand the behavior of hardware gates.

But you had better understand how, more or less, your
computer or language works, otherwise your code will
be needlessly thousands of times slower and will require
thousands of times more memory than necessary.
Look at a recent thread where someone was asking why
Python was so slow (and the code contained stuff
like "if x in range(low, high):" in an inner loop
that was itself pointless).
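
A sketch of that kind of mistake (my reconstruction,
not the code from that thread; in the Python of that
era the first form built and scanned a whole list on
every iteration, while Python 3's range handles the
test cheaply):

    values = range(1000000)        # placeholder data standing in for the thread's code
    low, high = 100, 200

    hits = 0
    for x in values:
        if x in range(low, high):  # old Pythons: build a list, then scan it, every time
            hits += 1

    hits = 0
    for x in values:
        if low <= x < high:        # the same integer check as two comparisons
            hits += 1
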
In short, you're going to start in the middle.

I've got "bad" news for you. You're always in the
middle :-D. Apparently it looks like this is a
constant in our universe. Even counting (i.e.
1, 2, 3, ...) is not the "start" of math (you
can go at "lower" levels).
Actually I think this is a "nice" property of our
universe, but discussing this would bring the
discussion a bit OT.
Is it really justified to confuse them all
by introducing what are really extraneous details early on?

I simply say that you will not be able to avoid
introducing them. If they're going to write software,
those are not "details" that you'll be able to hide
behind a nice and perfect virtual world (this is much
less true of bus cycles... at least for many
programmers).

But if you need to introduce them, then IMO it is
way better to do it *first*, because that is the
way our brain works.

You cannot build on loosely placed bricks.

You've stated your opinion. Personally, I agree with Abelson, Sussman
and Sussman, whose text "The Structure and Interpretation of Computer
Programs" was the standard text at one of the premiere engineering
schools in the world, and is widely regarded as a classic in the
field: they decided to start with the abstract, and deal with concrete
issues - like assignment(!) later.

Sure. I know that many think that starting from
higher levels is better. However, no explanation is
given of *why* this should work better, and I
haven't even seen objective studies about how this
approach pays off. This is of course not a field
that I've investigated a lot.

What I know is that every single competent programmer
I know (not many... just *EVERY SINGLE ONE*) started
by firmly placing concrete concepts first, and then
moved on to higher abstractions (for example
structured programming, OOP, functional languages ...).

Andrea
 
Andrea Griffini

I'm curious how you learned to program.

An HP RPN calculator, later a TI-57. Later an Apple ][.
With the Apple ][, after about one afternoon spent typing
in a BASIC program from a magazine, I gave up on
BASIC and started with 6502 assembler ("call -151"
was always how I started my computer sessions).

What path worked for you, and do you think it was
a wrong approach, or the right one?

I was fourteen, with no instructor, when the home
computers in my city could be counted on the fingers
of one hand. Having an instructor, I suppose, would
have made me go incredibly faster. Knowing the
English language better at that time would also have
made my life a lot easier.
I think it was anyway the right approach in
terms of "path", if not the minimal-energy approach
in terms of method. Surely a lower-energy one in
the long run compared to those who started with
BASIC and never looked at lower levels.

In my case, I started with BASIC. Good old BASIC, with no memory
management to worry about, no pointers, no "concrete" details, just FOR
loops and variables and lots of PRINT statements.

That's good as an appetizer.

A while (some months) later I stumbled across some assembly language and
-- typing it into the computer like a monkey, with no idea what I was
dealing with -- began learning about some of the more concrete aspects
of computers.

That is IMO a very good starting point. Basically it
was the same one I used.

This worked very well in my case, and I strongly doubt I would have
stayed interested in an approach that started with talk of memory
addressing, bits and bytes, registers and opcodes and such.

I think that getting interested in *programming* is
important... it's like building with LEGOs, but at a
logical level. However, that is just to get interest...
and a few months with BASIC is IMO probably too much.
But after you have a target (making computers do what
you want) then you have to start placing solid bricks,
and that is IMO assembler. Note that I think any
simple assembler is OK... even if you'll end up using
a different processor when working in C, it will be
roughly OK. But I see a difference between those who
never (really) saw assembler and those who did.

I won't say that I'm certain about any of this, but I have a very strong
suspicion that the *best* first step in learning programming is a
program very much like the following, which I'm pretty sure was mine:

10 FOR A=1 TO 10: PRINT"Peter is great!": END

Just as a motivation. After that, *FORGETTING* it
(the FOR, and the NEXT you missed) is IMO perfectly OK.

More importantly by far, *I made the computer do something*.

Yes, I agree. But starting from BASIC and never looking
lower is quite a different idea.

Andrea
 
Steven D'Aprano

At some point, you need to draw a line in the sand (so to speak) and say,
"I understand everything down to *here* and can do cool stuff with that
knowledge. Below that, I'm willing to take on faith". I suspect you would
agree that's true, even if we don't agree just where the line should be
drawn. You seem to feel that the level of abstraction exposed by a
language like C is the right level. I'm not convinced you need to go that
far down. I'm certainly not convinced you need to start there.

The important question is, what are the consequences of that faith when it
is mistaken?

As a Python developer, I probably won't write better code if I understand
how NAND gates work or the quantum mechanics of electrons in solid
crystals. But I will write better code if I understand how Python
implements string concatenation, implicit conversion from ints to longs,
floating point issues, etc.
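
One of those floating point issues shows up as soon as you ask 0.1 for its
digits (a one-line illustration, not from the original post; Pythons of that
era printed this form at the prompt by default, newer ones round the repr):

    >>> print("%.17g" % 0.1)
    0.10000000000000001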

It seems that hardly a day goes by without some newbie writing to the
newsgroup complaining that "Python has a bug" because they have discovered
that the floating point representation of 0.1 in decimal is actually more
like 0.10000000000000001. And let's not forget the number of bugs out
there because developers thought that they didn't need to concern
themselves with the implementation details of memory management.

It makes a difference whether your algorithm runs in constant time,
linear, quadratic, logarithmic or exponential time -- or something even
slower. The implementation details of the language can hide quadratic or
exponential algorithms in something that looks like a linear or constant
algorithm. Premature optimization is a sin... but so is unusably slow
code.
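
The classic Python example of that trap is building a big string piece by
piece (a sketch, not from the post; whether the first version really goes
quadratic depends on the CPython version and its in-place concatenation
optimization):

    def build_slow(parts):
        s = ""
        for p in parts:
            s = s + p           # may copy the whole growing string on each pass
        return s

    def build_fast(parts):
        return "".join(parts)   # sizes the result once: linear in the total length

    parts = ["x"] * 10000
    assert build_slow(parts) == build_fast(parts)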
 
George Sakkis

Andrea Griffini said:
I think that if you don't understand memory,
addresses and allocation and deallocation, or
(roughly) how a hard disk works and what's
the difference between hard disks and RAM, then
you're going to be a horrible programmer.

There's no way you will remember what is O(n),
what is O(1) and what is O(log(n)) among containers
unless you roughly understand how they work.

There's a crucial distinction between these two scenarios though: the
first one has to do with today's hardware and software limitations
while the second one expresses fundamental algorithmic properties that
are independent of technology. In the not-too-far future, the difference
between RAM and hard disks may be less important than it is today; hard
disks may be fast enough for most purposes, or the storage policy may be
mainly decided by the OS, the compiler, the runtime system or a library
instead of the programmer (similarly to memory management being
increasingly based on garbage collection). Just as programmers today
don't have to know or care much about register allocation, future
programmers may not have to care about whether something is stored in
memory or on disk. OTOH, an algorithm or problem with exponential
complexity will
always be intractable for sufficiently large input, no matter how fast
processors become. The bottom line is that there is both fundamental
and contemporary knowledge, and although one needs to be good at both
at any given time, it's useful to distinguish between them.

George
 
Mike Meyer

Andrea Griffini said:
I've got "bad" news for you. You're always in the
middle :-D.

That's what I just said.

I simply say that you will not be able to avoid
introducing them. If they're going to write software,
those are not "details" that you'll be able to hide
behind a nice and perfect virtual world (this is much
less true of bus cycles... at least for many
programmers).

I disagree. If you're going to make competent programmers of them,
they need to know the *cost* of those details, but not necessarily the
actual details themselves. It's enough to know that malloc may lead to
a context switch; you don't need to know how malloc actually works.

But if you need to introduce them, then IMO it is
way better to do it *first*, because that is the
way our brain works.

That's the way *your* brain works. I'd not agree that mine works that
way. Then again, proving either statement is an interesting
proposition.

Sure. I know that many think that starting from
higher levels is better. However, no explanation is
given of *why* this should work better, and I
haven't even seen objective studies about how this
approach pays off. This is of course not a field
that I've investigated a lot.

The explanation has been stated a number of times: because you're
letting them worry about learning how to program, before they worry
about learning how to evaluate the cost of a particular
construct. Especially since the latter depends on implementation
details, which are liable to have to be relearned for every different
platform.

What I know is that every single competent programmer
I know (not many... just *EVERY SINGLE ONE*) started
by firmly placing concrete concepts first, and then
moved on to higher abstractions (for example
structured programming, OOP, functional languages ...).

I don't normally ask how people learned to program, but I will observe
that most of the CS courses I've been involved with put aside concrete
issues - like memory management - until later in the course, when it
was taught as part of an OS internals course. The exception would be
those who were learning programming as part of an engineering (but not
software engineering) curriculum. The least readable code examples
almost uniformly came from the latter group.

<mike
 
Andreas Kostyrka

Yep. Probably. Without a basic understanding of hardware design, one cannot
understand many of today's artifacts: like longer pipelines and what they
mean for the relative performance of different solutions.

Or how does one explain that a "stupid and slow" algorithm can in
effect be faster than a "clever and fast" algorithm, without explaining
how a cache works, and what kinds of caches there are? (I've seen
documented cases where a stupid search was faster because all the hot
data fit into the L1 cache of the CPU, while more clever algorithms
were slower.)

So yes, one needs a basic understanding of hardware, so that one can
understand the design of "assembly". And without knowledge of these,
you get C programmers who do not really understand what their
programs do (be it related to calling sequences, portability of their
code, etc.). Again, you can sometimes see developers posing questions
that suggest they do not know about the low level. (Example from a
current project: storing booleans in a struct bit-field so that it's
faster. Obviously such a person has never seen the code needed to
manipulate bit fields on most architectures.)

A good C programmer needs to know about assembly, libc (stuff like
malloc and friends), and the kernel API.

Now a good Python programmer needs to know at least a bit about the
implementation of Python (be it CPython or Jython).

So yes, one needs to know the underlying layers, if not by heart, then
at least at an "I-know-which-book-to-consult" level.

Or you get perfect abstract designs, that are horrible when
implemented.

Not really. Long ago I drew a line that starts at
software. I think you can be a reasonable programmer
even without knowing how to design hardware.

Well, IMHO one needs to know at least a bit. But one doesn't need to
know it well enough to be able to design hardware oneself. ;)

I do not think you can be a reasonable programmer if
you have never seen assembler.

Yes. But for example, to understand the memory behaviour of Python,
understanding C + malloc + the OS APIs involved is helpful.

But you had better understand how, more or less, your
computer or language works, otherwise your code will
be needlessly thousands of times slower and will require
thousands of times more memory than necessary.
Look at a recent thread where someone was asking why
Python was so slow (and the code contained stuff
like "if x in range(low, high):" in an inner loop
that was itself pointless).

Andreas
 
Claudio Grondi

They're things that can be IMO genuinely accepted
as "obvious". Even "counting" is not the lowest
level in mathematics... there is the mathematical
philosophy direction.

I am personally highly interested in becoming
aware of the very bottom, the foundations
all our knowledge is built on.
Trying to answer questions like:
What are the most basic ideas all others
are derived from in mathematics and
programming?
keeps me busy for hours, days, years ...

Any insights you can share with
me (and/or this group)?

Claudio
 
Peter Maas

Andrea said:
I'm not sure that postponing learning what memory
is, what a pointer is and other "bare metal"
problems is a good idea.

I think Peter is right. Proceeding top-down is the natural way of
learning (first learn about plants, then proceed to cells, molecules,
atoms and elementary particles). If you learn a computer language
you have to know about variables, of course. You have to know that
they are stored in memory. It is even useful to know about variable
addresses and variable contents, but this doesn't mean that you have
to know about memory management. MM is a low-level problem that has
to do with the internals of a computer system and shouldn't be part
of a first *language* course.

The concepts of memory, data and addresses can easily be demonstrated
in high-level languages, including Python, e.g. by using a large string
as a memory model. Proceeding to the bare metal will then follow, driven
by curiosity.
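
A toy version of that idea might look like this (my sketch of the suggestion,
not Peter's code; a bytearray stands in for the "large string" so the cells
can be changed in place):

    MEMORY = bytearray(256)          # 256 numbered cells, all initially 0

    def poke(address, value):
        MEMORY[address] = value      # store a byte at a numbered cell

    def peek(address):
        return MEMORY[address]       # read the byte back from that cell

    poke(16, 42)
    print(peek(16))                  # -> 42: "address" 16 now holds the "data" 42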
 
Tom Anderson

How far down do you have to go? What makes bytes of memory, data busses,
and CPUs the right level of abstraction?

Why shouldn't first-year CS students study "how a computer works" at the
level of individual logic gates? After all, if you don't know how gates
work, things like address bus decoders, ALUs, register files, and the like
are all just magic (which you claim there is no room for).

Digging down a little deeper, a NAND gate is magic if you don't know how a
transistor works or can't do basic circuit analysis. And transistors are
magic until you dig down to the truly magical stuff that's going on with
charge carriers and electric fields inside a semiconductor junction.
That's about where my brain starts to hurt, but it's also where the quantum
mechanics are just getting warmed up.

It's all true - i wouldn't be the shit-hot programmer i am today if i
hadn't done that A-level physics project on semiconductors.

tom
 
Tom Anderson

I won't say that I'm certain about any of this, but I have a very strong
suspicion that the *best* first step in learning programming is a program
very much like the following, which I'm pretty sure was mine:

10 FOR A=1 TO 10: PRINT"Peter is great!": END

10 PRINT "TOM IS ACE"
20 GOTO 10

The first line varies, but i suspect the line "20 GOTO 10" figures
prominently in the early history of a great many programmers.

More importantly by far, *I made the computer do something*.

Bingo. When you realise you can make the computer do things, it
fundamentally changes your relationship with it, and that's the beginning
of thinking like a programmer.

tom
 
Tom Anderson

I think that if you don't understand memory, addresses and allocation
and deallocation, or (roughly) how a hard disk works and what's the
difference between hard disks and RAM, then you're going to be a horrible
programmer.

There's no way you will remember what is O(n), what is O(1) and what is
O(log(n)) among containers unless you roughly understand how they work.
If those are magic formulas you'll just forget them and you'll end up
writing code that is thousands of times slower than necessary.

I don't buy that. I think there's a world of difference between knowing
what something does and how it does it; a black-box view of the memory
system (allocation + GC) is perfectly sufficient as a basis for
programming using it. That black-box view should include some idea of how
long the various operations take, but it's not necessary to understand how
it works, or even how pointers work, to have this.

tom
 
Roy Smith

Andrea Griffini said:
There's no way you will remember what is O(n),
what is O(1) and what is O(log(n)) among containers
unless you roughly understand how they work.

People were thinking about algorithmic complexity before there was random
access memory. Back in the unit record equipment (i.e. punch card) days,
people were working out the best ways to sort and merge decks of punch
cards with the fewest trips through the sorting machine. Likewise for data
stored on magnetic tape.

I can certainly demonstrate algorithmic complexity without ever going
deeper than the level of abstraction exposed by Python. You can learn
enough Python in an afternoon to write a bubble sort and start learning
about O(n^2) behavior without even knowing what a memory address is.
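
For instance, a bubble sort needs nothing below Python's own level of
abstraction (a sketch of mine, not Roy's code):

    def bubble_sort(items):
        # Repeatedly sweep the list, swapping adjacent out-of-order pairs.
        # The nested loops are exactly where the O(n^2) behavior comes from.
        n = len(items)
        for i in range(n):
            for j in range(n - 1 - i):
                if items[j] > items[j + 1]:
                    items[j], items[j + 1] = items[j + 1], items[j]
        return items

    print(bubble_sort([5, 3, 8, 1, 2]))   # -> [1, 2, 3, 5, 8]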

Somebody mentioned that string addition in Python leads to O(n^2) behavior.
Yes it does, but that's more an artifact of how Guido decided he wanted
strings to work than anything fundamental about memory allocation. He
could have taken a different design path and made Python strings more like
STL vectors, in which case string addition would be O(n). Teaching that
"string addition is O(2)" is not only needlessly confusing for somebody
just starting out, it's also wrong (or at best, a specific case valid for
one particular implementation).

And, BTW, I started out programming on a big HP desktop calculator
(http://www.hpmuseum.org/hp9810.htm). Next came BASIC. Then Fortran and
assembler on a PDP-10. Then C, a couple of years later. After that, I've
lost track. Some of the languages that taught me the most were ones that
got very far away from the hardware. NewtonScript was my first
introduction to OOPL, and PostScript showed me that stack languages aren't
just for calculators. Lisp, of course, expanded my mind in ways that only
Lisp can (the same could be said for many things I tried back in those
days). Even quirky HyperCard showed me a different way to think about
programming.

I think it's probably just as important for a CS major to play with those
mind-altering languages as it is to worry about bytes and pointers and
memory locations. But you can't start everywhere, and if you've got to
start someplace, Python lets you concentrate on the real universal
fundamentals of data structures, algorithms, and control flow without
getting bogged down in details.
 
F. Petitjean

On Mon, 13 Jun 2005 07:53:03 -0400, Roy Smith wrote:
Python lets you concentrate on the real universal
fundamentals of data structures, algorithms, and control flow without
getting bogged down in details.

+1 QOTW
 
