Mark Lawrence
I did that, but my fee was a case of beer.
Pay bottle get monkey?
This dove-tailer understands Rapid Application Development ....
http://woodwork.ars-informatica.ca/tool.php?art=dovetail_video
Frank Klausz's three-minute dovetails using a bow saw
I've always thought C was a great language for low-level, bare-metal,
embedded stuff -- but teaching it to first or second year computer
science students is just insane. C has a certain minimalist
orthogonality that I have always found pleasing. [People who smile
wistfully when they think about the PDP-11 instruction word layouts
probably know what I mean.]
But, exposure to C should wait until you have a firm grasp of basic
algorithms and data structures and are proficient in assembly language
for a couple different architectures. Ideally, you should also have
written at least one functioning compiler before learning C as well.
I can't think of a reference, but I seem to recall that
bugs-per-line-of-code is nearly constant; it is not language
dependent. So, unscientifically, the more work you can get done
in a line of code, the fewer bugs you'll have per amount of
work done.
Enter the (One-Liner) Dragon!
Grant Edwards said: I've always thought C was a great language for low-level, bare-metal,
embedded stuff -- but teaching it to first or second year computer
science students is just insane. C has a certain minimalist
orthogonality that I have always found pleasing. [People who smile
wistfully when they think about the PDP-11 instruction word layouts
probably know what I mean.]
I agree with you here, but wasn't there a tie-in between C and the rise
of Unix via universities, or am I barking in the wrong forest?
Wolfgang Keller said: C is just a Kafkaesque mess invented by a sadistic pervert who must
have regularly consumed illegal substances for breakfast.
Grant Edwards said: Ideally, you should also have written at least one functioning
compiler before learning C as well.
It wasn't for me either when I went to college in the late 1970's.
Pascal first, then FORTRAN, then IBM 360 assembler. That was all the
formal language training I had. (I had taught myself BASIC in high
school.)
There are very few mysteries in C. You never have to wonder what the
lifetime of an object is, or be mystified by which of the 7 signatures
of Foo.foo() is going to get called, or just what operation "x + y" is
actually going to perform.
Apart from "What the hell does this piece of code actually do?". It's no
coincidence that C, and Perl which borrows a lot of syntax from C, are
the two champion languages for writing obfuscated code.
And "What does 'implementation-specific undefined behaviour' actually
mean in practice?", another common question when dealing with C.
And most importantly, "how many asterisks do I need, and where do I put
them?" (only half joking).
Since C isn't object oriented, the lifetime of objects in C is, um, any
number you like. "The lifetime of objects in <some language with no
objects> is ONE MILLION YEARS!!!" is as good as any other vacuously true
statement.
Is that even possible in C? If Foo is a struct, and Foo.foo a member, I
don't think C has first-class functions and so Foo.foo can't be callable.
But if I'm wrong, and it is callable, then surely with no arguments there
can only be one signature that Foo.foo() could match, even if C supported
generic functions, which I don't believe it does.
There are very few mysteries in C. You never have to wonder what the
lifetime of an object is, or be mystified by which of the 7 signatures
of Foo.foo() is going to get called, or just what operation "x + y" is
actually going to perform.
If you maim yourself with a razor-sharp chisel, do you blame the chisel
for being a bad tool?
Yes, you do. Lifetimes are hard, because you need to malloc a lot, and
there is no defined lifetime for pointers -- they could last for just
the lifetime of a stack frame, or until the end of the program, or
anywhere in between, and it's impossible to know for sure, and if you
get it wrong your program crashes. So there are all these conventions
you have to come up with, like "borrowing" and "owning", but they
aren't compiler-enforced, so you still have to figure it out, and you
will get it wrong. Successors like C++ mitigate these issues with
destructors (allowing heap-allocated stuff to be tied to the lifetime
of a stack frame), and smart pointers and so on.
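To make the "owning"/"borrowing" conventions concrete, a minimal sketch
(the function names and comments are invented for illustration; the
compiler enforces none of it):

    #include <stdlib.h>
    #include <string.h>

    /* "Owning": the caller receives the buffer and must free() it. */
    char *make_copy(const char *s)
    {
        char *copy = malloc(strlen(s) + 1);
        if (copy)
            strcpy(copy, s);
        return copy;   /* ownership transfers to the caller */
    }

    /* "Borrowing": only reads the buffer; must not free() it or
       keep the pointer around after the call. */
    size_t measure(const char *s)
    {
        return strlen(s);
    }

Nothing but the comments records which function frees what; get it wrong
and you have a leak or a double free, and the compiler says nothing.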
Wrong. A pointer is a scalar value, usually some kind of integer, and
its lifetime is the same as any other scalar.
Heap memory's lifetime
is also very simple: it lasts until freed.
The duration of a pointer's validity is far more interesting, and that
is why it is the primary meaning of the term "pointer lifetime". Also,
it's obviously what I meant.
Sometimes simple things are hard to use correctly. I only said it was
hard, not complicated.
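A short sketch of that distinction between a pointer's lifetime and its
validity (deliberately broken code, for illustration only):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        int *p = malloc(sizeof *p);
        if (!p)
            return 1;
        *p = 123;
        free(p);

        /* p itself, as a scalar, still exists and still holds the same
           bit pattern -- its lifetime is that of main(). But the heap
           object it pointed to is gone, so this dereference is
           undefined behavior: */
        printf("%d\n", *p);   /* use-after-free */
        return 0;
    }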
And "What does 'implementation-specific undefined behaviour' actually
mean in practice?", another common question when dealing with C.
Since C isn't object oriented, the lifetime of objects in C is, um, any
number you like. "The lifetime of objects in <some language with no
objects> is ONE MILLION YEARS!!!" is as good as any other vacuously true
statement.
Is that even possible in C? If Foo is a struct, and Foo.foo a member, I
don't think C has first-class functions and so Foo.foo can't be callable.
With no operator overloading, that one at least is correct.
Only asked by people who haven't had it explained. There's "undefined
behavior", and there's "implementation-specific behavior", but it is
impossible to have "implementation-specific undefined behavior".
And, the definitions are simple to understand: "undefined behavior"
means that if your program invokes it, there is no definition of what
will happen. This is buggy code.
"Implementation-specific" behavior means that the standard requires the
implementation to do some well-defined thing, but the standard does not
define exactly what it must be. You can go look up what your
implementation will do in its documentation (the standard requires that
it be documented), but you can't assume the same thing will happen in
another implementation. This is non-portable code.
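(The standard's own term for the latter is "implementation-defined
behavior".) A minimal sketch of both kinds, using two classic examples:

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        /* Implementation-defined: whether a plain char is signed or
           unsigned is up to the implementation, which must document
           its choice. CHAR_MIN is 0 on some platforms and negative on
           others -- non-portable, but well-defined on each. */
        printf("CHAR_MIN = %d\n", CHAR_MIN);

        /* Undefined: signed integer overflow. The standard says
           nothing about what happens next -- this is buggy code. */
        int n = INT_MAX;
        n = n + 1;
        printf("%d\n", n);
        return 0;
    }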
It's a very rare language indeed that has no undefined or
implementation-specific behaviors.
Python gets to "cheat" by having one
reference implementation. Every time you've had to go try something out
in the Python interpreter because the documentation didn't provide the
details you needed, that WAS implementation-specific behavior.
The implication that only an "object oriented" language could have a
concept of object lifetimes is false.
Of course that's valid C. It's true that C doesn't have first-class
functions, but it supports invoking functions through pointers, and you
can store function pointers in data members, pass them as arguments, and
return them from other functions, so Foo.foo can certainly be callable.
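A minimal sketch of what that looks like (hypothetical names, chosen to
match the Foo.foo() spelling above):

    #include <stdio.h>

    struct Foo {
        void (*foo)(void);   /* a function-pointer member */
    };

    static void greet(void)
    {
        puts("hello from greet()");
    }

    int main(void)
    {
        struct Foo Foo = { greet };   /* tag and variable may share a name */
        Foo.foo();                    /* calls greet() through the pointer */
        return 0;
    }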
I thought APL would beat both of them, though you're right that the
International Obfuscated Python Code Contest would be a quite different
beast. But maybe it'd be just as viable... a competent programmer can
write unreadable code in any language.
You mean like mutating locals()? The only difference is that there are a
lot more implementations of C than there are of Python (especially
popular and well-used implementations). There are plenty of things you
shouldn't do in Python, but instead of calling them
"implementation-specific undefined behaviour", we call them "consenting
adults" and "shooting yourself in the foot".
The one differentiation that I don't like is between the . and ->
operators. The distinction feels like syntactic salt. There's no context
when both are valid, save in C++ where you can create a "pointer-like
object" that implements the -> operator (and has the . operator for its
own members).
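For anyone who hasn't met the pair, a minimal sketch of the distinction
(hypothetical struct, nothing more):

    #include <stdio.h>

    struct point { int x, y; };

    int main(void)
    {
        struct point pt = { 1, 2 };
        struct point *pp = &pt;

        /* '.' for a struct value, '->' for a pointer to one;
           pp->x is defined as shorthand for (*pp).x */
        printf("%d %d\n", pt.x, pp->x);      /* 1 1 */
        printf("%d %d\n", (*pp).y, pp->y);   /* 2 2 */
        return 0;
    }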
Lifetime still matters. The difference between automatic and static
variables is lifetime - you come back into this function and the same
value is there waiting for you. Call it "values" or "things" instead of
"objects" if it makes you feel better, but the consideration is
identical. (And in C++, it becomes critical, with object destructors
being used to release resources. So you need to know.)
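A minimal sketch of that automatic/static difference (hypothetical
function names):

    #include <stdio.h>

    void count_calls(void)
    {
        static int calls = 0;   /* static: lives for the whole program */
        int bump = 1;           /* automatic: created anew on every call */

        calls += bump;
        printf("called %d time(s)\n", calls);
    }

    int main(void)
    {
        count_calls();   /* prints: called 1 time(s) */
        count_calls();   /* prints: called 2 time(s) */
        return 0;
    }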
Well, okay. In C you can't have Foo.foo().