In the matter of Herb Schildt: the question of a "bad" book


Walter Banks

Mark said:
As for memory: forget garbage collection. In fact, forget dynamic
memory. These systems are expected to know what they're using and when.
Not vaguely, but precisely. Garbage collection has two major counts
against it in this sphere: the GC itself upsets timing guarantees (which
the authorities still insist on developers specifying*) and the fact that
it implies you haven't cared enough about the life and scope of the
associated data structure(s).

This is essentially true for all embedded systems. On standalone
systems, memory requirements must be known in advance: the system
doesn't have a pool of memory to draw on, and when memory is not
available there can be serious consequences. Embedded compilers
spend a lot of effort at compile time allocating RAM-based variables
based on program structure and variable scoping.
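
To illustrate the style Walter describes, here is a minimal sketch; the
module and its names (SAMPLE_COUNT, record_sample) are hypothetical,
not from any real system:

#define SAMPLE_COUNT 128

/* All storage is fixed at compile time: no malloc, no GC, and the
   linker map shows exactly how much RAM the program uses. */
static int samples[SAMPLE_COUNT];   /* whole buffer allocated up front */
static unsigned sample_head;        /* index of the next free slot */

int record_sample(int value)
{
    if (sample_head == SAMPLE_COUNT)
        return -1;                  /* buffer full: fail, don't grow */
    samples[sample_head++] = value;
    return 0;
}

If the program links, the RAM budget is met; nothing allocates at run
time to fail or to disturb the timing.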

Walter Banks
 

Keith Thompson

Walter Banks said:
This is essentially true for all embedded systems. On standalone
systems, memory requirements must be known in advance: the system
doesn't have a pool of memory to draw on, and when memory is not
available there can be serious consequences. Embedded compilers
spend a lot of effort at compile time allocating RAM-based variables
based on program structure and variable scoping.

For certain values of "embedded".

The stuff I work on (mobile phone software) runs under an operating
system, and uses dynamic memory all over the place. Then again,
some of these phones are more powerful than desktop computers from
N years ago (don't ask me what N is, but it's not huge).

For other kinds of embedded systems, I'm sure you're right.
 

BruceS

Still fly by wire. Pitot tubes?

As someone programming in the aviation industry, I have to wonder: do
you know what a pitot tube is? Also, does the term DO-200A have any
meaning to you?

And what about C# and Java? While I'm aware of them being used for
noncritical applications that generate data for aviation (which data
is then carefully verified), I'd be interested to hear of any on-board
application using them. Do you know of any, Walter?
 

Colonel Harlan Sanders

I'm surprised you found that information, but that's why rednecks like
the Internet. They can sound intelligent (rarely) by simple search,
cut and paste that any Walmart greeter can learn. But hey, thanks,
asshole.

And note that French guys, with job security and three months of
vacation, were able to do it. Americans would not be able to.

The only person who has, has been bacarisse. I haven't called him a
**** or a Nazi. But I'll call you one to your face the next time you
are in Hong Kong or I am in East Shithole.

Not sure which is more fatuous, the pompous bore or the wannabe tough guy.
The first doesn't impress anyone, the second doesn't frighten anyone.
But at least when you're using four letter words you waste less
bandwidth.

Anyway, Niggler, while I find your namecalling pathetic and foolish,
you apparently take wordplay directed at you more personally. You seem
to be trying to provoke with gutter speech now, as with your vile attacks
on Herbert, so I guess you are just looking for a flame war.

But of course you just launched into calling me a redneck and a ****
to divert from the actual question rather than reassessing your
fondly held belief that C causes airplane crashes (as well as
dandruff, BO, and global warming). You'll no doubt keep rambling on
about engineers and bureaucrats; bullying and mother's boys, causing
all the world's problems because they're forced to use C, same as
ever.
 

spinoza1111

My wife's research area over the past decade has been safety-critical
systems, specifically focused on commercial aircraft.  As part of this,
she has worked closely with the CAA (UK), JAA (Europe) and FAA (US).
There has been a great deal of attention given to ensuring safety as the
industry attempts to increase computer support, particularly as they've
required much more complex controls to keep aircraft in the air - both
Boeing and Airbus's newer aircraft are an order of magnitude more
complex to fly than those designed in the '60s and '70s.

C isn't used for the most critical systems.  Ada (particularly SPARK
Ada) is quite common now as it provides some of the guarantees that the
airlines and regulatory bodies demand.

As for memory: forget garbage collection.  In fact, forget dynamic
memory.  These systems are expected to know what they're using and when.
Not vaguely, but precisely.  Garbage collection has two major counts
against it in this sphere: the GC itself upsets timing guarantees (which
the authorities still insist on developers specifying*) and the fact that
it implies you haven't cared enough about the life and scope of the
associated data structure(s).

While GC might upset timing, it's possible to care too much. All the
programmer needs (IMO) is to specify at what line of code a variable
starts to be meaningful and at what line of code it ceases to be
meaningful.

This rules out older C where the variable is specified at the start of
the function.

Question is then: is "line of code" (syntactically a statement)
fine-grained enough? In

a+(b = malloc(100), *b = 0, *b)

which I don't know will even compile, much less work, the intention of
the bozo who wrote this code is clear enough (Schildt-clear): it is to
allocate b after a is retrieved.
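
For what it's worth, the fragment does compile once b has a
declaration; a minimal self-contained version, with the null check the
original omits, might read:

#include <stdlib.h>

int main(void)
{
    int a = 42;
    char *b;
    /* The comma operator sequences its own operands left to right,
       but the two operands of "+" are unsequenced, so the compiler
       is free to evaluate the malloc side before reading a. */
    int r = a + (b = malloc(100), b != NULL ? (*b = 0, *b) : 0);
    free(b);                        /* free(NULL) is harmless */
    return r == 42 ? 0 : 1;
}

Note the wrinkle in the comment: C does not specify the order in which
the operands of + are evaluated, so nothing actually guarantees that a
is retrieved before the allocation happens.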

You might need something more fine-grained for avionics, but this is
provided in C Sharp.

In .Net when you "release" something it doesn't necessarily go away.
Wouldn't this be a Good Thing in safety-critical systems?

Can't the garbage collector be proven, formally, not to ever get into
a state where it's holding things up?

I don't know, would like your input (instead of the usual crap like
we're getting from Colonel and Seebs).
 

spinoza1111

As someone programming in the aviation industry, I have to wonder: do
you know what a pitot tube is?  Also, does the term DO-200A have any
meaning to you?

Yes. No. I appreciate your input.
 

spinoza1111

www.astree.ens.fr/ : ASTRÉE stands for Analyseur statique de logiciels
(static analyser of software)
I'm surprised you found that information, but that's why rednecks like
the Internet. They can sound intelligent (rarely) by simple search,
cut and paste that any Walmart greeter can learn. But hey, thanks,
asshole.
And note that French guys, with job security and three months of
vacation, were able to do it. Americans would not be able to.
The only person who has, has been bacarisse. I haven't called him a
**** or a Nazi. But I'll call you one to your face the next time you
are in Hong Kong or I am in East Shithole.

Not sure which is more fatuous, the pompous bore or the wannabe tough guy.
The first doesn't impress anyone, the second doesn't frighten anyone.
But at least when you're using four letter words you waste less
bandwidth.

Anyway, Niggler, while I find your namecalling pathetic and foolish,
you apparently take wordplay directed at you more personally. You seem
to be trying to provoke with gutter speech now, as with your vile attacks
on Herbert, so I guess you are just looking for a flame war.  

But of course you just launched into calling me a redneck and a ****
to divert from the actual question rather than reassessing your
fondly held belief that C causes airplane crashes (as well as
dandruff, BO, and global warming). You'll no doubt keep rambling on
about engineers and bureaucrats; bullying and mother's boys, causing
all the world's problems because they're forced to use C, same as
ever.

Well, note that as soon as we shrink it to fit your brain, it all
comes out rather like vulture puke.

I never said that C would cause airplane crashes. I said that stupid
clowns showing how they can "control" technology can kill people, and
my source was William Langewiesche's book Fly by Wire and his
(possibly biased) description of an A330 crash with fatalities in
which the pilot was overloaded (overloaded himself) with tasks. I got
the aircraft number wrong and was corrected by some people with
interesting technical expertise.

Whereas Sullenberger shed tasks and allowed his airplane and its
software to handle the landing: Langewiesche:

"Suffice it to say that if Sullenberger had done nothing after the
loss of thrust the airplane would have smoothly slowed until reaching
a certain angle with the airflow, at which point it would have lowered
its nose to keep the wings from stalling, and would have done this
even if for some reason Sullenberger had resisted. Of course,
Sullenberger did no such thing. While in the initial left turn he
lowered the nose well in advance of the need for any such
“protection,” and went to the best gliding speed—a value which the
airplane calculated all by itself, and presented to him as a green dot
on the speed scale of his primary flight display. During the pitch
changes to achieve that speed, a yellow “trend” arrow appeared on the
scale, pointing up or down from the current speed with predictions of
speed 10 seconds into the future—an enormous aid in settling onto the
green dot with the minimum of oscillation. Suffice it also to say that
during the glide Sullenberger received no tactile feedback from his
side-stick; that whenever he left the side-stick alone in the neutral
position the airplane held its nose steadily at whatever pitch he had
last selected; that the airplane’s pitch trim was automatic, and
perfect at all times; that all yaw was damped out; that the rudder was
automatically coordinated with the rolls; that having banked to any
angle up to 33 degrees, if Sullenberger left the side-stick alone, the
airplane stayed precisely at the chosen angle; and that, likewise,
having returned to a straight-ahead wings-level position, the airplane
stayed there too, without the slightest drift or wobble. Thank you,
Betsy."

Sullenberger was more like a Java programmer than a C programmer.

Note that there are two different levels or styles of reading here.
The people more knowledgeable than I about aviation, like Walter,
correct me on details, whereas I read the Vanity Fair article by
Langewiesche and the subsequent book, with its more complex sentence
structure. The result? I get the general idea right (I don't think you
necessarily need fine-grained control) and the details wrong.

I try to unify the two approaches precisely because the lack of hard,
technical input to the policy makers who use the fancy syntax is a
real problem. Airline pilots in the US don't like fly by wire and can,
using their knowledge, overwhelm the channel with all sorts of
scenarios in which the software could go wrong. Furthermore, we know
that fly by wire creates its own issues including the nonzero
probability of program bugs. There are only a few people who can master
the technology without losing their soul in the sense of remembering
that people matter more; most skilled programmers I've known are
soulless twats.

I showed how Kenny is right in saying that if we obey the regs on the
matter of "staying on topic", we wind up debating language law only.
Whereas given an ecological perspective, we are often surprised to
learn of wormholes between "unrelated" topics, as in my example of
underarm deodorant and the South Pole.

Too many programmers confuse "staying on topic" with never questioning
what their skills are used for. For example, brokerage programmers in
New York simply focus on "making the user happy" and then it's Knicks
and chicks for them. The user doesn't care, in turn, that his
derivative's value is dependent on a second derivative, the value of
which is dependent, through a long chain, on the value of the first.

But programmers "stay on topic" for a good reason: they are not,
typically, union members with protection against sudden unemployment
at-will, nor do they have the status or clout of lawyers and doctors.
If the brokerage programmer questions a cyclical derivative, he's "off
topic" and will be at least bitch-slapped.

For this reason, most programmers think like you...they just chain
together ideas sloppily except when thinking in code, where the
apparatus forces complexity on them (but in such a way that they're
unable to think precisely in their own mathematical notation like
Dijkstra, or in my example of thinking of a function def as a tuple).
When they see writing that chains them using complex sentence
structure, they can retain but a fraction of what they've read, so
of course, when they digest it and regurgitate it like vultures
feeding on a corpse, it's going to sound nasty.
 

spinoza1111

Langewiesche contrasts an Air France pilot who killed his passengers
but survived at an air show because he wanted "control", and he
overrode the 380. Clown reminds me of some programmers.

Whereas Sullenberger seems to my layperson's eye to have heroically
contented himself with deciding to land in the river because he
couldn't make Teterboro and pointing the plane in the right direction.

Langewiesche's point: being a pilot has become transformed by software
and fly by wire into being nothing more than a glorified bus driver,
and the rapidly declining salaries of pilots reflect this harsh truth.
For the same reason, programmers should get used to the idea of using
Java or C Sharp, with garbage collection, and not C. Programming
should be "boring", not fun.

We still don't have the full story on the mid-Atlantic crash of 2009
which was another 380. But I really, really hope that the fly by wire
engineers do NOT use the C programming language. I'd hope it's a
special purpose language running on an OS with garbage collection
tuned for real-time avionics. And hopefully, the engineers work in
Europe and so can't be laid off at-will, and their skills with
them...including the guy writing the compiler, since the Wall Street
boys don't even know what a "compiler" is.

Anybody here work in this type of software and have a contribution to
make?

Here is Langewiesche's description of what Sullenberger did NOT do:

"Suffice it to say that if Sullenberger had done nothing after the
loss of thrust the airplane would have smoothly slowed until reaching
a certain angle with the airflow, at which point it would have lowered
its nose to keep the wings from stalling, and would have done this
even if for some reason Sullenberger had resisted. Of course,
Sullenberger did no such thing. While in the initial left turn he
lowered the nose well in advance of the need for any such
“protection,” and went to the best gliding speed—a value which the
airplane calculated all by itself, and presented to him as a green dot
on the speed scale of his primary flight display. During the pitch
changes to achieve that speed, a yellow “trend” arrow appeared on the
scale, pointing up or down from the current speed with predictions of
speed 10 seconds into the future—an enormous aid in settling onto the
green dot with the minimum of oscillation. Suffice it also to say that
during the glide Sullenberger received no tactile feedback from his
side-stick; that whenever he left the side-stick alone in the neutral
position the airplane held its nose steadily at whatever pitch he had
last selected; that the airplane’s pitch trim was automatic, and
perfect at all times; that all yaw was damped out; that the rudder was
automatically coordinated with the rolls; that having banked to any
angle up to 33 degrees, if Sullenberger left the side-stick alone, the
airplane stayed precisely at the chosen angle; and that, likewise,
having returned to a straight-ahead wings-level position, the airplane
stayed there too, without the slightest drift or wobble. Thank you,
Betsy."

Reminds me more of a Java programmer than a C programmer. But I
welcome comments from people with aviation expertise.

Are the C regs here like pilots who overload themselves with tasks and
auger into the trees by insisting on shibboleths?

But: I need to make a distinction. I don't think (to use memory
allocation as an example) we should go back to requiring all variables
to be allocated, apart from malloc, at the function header. C Sharp
allocation is fine-grained: at the statement level for references to
objects, at the level of the expression for objects themselves. But
you don't know whether the storage is gone when you dispose.

Where do you draw the line? Sullenberger stopped responding to the
tower but didn't make a dead stick landing either.
 

Nick Keighley

Walter Banks wrote:

This is essentially true for all embedded systems.

"Embedded" covers a lot of ground. I've worked on standalone systems (no
true OS) that had dynamic memory allocation (though not malloc-based).
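
One common non-malloc scheme on such systems is a fixed-block pool
carved from a static array, so allocation and release run in constant
time and the total footprint is known at link time. A minimal sketch
(the names pool_init, pool_alloc, pool_free are illustrative, not from
any particular system):

#define BLOCK_WORDS 8    /* each block holds 8 pointers' worth of data */
#define BLOCK_COUNT 64

static void *pool[BLOCK_COUNT][BLOCK_WORDS]; /* all memory, statically sized */
static void *free_list;

void pool_init(void)
{
    int i;
    for (i = 0; i < BLOCK_COUNT; i++) {
        pool[i][0] = free_list;     /* thread each block onto the free list */
        free_list = pool[i];
    }
}

void *pool_alloc(void)              /* O(1), no fragmentation */
{
    void **blk = free_list;
    if (blk != 0)
        free_list = blk[0];         /* pop the head block */
    return blk;                     /* 0 when the pool is exhausted */
}

void pool_free(void *p)             /* O(1) */
{
    ((void **)p)[0] = free_list;    /* push the block back */
    free_list = p;
}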
 

Nick Keighley

While GC might upset timing, it's possible to care too much. All the
programmer needs (IMO) is to specify at what line of code a variable
starts to be meaningful and at what line of code it ceases to be
meaningful.

they are worried about the time it takes to run the GC. There are
things out there in the real world where how long an operation takes
(or an upper bound on how long it takes) must be known. These are called
real-time systems. It isn't how many lines of code the variable
remains in scope for that matters but how many microseconds (or
whatever). If the deadlines aren't met then Bad Things Happen.
Aeroplanes fall from the sky, reactors scram, and your DVD won't play.
This rules out older C where the variable is specified at the start of
the function.

If you're worried about this, your function is probably too big.

Question is then: is "line of code" (syntactically a statement)
fine-grained enough? In

a+(b = malloc(100), *b = 0, *b) [...]
You might need something more fine-grained for avionics, but this is
provided in C Sharp.

how so?
In .Net when you "release" something it doesn't necessarily go away.
Wouldn't this be a Good Thing in safety-critical systems?

not if it causes you to run out of memory...
Can't the garbage collector be proven, formally, not to ever get into
a state where it's holding things up?

dunno. Generational is a lot better than mark-and-sweep but tends to
use more memory and *still* might not be Real Time enough for some.

<snip>
 

Mark

spinoza1111 said:
While GC might upset timing, it's possible to care too much. All the
programmer needs (IMO) is to specify at what line of code a variable
starts to be meaningful and at what line of code it ceases to be
meaningful.

This rules out older C where the variable is specified at the start of
the function.

To be fair, it's at the start of the block rather than the function.
Question is then: is "line of code" (syntactically a statement)
fine-grained enough? In

a+(b = malloc(100), *b = 0, *b)

which I don't know will even compile, much less work, the intention of
the bozo who wrote this code is clear enough (Schildt-clear): it is to
allocate b after a is retrieved.

You might need something more fine-grained for avionics, but this is
provided in C Sharp.

In .Net when you "release" something it doesn't necessarily go away.
Wouldn't this be a Good Thing in safety-critical systems?

Can't the garbage collector be proven, formally, not to ever get into
a state where it's holding things up?

I don't know, would like your input

This isn't really a case of what's possible, but what's permitted. The
regulatory authorities are deeply conservative which, given the
consequences of getting it wrong, is understandable.

As a result, the processes and standards are still fairly rigid. This
is starting to be challenged, but it's baby-steps. In general, the
high cost of aircraft (and the cost of payouts when things go wrong)
mean that making the software engineers use nasty, rigid methodologies
is not a huge increase in cost.

The courts aren't helping, though; the consequence of these very high
liability payouts is that no-one wishes to be seen to take the
decision which leads to the loss of life. In the case of the UK courts,
the law has changed now so that it's much easier to be jailed for such
breaches.

As a result, MISRA C has become very popular in the US automotive
industry and, later, in other industries: no-one in the US wanted
responsibility for setting standards for safety-critical/high-integrity
systems, so (formally or informally) borrowing a different standard
looks more attractive...

In terms of your specific question about a proof for the GC, the level
of analysis involved (and the guarantees required by the authorities)
makes it impractical.

Of course, questions are being asked (as you did) whether you can "care
too much"; the principles used (dating back over 30 years) are being
examined to see if people can feel confident about relaxing these
guarantees. The truth is, though, people mainly trust "flight hours".
If you can twist arms to certify a new system[1] *and* it stays in the air
for long enough (ideally, lots of aircraft, lots of time[2]), you've won
changes by default.

1 (Allegedly) both Boeing (for the 777) and Airbus (A380) managed to
step outside some (but not all) of the traditional constraints...
2 The obvious example where this was an issue is Concorde; amazing
aircraft, but once one was lost, the flight hours per hull-loss
statistic looked very bad. Lose a 747 or two, and that value
remains strong.

Of course, if you have accidents (look at Airbus and their early
fly-by-wire issues) you *know* fingers will be pointed at any deviation
from traditional practice.

(And, one corollary of the need to prove the technology is sound is the
desperation with which airlines seek to find human error in terms of
pilot-error or maintenance error)
 

Colonel Harlan Sanders

Well, note that as soon as we shrink it to fit your brain, it all
comes out rather like vulture puke.

Excellent description of your post.

I never said that C would cause airplane crashes. I said that stupid
clowns showing how they can "control" technology can kill people,

Of course they might. But you have not the slightest evidence that
would point to who or what is responsible for any particular incident.
So you just make up fantastic scenarios, casting them with your usual
suspects.

For instance, this nutty conclusion is just pulled out of thin air,
only related by proximity to the text you quoted at length:
Sullenberger was more like a Java programmer than a C programmer.

Note that there are two different levels or styles of reading here.
The people more knowledgeable than I about aviation, like Walter,
correct me on details, whereas I read the Vanity Fair article by
Langewiesche and the subsequent book, with its more complex sentence
structure. The result? I get the general idea right (don't think you
necessarily need fine grained control) and the details wrong.

Hilarious. You're right even when you're wrong, because you can read
Vanity Fair.

.... other bollocks omitted.

Regardless of the actual subject, you just proceed with your usual
polemics, looking for a hook to hang an attack on C/ Seebach/
Heathfield/ Amerikka/ etc, etc.
 

spinoza1111

Excellent description of your post.

yeah, I love arguing with dickwads too stupid to think of their own
insults, who, like apes, pick up mine and throw them back.
 

spinoza1111

To be fair, it's at the start of the block rather than the function.

The block comprising the entire function. Modern C (and C Sharp) allow
you to declare variables at the start of what I would prefer to call
the block, which in both languages is "a list of zero or more
statements surrounded by curly braces". Older C, older structured
Basic, and PL/I (I think) required declarations to be at the start of
the "function".
This isn't really a case of what's possible, but what's permitted.  The
regulatory authorities are deeply conservative which, given the
consequences of getting it wrong, is understandable.

As a result, the processes and standards are still fairly rigid.  This
is starting to be challenged, but it's baby-steps.  In general, the
high cost of aircraft (and the cost of payouts when things go wrong)
mean that making the software engineers use nasty, rigid methodologies
is not a huge increase in cost.

The courts aren't helping, though; the consequence of these very high
liability payouts is that no-one wishes to be seen to take the
decision which leads to the loss of life.  In the case of the UK courts,
the law has changed now so that it's much easier to be jailed for such
breaches.

As a result, MISRA C has become very popular in the US automotive
industry and, later, in other industries: no-one in the US wanted
responsibility for setting standards for safety-critical/high-integrity
systems, so (formally or informally) borrowing a different standard
looks more attractive...

In terms of your specific question about a proof for the GC, the level
of analysis involved (and the guarantees required by the authorities)
makes it impractical.
Of course, questions are being asked (as you did) whether you can "care
too much"; the principles used (dating back over 30 years) are being
examined to see if people can feel confident about relaxing these
guarantees.  The truth is, though, people mainly trust "flight hours".
If you can twist arms to certify a new system[1] *and* it stays in the air
for long enough (ideally, lots of aircraft, lots of time[2]), you've won
changes by default.

Sullenberger cared a lot, but somehow not enough to grab the stick.
Whereas if you can't trust a garbage collector, you're going to spend
valuable time on manual allocation and free.

1 (Allegedly) both Boeing (for the 777) and Airbus (A380) managed to
  step outside some (but not all) of the traditional constraints...
2 The obvious example where this was an issue is Concorde; amazing
  aircraft, but once one was lost, the flight hours per hull-loss
  statistic looked very bad.  Lose a 747 or two, and that value
  remains strong.

The Space Shuttle was operating, according to the official report,
outside of its intended parameters when it dropped its first piece of
insulation foam. Nothing was done because the problem was a "known
bug". It was assumed that because no chunk of foam had the mass to
damage the wing in its path, things would continue the same in an
inverse of the Gambler's Fallacy that also seems fallacious.

You need meta-rules to avoid these kinds of decisions being taken,
don't you? In programming that would IMO rule out language like "my
program works but it has a bug".

Of course, if you have accidents (look at Airbus and their early
fly-by-wire issues) you *know* fingers will be pointed at any deviation
from traditional practice.

According to Langewiesche, the pilot who crashed was traditional in
the sense that he overtasked himself relative to the FBW capabilities
of the Airbus.
 

spinoza1111

Excellent description of your post.


Of course they might. But you have not the slightest evidence that
would point to who or what is responsible for any particular incident.
So you just make up fantastic scenarios, casting them with your usual
suspects.

For instance, this nutty conclusion is just pulled out of thin air,
only related by proximity to the text you quoted at length:


Hilarious. You're right even when you're wrong, because you can read
Vanity Fair.

... other bollocks omitted.

Regardless of the actual subject, you just proceed with your usual
polemics, looking for a hook to hang an attack on C/ Seebach/
Heathfield/ Amerikka/ etc, etc.

...reminds me of a recent National Geographic article about apes that
were discovered making tools. A female ape does so, and a "leader male" ape
screams at her.

Reminds me of some fat tech screaming at a female programmer for
developing a shell script and not using his garbage.

Harlan is one of those thugs who can't stand it if someone cracks a book
or writes a sentence.

This constant ape-like bullying behavior in this newsgroup HAS GOT TO
STOP. People HAVE GOT TO STOP, like apes, using markers and
shibboleths and start thinking. People have got to stop jumping on
IUT students asking questions and mainland Chinese posting here.
have got to stop jumping on Navia for coding what he thinks are
innovative solutions that they can't code. They've got to stop jumping
on Kenny for having a sense of humor.

Check out the answers when I asked questions about avionics. Even my
enemies want to show off their knowledge, and this is a good thing.

If you see something that looks wrong, ask a question. For example,
Seebach should have called Schildt on the phone.
 

Colonel Harlan Sanders

yeah, I love arguing with dickwads too stupid to think of their own
insults, who, like apes, pick up mine and throw them back.

I wasn't insulting you, I was agreeing with you. Even you can't be
wrong ALL the time.

And still failing to process any facts that contradict your claims.
Much simpler to just spew abuse, eh?
 

Nick Keighley

 The truth is, though, people mainly trust "flight hours".
If you can twist arms to certify a new system[1] *and* it stays in the air
for long enough (ideally, lots of aircraft, lots of time[2]), you've won
changes by default.

2 The obvious example where this was an issue is Concorde; amazing
  aircraft, but once one was lost, the flight hours per hull-loss
  statistic looked very bad.  Lose a 747 or two, and that value
  remains strong.

If I remember correctly it went from being one of the safest aircraft
flying to one of the least safe.
 

Nick Keighley

The block comprising the entire function.

no. You are mistaken. You don't see much code that uses it but it is
allowed.

[the following isn't intended to make sense as code but to illustrate
a point]

int pippo (int a[], int b[], int n, int t)
{
    int i;
    for (i = 0; i < n; i++)
    {
        if (a[i] == t)
        {
            int j; /* declaration in inner block */
            for (j = 0; j < n; j++)
                if (b[j] == t)
                    return i + j;
        }
    }

    return -1;
}

this is allowed in older Cs. This includes C89 and K&R C; even
primordial Algol-60 had the equivalent.

OTOH they don't allow

int pippo (int a[], int b[], int n, int t)
{
    int i;
    for (i = 0; i < n; i++)
    {
        if (a[i] == t)
        {
            i = i + 1;
            int j; /* ERROR! declaration after statement */
            for (j = 0; j < n; j++)
                if (b[j] == t)
                    return i + j;
        }
    }

    return -1;
}

C99, of course, does allow the above. Your error is reasonable, as code
with declarations in inner blocks is quite unusual (in my experience).
Modern C (and C Sharp) allow
you to declare variables at the start of what I would prefer to call
the block, which in both languages is "a list of zero or more
statements surrounded by curly braces". Older C, older structured
Basic, and PL/I (I think) required declarations to be at the start of
the "function".

The idea is quite old. Declarations in inner blocks go right back to
Algol-60 (the 60 indicates the year), which bequeathed it to its
children: Coral-66 (and JOVIAL, I guess), Pascal, Ada, etc.

<snip>

--

"ALGOL 60 was a language so far ahead of its time that it
was not only an improvement on its predecessors but also
on nearly all its successors".
--C.A.R. Hoare
 

Malcolm McLean

If I remember correctly it [Concorde] went from being one of the safest
aircraft flying to one of the least safe.

That's because people are ignorant of statistics.

Concorde was probably not significantly safer than any other aeroplane
before the crash, nor significantly more dangerous after the crash. If
you've a small sample of 11 aircraft and only one crash, then it's
hard to see how you can get any statistical power, whatever analysis
you use.
 

Ben Bacarisse

Nick Keighley said:
no. You are mistaken. You don't see much code that uses it but it is
allowed.

I think it's quite a common usage. This probably just reflects
differences in the code we've seen. I see it a lot for temporaries like
this:

if (<something>) {
    double tmp = a[i];
    a[i] = a[j];
    a[j] = tmp;
}

The idea is quite old. Declarations in inner blocks go right back to
Algol-60 (the 60 indicates the year), which bequeathed it to its
children: Coral-66 (and JOVIAL, I guess), Pascal, Ada, etc.

Not really in Pascal. Variable declarations must go at the top of a
function. It is true that this syntactic concept is called a "block",
but that's just an accident of naming. The compound statements
delimited by begin and end don't constitute blocks in that sense and so
can't contain variable declarations.

<snip>
 
