Teaching new tricks to an old dog (C++ -->Ada)

  • Thread starter Turamnvia Suouriviaskimatta

Mark A. Biggar

Neither the "American Dental Association" or the "American Disabilities
Act" are really suitable computer computer languages for writing
libraries or anything else for that matter. :)

Ada, on the other hand, is very well suited to that task. Ada is a name, not
an acronym. Ada is named for Ada Augusta, Lady Lovelace. She was the
daughter of the poet Lord Byron and also the world's first computer
programmer. She wrote several sample programs intended for Charles
Babbage's never-built Analytical Engine.
 

Wouter van Ooijen

Which is why other means are needed for quality software. Rigorous testing
and code reviews come to mind.

IMHO one word is missing: Which is why other means are *also* needed
for quality software. Rigorous testing and code reviews come to mind.

Needed (not exhaustive):
- a good definition (requirements etc)
- education
- reviews, brainstorms and other inter-person interactions
- sensible management
- sensible coding standards, *with a sensible escape mechanism*
- good checking tools (compilers; even for Ada a subset verifier is
often used)
- automated (regression) test tools
- test coverage checks (sometimes even full path coverage checks)
- unit tests, integration tests, system tests etc. (a toy unit-test sketch
follows below)
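A minimal sketch of such a check (my example, not from the original post;
the clamp function is just an illustrative unit under test): a regression
test can be as small as a program full of assertions that is rerun after
every change.

#include <cassert>

// Toy unit under test: clamp a value into [low, high].
int clamp(int value, int low, int high)
{
    if (value < low)  return low;
    if (value > high) return high;
    return value;
}

int main()
{
    // Each assertion is a regression check.
    assert(clamp(5, 0, 10) == 5);    // value already inside the range
    assert(clamp(-3, 0, 10) == 0);   // clamped to the lower bound
    assert(clamp(42, 0, 10) == 10);  // clamped to the upper bound
    return 0;
}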


Wouter van Ooijen

-- ------------------------------------
http://www.voti.nl
Webshop for PICs and other electronics
http://www.voti.nl/hvu
Teacher electronics and informatics
 

Martin Krischik

Wouter said:
I assume you do know that it was not the computer hardware but the
physical parameters (acceleration) of the rocket itself?

Yes, I do. I was just simplifying the whole rocket as "hardware".
If this accident speaks for anything IMHO it speaks for sensible
management. Which is apparently a problem on both sides of the ocean
:(

Yes indeed.

With Regards

Martin
 

Martin Krischik

<posted & sent by e-mail>
reviews and testing. Therefore, I tend to look at the development
processes and their "real" CMM rating. I guess the Ariane team went down a
few notches on that project.

Only it was a contractor's team, and they were not there anymore to be
asked whether the Ariane 4 software could be run on the Ariane 5.

That was one of the reasons why the test suite was not run: they did not
have the personnel to do it, and they would have had to hire and train new
contractors to do it.

My point stands: management bug.

Martin
 

Ludovic Brenta

Martin said:
<posted & sent by e-mail>


Only it was a contractor's team, and they were not there anymore to be
asked whether the Ariane 4 software could be run on the Ariane 5.

That was one of the reasons why the test suite was not run: they did not
have the personnel to do it, and they would have had to hire and train new
contractors to do it.

My point stands: management bug.

Martin

I thought it was because the telemetry data required to simulate a
launch was classified, and managers decided that the contractors
"didn't need to know" this data. If it weren't for this secrecy, the
contractors would have found the bug on the first simulated run of the
software.
 

Guest


After reading some ideas about Ada, my conclusion is that there are some
nice ideas out there, but that they mainly protect against the "sloppy"
programmer.

Actually, the inherent type safety, along with the visibility rules in Ada,
does a bit more than "protect against the 'sloppy' programmer." I wonder
whether there is real protection against a truly sloppy programmer, in Ada
or elsewhere.

Then again, perhaps we are all a little sloppy now and then. I know I
sometimes make stupid mistakes while coding that the Ada compiler brings to
my attention.

The larger issue is how Ada scales up to programming in the large for
safety-critical software. Few other languages scale up as well as Ada. For
a small, one-person job, I'm not sure it matters so much what programming
language you choose. However, when you are building a large team of
programmers and need a high level of confirmability with respect to the
inter-relationships of the various modules, Ada serves much better than
most alternatives.

A key idea in Ada, one that I like much better than in other languages
(although this aspect of Modula-3 is pretty good), is the model for
separate compilation. Space in this posting does not allow me to do full
justice to this capability, but it is one of those features of the language
that, when used as it is supposed to be used, makes the team development
process so much easier.

For real-time embedded systems, Ada allows easy navigation from one level
of abstraction to another, and allows that navigation to be safe. We can,
and usually do, design our programs at a high level of abstraction.
However, when it is required to descend to the machine level, we can do so,
but with greater safety (built-in rules) than one might have in some other
language.

Anyone who has ever made a mistake in pointer arithmetic knows how
entertaining it is to spend long hours searching for the source of some
run-time fault. Never made that kind of mistake? You only need to make it
once for it to be memorable.

Some find Ada a little difficult to learn at first. In particular, the
rules that govern something called "visibility" give new programmers a bit
of trouble. For those programmers who insist on fighting this feature,
there is no end of trouble. I like to use the analogy of the rotating-brush
floor buffer. When you flip the switch on that buffer, you need to know
just how to control it or it will run away with you. Once you let the
buffer have its way, you can make subtle little movements to make it go
where you want it to go instead of letting it drag you all over the floor.

The more persnickety features of Ada are a lot like the floor buffer. Once
you learn how to control them, use them to your advantage, and understand
their purpose, the language becomes easy and fun to use. The problem is
that most programmers fight those features and complain because they refuse
to abide by them. Those who do learn the visibility rules tend to build
excellent, dependable, and maintainable software, and with much less sweat,
tears, and blood than the corresponding programmer in language X. Or is
that the wrong end of the alphabet, that third-from-the-end letter?

Richard Riehle
 

Guest

Ioannis Vranos said:
Once again, I have nothing against learning Ada, however personally I
like the most powerful languages. The next thing I am going to learn
after C++ (because I haven't learned it all yet), is probably some form
of assembly language.
Ada is every bit as powerful as C++. Just a bit safer.

For example I like that I can do:
[snipped a bunch of code]

Everything you just coded in C++ is easily done, but with slightly
different, and definitely safer, syntax. We can get to the bit level, the
byte level, or the word level for data. In at least one embedded system
that I know quite well, deployed on a bare board using an Ada run-time, we
inserted actual machine code, including some code to disable interrupts
temporarily.

Speaking of powerful languages, consider the power of Ada to let you build
concurrent programs directly within the language. There is no need to make
separate POSIX/Unix calls. Moreover, Ada has, at present, the most robust
model for controlling mutual exclusion found in any non-experimental
language. If it is really power you need, especially programming power,
Ada will stand against any competitor.
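For contrast, here is a minimal sketch (mine, not from the original post)
of what the "separate POSIX/Unix calls" approach looks like in C++ on a
POSIX platform: the thread and the mutual exclusion are explicit library
calls rather than language constructs.

#include <pthread.h>
#include <cstdio>

// Shared counter protected by a mutex -- the kind of bookkeeping that
// Ada's tasks and protected objects express in the language itself.
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static int counter = 0;

void *worker(void *)                   // thread entry point required by pthreads
{
    for (int i = 0; i < 1000; ++i)
    {
        pthread_mutex_lock(&lock);     // manual mutual exclusion
        ++counter;
        pthread_mutex_unlock(&lock);
    }
    return 0;
}

int main()
{
    pthread_t t;
    pthread_create(&t, 0, worker, 0);  // explicit library call
    pthread_join(t, 0);
    std::printf("%d\n", counter);
}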

Richard Riehle
 

Guest

"Wouter van Ooijen (www.voti.nl)" <[email protected]>
wrote in message
If you want to really broaden your perspective I would suggest
something in the lazy-functional field like Haskell.

Haskell has a lot to recommend it. In fact, it is sad that more programmers
are not schooled in the value of functional languages.

However, we must select the right tool for the right job. There are
problems where Haskell would be preferred to Ada. Large-scale,
safety-critical software systems developed by a team of programmers are not
the domain where I would choose Haskell, or ML, or Scheme, or Lisp, or most
other functional languages.

Where Ada is the right choice, nearly all the time, is for large-scale
software systems that involve an equally large number of developers, and
where the software modules developed by that team must snap together just
right, with no guesswork. This is Ada's strength. Few languages can compete
in this domain, although many programmers do try to use less disciplined
languages with some modest success. Eiffel might be a good alternative, but
my preference for that kind of software is still Ada.

Consider a military command and control system, a complex system with a lot
of requirements built in. Now, think of this system in terms of its size:
4.5 million lines of source code. This is the kind of project that is
perfect for Ada. In fact, any software system over a half-million lines of
source code should be coded in Ada. Some authors have set that threshold at
100 KSLOC.

If you have a small, 20 KSLOC software system, go ahead and use a different
language. Just keep in mind that as that software grows over time, you
might find yourself wishing you had chosen Ada in the first place.
 

Jerry Coffin

Ludovic Brenta wrote:

[ ... ]
Yes, assembly is the most powerful and flexible language. That's why
all compilers emit assembler.

Not so -- machine language is clearly more flexible than assembly
language (especially on machines where an operation can be encoded in
more than one way). Not all compilers emit assembly language output
either. "Powerful" is meaningless WRT a language unless you define what
you mean by it in considerably more detail than I've seen thus far in
this thread (or any other, for that matter).

As an aside: an "assembler" is a program that takes input in "assembly
language" and produces an object file as output. Calling the language
"assembler" is roughly equivalent to referring to Ada as "compiler" --
wrong, and to anybody who isn't entirely clueless about the subject at
hand, downright stupid. I realize that for years IBM (among others)
abused the (English) language by referring to the language as
"assembler", but please avoid their mistake.

[ ... ]
Here, Ada makes it explicit that unsafe programming is taking place.
First, Obj must be declared as "aliased", which means that two or
more paths can access it. In our case, Obj and Obj_As_String are
the two paths. This is another of Ada's nice safety-related
features. Since aliasing must be made explicit, the reader of the
program knows up front whether or not aliasing takes place. The
reader of a C++ program has no such knowledge.

Nonsense -- in C++ you use a reinterpret_cast, which is equally
explicit about what's being done. If somebody reading C++ doesn't
recognize what a reinterpret_cast means, then he simply doesn't know
C++.
Also, the writer of the program must
think twice, and understand the consequences if they make an object
aliased.

Anybody who uses a reinterpret_cast without a second (and third) thought
simply isn't a programmer, and if he wrote Ada instead, it'd still be
garbage.
Secondly, the representation clause for Obj_As_String ("for
Obj_As_String'Address use ...") says exactly what is happening.

Anybody who thinks that (for example):

unsigned char *a = reinterpret_cast<char *>(&x);

doesn't state exactly what is happening, simply doesn't know C++.

Any language (programming or otherwise) is foreign to those who don't
know that language. It may well be that you don't realize what it
means, and that's perfectly fine -- but assuming it must be inexact
because you don't know exactly what it means is considerably less fine.
I could make the code less verbose by using use clauses, similar to
"using namespace std" which you seem fond of. In avionics, our
coding standards forbid that because we want everything to be
explicit.

A poor idea. Just for example, consider writing a generic sorting
function. It needs to swap items that it's sorting. In well-written
C++, this will often be done with a using clause. Specifically, if the
type of items has provided its own specialized version of swap, then my
sorting function should use that, but otherwise it should use std::swap
to swap them.

If I try to specify whatever_type::swap(x,y), then compilation will
fail if the type has not provided a swap function. Conversely, if I
specify std::swap(x,y), then the specialized swap function won't be
used for those types that provide one.

The solution is something like:

#include <algorithm>   // std::swap
using namespace std;

template<class T>
void sort(T *first, T *last)
{
    for (T *i = first; i != last; ++i)
        for (T *j = i + 1; j != last; ++j)
        {
            T &x = *i, &y = *j;
            if (y < x)
                swap(x, y);   // unqualified call to swap
        }
}

and now, thanks to Koenig lookup, this will refer to a swap
specifically for the type of x and y if there is one, but will use the
swap in the standard library for those (many) types that don't provide
special swapping code.

[ ... ]
Hear, hear!

Actually, having used both (as well as Verilog and VHDL, which are
based fairly closely on C and Ada respectively) I'm not particularly
convinced this is true.

Personally, I think the _vast_ majority of the safety of Ada is an
illusion. In the end, code that works well is a product of a good
programmer doing his job well, NOT of a particular language.

Now, it's certainly true that people can (and frequently do) cite
statistics showing that code written in Ada has fewer bugs, etc., as
proving that the language is safer. Even assuming the citations are
correct (which I'm not sure is true, but for the moment, let's assume
they are), they don't necessarily prove that -- or much of anything
else, for that matter.

The problem is that the reputation of a language tends to become a
self-fulfilling prophecy. Managers who are running safety critical
projects often choose Ada because they "know" it's safer -- and then
run their projects in ways that would assure solid results, regardless
of implementation language.

Likewise, programmers who are strongly attracted toward disciplined
software engineering, will often be attracted to Ada because it has
that reputation (and to an extent, that "feeling" as well).

At the opposite extreme, the managers who are most interested in
pushing a product out the door in minimal time and don't mind bugs,
rarely choose Ada -- and run their projects in ways that would produce
buggy products regardless of language. Likewise, the "cowboy"
programmers never learn Ada at all -- as soon as they learn of its
reputation, they avoid it like the plague.

As such, showing causation (rather than mere correlation) becomes
essentially impossible at best -- and here in the real world, the truth
could even be exactly the opposite of what the statistics "prove."

Then again, all of the above should probably be taken with a large
grain of salt. That wouldn't be necessary if what I'd been consuming for
the last couple of hours was salt, but thanks to Warren Winiarski, that
wasn't the case... :)
 

Jerry Coffin

(e-mail address removed) wrote:

[ ... ]
However, we must select the right tool for the right job. There are
problems where Haskell would be preferred to Ada. Large-scale,
safety-critical software systems developed by a team of programmers are not
the domain where I would choose Haskell, or ML, or Scheme, or Lisp, or most
other functional languages.

Here, at least you're being real: you admit that this is what you would
choose.

[ ... ]
In fact, any software system over a half-million lines of source code
should be coded in Ada.
Here, however, you lose your grip on reality. This is NOT "in fact" --
it's purely an OPINION! It's certainly possible to find projects that
would involve more than a half-million lines of code for which Ada
would be _extremely_ poorly suited, at best.
Some authors have set that threshold at 100 KSLOC.

Other authors have set Pi to 3 (exactly).

Clearly "some authors" can be quoted as saying just about anything you
want, no matter how stupid it might be.
If you have a small, 20 KSLOC software system, go ahead and use a different
language. Just keep in mind that as that software grows over time, you
might find yourself wishing you had chosen Ada in the first place.

This is true -- but remains at least equally so if you exchange "Ada"
and "some other language"!
 

Ioannis Vranos

Jerry said:
Anybody who thinks that (for example):

unsigned char *a = reinterpret_cast<char *>(&x);

unsigned char *a = reinterpret_cast<unsigned char *>(&x);


I agree with most of the rest BTW. :)


Also, since Ada is more ancient than C++ in terms of a final standard, we
can expect that some things are "more ancient", but it is still an
interesting language since it can do low-level stuff.


I am not sure it is "safer" than C++ too, I am suspicious of "safe"
languages.
 

Dr. Adrian Wrigley

Ludovic Brenta wrote:

Nonsense -- in C++ you use a reinterpret_cast, which is equally
explicit about what's being done. If somebody reading C++ doesn't
recognize what a reinterpret_cast means, then he simply doesn't know
C++.


Anybody who uses a reinterpret_cast without a second (and third) thought
simply isn't a programmer, and if he wrote Ada instead, it'd still be
garbage.

Isn't there some confusion here?

Surely the "aliasing" issue (ignored by C++ completely(?)) is largely
independent of the "reinterpret_cast"/"Unchecked_Conversion" issue?

The C++ programmer uses aliasing routinely and without thinking.
Ada makes the aliasing possibility explicit when necessary, but
prohibits it otherwise.

If we're talking about the "reinterpret_cast" issue, it is essentially
identical in Ada.
 

Hans Malherbe

support efficient, real-time safe environments

Can you explain the "real-time" part?

Reading this thread, it seems to me Ada's focus is on safety rather
than efficiency.
These safety constraints also tend to limit expressiveness. Not that
safety is bad, just that it's not free.
C++ was designed to produce an object-orientated
extension to C.

An all too common misconception.
Even if it were, it is used today in ways the designers could never have
foreseen or thought possible.
 

Falk Tannhäuser

Jerry said:
A poor idea. Just for example, consider writing a generic sorting
function. It needs to swap items that it's sorting. In well-written
C++, this will often be done with a using clause. Specifically, if the
type of items has provided its own specialized version of swap, then my
sorting function should use that, but otherwise it should use std::swap
to swap them.

If I try to specify whatever_type::swap(x,y), then compilation will
fail if the type has not provided a swap function. Conversely, if I
specify std::swap(x,y), then the specialized swap function won't be
used for those types that provide one.

The solution is something like:

using namespace std;

template<class T>
void sort // ...

// ...
swap(x,y);

and now, thanks to Koenig lookup, this will refer to a swap
specifically for the type of x and y if there is one, but will use the
swap in the standard library for those (many) types that don't provide
special swapping code.

I would put the "using namespace std;", or even better, just
"using std::swap;" into the "sort" function, at the scope of the
block from where the "swap" is called. This way, the precise purpose
of the "using" declaration becomes clear to both the human reader
and the compiler (because unwanted side effects due to name
collisions are avoided).
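A small sketch of that arrangement (my illustration, not code from the
thread):

#include <algorithm>   // std::swap

template<class T>
void sort(T *first, T *last)
{
    using std::swap;   // visible only inside this function body
    for (T *i = first; i != last; ++i)
        for (T *j = i + 1; j != last; ++j)
            if (*j < *i)
                swap(*i, *j);   // ADL still prefers a type-specific swap
}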
Furthermore, the implementer of "whatever_type" should consider putting
the specialisation of "swap" into the "std" namespace, which
is possible by § 17.4.3.1/1 of the Standard.

Falk
 

Ioannis Vranos

Dr. Adrian Wrigley said:
enlighten us please!


C++ is a multiparadigm language and supports 4 paradigms. Each paradigm
is supported *well* with optimal space and time efficiencies.

It does not enforce a specific paradigm, but allows them to be mixed as
the programmer thinks fits a specific problem best.
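As a rough illustration (my own example, not from the post; the Account
type and the function names are made up), here is a small snippet in which
object-oriented, generic, functional-style, and plain procedural code
coexist:

#include <vector>
#include <numeric>
#include <iostream>

struct Account {                          // object-oriented: data plus behaviour
    double balance;
    Account(double b) : balance(b) {}
    void deposit(double amount) { balance += amount; }
};

// Functional style: a pure function passed to a standard algorithm.
double add_balance(double sum, const Account &a) { return sum + a.balance; }

template<class Container>                 // generic: works for any container of Accounts
double total(const Container &c)
{
    return std::accumulate(c.begin(), c.end(), 0.0, add_balance);
}

int main()                                // procedural glue
{
    std::vector<Account> accounts;
    accounts.push_back(Account(100.0));
    accounts.push_back(Account(250.5));
    accounts[0].deposit(50.0);
    std::cout << total(accounts) << "\n"; // prints 400.5
}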
 

Jerry Coffin

Falk Tannhäuser wrote:

[ ... ]
I would put the "using namespace std;", or even better, just
"using std::swap;" into the "sort" function, at the scope of the
block from where the "swap" is called.

Oh, of course. When you're making something visible, you nearly always
want to limit the visibility as much as practical.
This way, the precise purpose
of the "using" declaration becomes clear to both the human reader
and the compiler (because unwanted side effects due to name
collisions are avoided).
Furthermore, the implementer of "whatever_type" should consider putting
the specialisation of "swap" into the "std" namespace, which
is possible by § 17.4.3.1/1 of the Standard.

A perfectly reasonable possibility, but the sort function shouldn't
depend on it.
 

John Hudak

Peter said:
We agree here. C++ is a "hackers language", in part because of its C roots.




This is inherited from Pascal if I remember correctly. Of course, good C++
style is to declare your variable in the loop.



This seems ridiculous. I would expect a programmer to know the precedence
rules or at least insert parentheses if they are in doubt.




I like that idea. It is possible using templates, of course. Is it general
enough? If you replace "apples" with "weight" and "oranges" with "length",
is it then permissible to multiply a length with a weight but not add the
two together?
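A minimal sketch of how that can be done with templates (my illustration,
not from the thread; the Quantity type and the exponent encoding are made
up): each quantity carries its physical dimension as compile-time
exponents, so adding a length to a weight fails to compile while
multiplying them yields a new dimension.

// M = mass exponent, L = length exponent.
template<int M, int L>
struct Quantity {
    double value;
    explicit Quantity(double v) : value(v) {}
};

// Addition is only defined for operands of the same dimension.
template<int M, int L>
Quantity<M, L> operator+(Quantity<M, L> a, Quantity<M, L> b)
{
    return Quantity<M, L>(a.value + b.value);
}

// Multiplication adds the exponents, producing a new dimension.
template<int M1, int L1, int M2, int L2>
Quantity<M1 + M2, L1 + L2> operator*(Quantity<M1, L1> a, Quantity<M2, L2> b)
{
    return Quantity<M1 + M2, L1 + L2>(a.value * b.value);
}

typedef Quantity<1, 0> Weight;   // kilograms
typedef Quantity<0, 1> Length;   // metres

int main()
{
    Weight w(2.0);
    Length l(3.0);
    Quantity<1, 1> moment = w * l;   // fine: weight times length
    // Weight bad = w + l;           // would not compile: mixed dimensions
    (void)moment;
}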




This point sounds as if it restricts the environments where Ada can be used.



You can't do so in C++ either. (C has the conversion to/from void*).




I like that one, too.




I sort of like this one as well - although raising an exception seems to be
too forgiving.
My conclusion is that there are some nice ideas out there, but that they
mainly protect against the "sloppy" programmer.

Exactly - because not every programmer is well organized to keep all the
nuances in their head, and observe them when coding. Furthermore,
when components are integrated and one component talks to another is
when the big debugging problems surface. One has to look at the
history/motivation of Ada development versus that of C/C++...Ada
certified compilers and tools strictly enforce the semantics of the
language. It has been my experience that there is a lot of variability
in C/C++ compilers in how thoroughly language semantics are adhered to.
 

John Hudak

Ioannis said:
Every variable is visible inside the scope in which it was defined.

If you want to use i inside a for loop only, then define it in the for
loop.


The restriction that you imply you desire limits flexibility.

Once again, I have nothing against learning Ada, however personally I
like the most powerful languages. The next thing I am going to learn
after C++ (because I haven't learned it all yet), is probably some form
of assembly language.

What?!? Assembly language? Powerful? If you decide to write an X GUI
in assembly, I'll check in on you in 20 years to see how you're doing.
Higher-level languages handle broader abstract concepts better than
low-level languages. Assembly is great if you are optimizing device
drivers or banging bits at the hardware register level; it is not good for
implementing scientific algorithms, GUIs, databases, etc. There are
thousands of research papers that detail the problems of the
assembly-language approach to things, and yes, there are places and reasons
to use it, but classifying it as a "higher-level language" or as something
more powerful is incorrect.
-John
For example I like that I can do:

#include <iostream>
#include <string>
#include <bitset>
#include <limits>

class SomeClass
{
    std::string s;

public:
    SomeClass()
    {
        s = "This is a text message";
    }
};


int main()
{
    using namespace std;

    SomeClass obj;

    unsigned char *p = reinterpret_cast<unsigned char *>(&obj);

    // Displays the individual bytes that obj
    // consists of as unsigned chars.
    for (unsigned i = 0; i < sizeof(obj); ++i)
        cout << "character: " << p[i] << "\n";

    cout << "\n";


    p = reinterpret_cast<unsigned char *>(&obj);
    // Displays the decimal values of the
    // individual bytes that obj consists of.
    for (unsigned i = 0; i < sizeof(obj); ++i)
        cout << static_cast<unsigned>(p[i]) << " ";

    cout << "\n\n";


    // Displays the bits of each byte that this
    // SomeClass object consists of.
    p = reinterpret_cast<unsigned char *>(&obj);
    for (unsigned i = 0; i < sizeof(obj); ++i)
    {
        // A byte is not necessarily 8 bits.
        // numeric_limits<unsigned char>::digits retrieves the number
        // of bits per byte, which usually *is* 8.
        bitset<numeric_limits<unsigned char>::digits> bits(p[i]);

        for (unsigned j = 0; j < bits.size(); ++j)
            cout << bits[j];

        cout << "\n";
    }
}


C:\c>temp
character: â¿
character: =
character: >
character:

252 61 62 0

00111111
10111100
01111100
00000000

C:\c>


I am sure that many Ada developers will say that this one is not needed
(the ability to access the individual bytes of objects is needed in many
cases, e.g. to create low-level exact copies of objects) and that it is
unsafe (yes it is; low-level stuff is unsafe, and it all depends on the
programmer knowing what he does).


It is up to oneself to learn whatever language fits his purposes. For
example, a "safe" language is not of interest to me.

For someone who places much hope on the language to protect him from his
mistakes, Ada is probably better than C++ in this regard.


There is no language that satisfies all people's desires, simply because
some desires conflict with one another.
 
