assert vs. std::logic_error?


werasm

Hi all,

Care to share your thoughts on this, or point me to
some thoughts already shared.

My thoughts are like this (from a systems point of view):

When I have logic errors outside of my system (or software)
boundaries I throw a logic error (interface specification not
met, etc.). This implies the system is logically erroneous. I
do not regard things like failure to read files or access
hardware in this light. For those I use runtime error (but that
strays from the question).

When I have errors due to requirements change inside my
software boundaries that cause code to break, I assert. I
try and do this in code that I know will be tested (this
may be hard to determine, but typically startup code).
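
Roughly, that split looks something like this in code (a minimal
sketch with made-up names, not actual code from my system):

    #include <cassert>
    #include <fstream>
    #include <stdexcept>
    #include <string>

    // Interface specification not met by an external system: logic_error.
    void set_duty_cycle(int percent)
    {
        if (percent < 0 || percent > 100)
            throw std::logic_error("set_duty_cycle: percent out of range");
        // ...
    }

    // Environment failure (file, hardware): runtime_error.
    std::string read_first_line(const std::string& path)
    {
        std::ifstream in(path);
        if (!in)
            throw std::runtime_error("cannot open " + path);
        std::string line;
        std::getline(in, line);
        return line;
    }

    // Breakage inside my own software boundary: assert.
    void on_startup(int configured_channels)
    {
        assert(configured_channels > 0 && "startup config must define at least one channel");
        // ...
    }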

Apart from the above, I'm still of two minds on the topic.
Any other thoughts welcome.

Regards,

Werner
 

Alan Woodland

werasm said:
Care to share your thoughts on this, or point me to
some thoughts already shared.

My thoughts are like this (from a systems point of view):

When I have logic errors outside of my system (or software)
boundaries I throw a logic error (interface specification not
met, etc.). This implies the system is logically erroneous. I
do not regard things like failure to read files or access
hardware in this light. For those I use runtime error (but that
strays from the question).

When I have errors due to requirements change inside my
software boundaries that cause code to break, I assert. I
try and do this in code that I know will be tested (this
may be hard to determine, but typically startup code).

Apart from the above, I'm still of two minds on the topic.
Any other thoughts welcome.

When I'm writing a library I tend to take the view that asserts catch
(and, arguably, partially document) conditions that you never expect to
occur during use of the library, as soon as they occur. In my view the
triggering of an assert is always indicative of a bug within my library,
and ideally, if my library were bug free, there would be no way a user of
the library could cause an assert to fail.

Anything else a user can do to make things go awry gets handled through
an exception of some description, and the user of the library is then
free to catch it and handle it in some sane way, or allow it to
propagate all the way back and then fix the problem in their code.
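
To illustrate the distinction, here is a hypothetical library class
(a hedged sketch, not anything from an actual library): misuse by the
caller is reported with an exception, while an assert guards a
condition that only a bug inside the library itself could violate.

    #include <cassert>
    #include <cstddef>
    #include <stdexcept>
    #include <vector>

    class RingBuffer
    {
    public:
        explicit RingBuffer(std::size_t capacity)
            : data_(capacity), head_(0), size_(0)
        {
            // A caller passing 0 is misusing the library: report it with an exception.
            if (capacity == 0)
                throw std::invalid_argument("RingBuffer: capacity must be non-zero");
        }

        void push(int value)
        {
            if (size_ == data_.size())
                throw std::overflow_error("RingBuffer: full");
            data_[(head_ + size_) % data_.size()] = value;
            ++size_;
            // If this fails, the library itself is buggy: no caller action can fix it.
            assert(size_ <= data_.size());
        }

    private:
        std::vector<int> data_;
        std::size_t head_;
        std::size_t size_;
    };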


Alan
 

werasm

When I'm writing a library I tend to take the view that asserts catch
(and, arguably, partially document) conditions that you never expect to
occur during use of the library, as soon as they occur. In my view the
triggering of an assert is always indicative of a bug within my library,
and ideally, if my library were bug free, there would be no way a user of
the library could cause an assert to fail.

Anything else a user can do to make things go awry gets handled through
an exception of some description, and the user of the library is then
free to catch it and handle it in some sane way, or allow it to
propagate all the way back and then fix the problem in their code.

Alan

Yes, I'm thinking from a systems point of view (as that is what I'm
implementing), but I suppose I could see a library as my
system and the lib user as an external system. Therefore throw
logic_error (or derived) to indicate wrong use, and assert to
indicate "my logic error", so to speak?

Werner
 

Erik Wikström

Hi all,

Care to share your thoughts on this, or point me to
some thoughts already shared.

My thoughts are like this (from a systems point of view):

When I have logic errors outside of my system (or software)
boundaries I throw a logic error (interface specification not
met, etc.). This implies the system is logically erroneous. I
do not regard things like failure to read files or access
hardware in this light. For those I use runtime error (but that
strays from the question).

When I have errors due to requirements change inside my
software boundaries that cause code to break, I assert. I
try and do this in code that I know will be tested (this
may be hard to determine, but typically startup code).

Apart from the above, I'm still of two minds on the topic.
Any other thoughts welcome.

The way I see it an assert should only trigger if there is something
wrong with your code and an exception should be thrown if something goes
wrong during the execution. In other words, you use asserts to check the
invariants that your code should preserve. As Alan said: a triggered
assert indicates a bug.
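
A common way to put that into practice (sketched here with a
hypothetical class, not something from Erik's post) is a private
invariant check run after every mutating operation:

    #include <cassert>

    class Account
    {
    public:
        void deposit(long cents)
        {
            balance_ += cents;
            ++transactions_;
            check_invariants();
        }

        void withdraw(long cents)
        {
            balance_ -= cents;
            ++transactions_;
            check_invariants();
        }

    private:
        void check_invariants() const
        {
            // Invariants the class itself must preserve; a failure here is a bug
            // in Account, not in the caller or the environment.
            assert(transactions_ >= 0);
            assert(transactions_ > 0 || balance_ == 0);
        }

        long balance_ = 0;
        long transactions_ = 0;
    };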
 

James Kanze

The way I see it an assert should only trigger if there is
something wrong with your code and an exception should be
thrown if something goes wrong during the execution. In other
words, you use asserts to check the invariants that your code
should preserve. As Alan said: a triggered assert indicates a
bug.

I think his problem is more semantic. The name logic_error
sounds very much like a programming error, which should usually
be handled by an assert, and not an exception.

Roughly speaking, I tend to think of the opposition logic_error
vs. runtime_error as something like domain_error vs.
range_error. It is a logic_error to call sqrt() with a negative
value, a runtime_error if exp() overflows. Depending on
context, either or both can be considered as a programming error
(and thus should be handled by an assert)---the programmer
didn't sufficiently validate his input. In practice, however, I
can imagine that there are cases where catching the exception is
the simplest way of validating the input: rather than an
explicit test up front (which could be extremely complicated, if
the later processing involves complex expressions or a lot of
iteration), just catch the exception, and do whatever you would
have done if the input didn't validate. (Now if only the math
routines were guaranteed to generate such exceptions :-).)
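
A sketch of that "catch instead of pre-validate" idea, using a
hypothetical checked_sqrt wrapper that does throw (the standard sqrt,
as noted above, does not):

    #include <cmath>
    #include <iostream>
    #include <stdexcept>
    #include <vector>

    // Hypothetical wrapper: turns the domain error into an exception.
    double checked_sqrt(double x)
    {
        if (x < 0.0)
            throw std::domain_error("checked_sqrt: negative argument");
        return std::sqrt(x);
    }

    // Rather than proving up front that every intermediate value stays
    // non-negative, just attempt the computation and fall back on failure.
    bool try_compute(const std::vector<double>& samples, double& result)
    {
        try {
            double sum = 0.0;
            for (double s : samples)
                sum += checked_sqrt(s - 1.0);   // goes negative for inputs below 1
            result = sum;
            return true;
        } catch (const std::domain_error&) {
            return false;                       // treat it as "input didn't validate"
        }
    }

    int main()
    {
        double r = 0.0;
        std::cout << (try_compute({2.0, 5.0, 10.0}, r) ? "ok" : "invalid input") << '\n';
    }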
 

werasm

I think his problem is more semantic. The name logic_error
sounds very much like a programming error, which should usually
be handled by an assert, and not an exception.

Yes precisely, that was what originally caused me to ask the
question.

Werner
 

Roland Pibinger

When I have errors due to requirements change inside my
software boundaries that cause code to break, I assert.

asserts are (manual) code instrumentations in order to find bugs in
your program. asserts are never used in (compiled into) production
code (at least not by people who understand asserts).
I try and do this in code that I know will be tested (this
may be hard to determine, but typically startup code).

asserts and tests are not related. Production code which is tested
contains no asserts.
 

werasm

asserts are (manual) code instrumentations in order to find bugs in
your program. asserts are never used in (compiled into) production
code (at least not by people who understand asserts).

This is quite presumptuous, assuming that production code cannot
have bugs, don't you think? That said, I realize asserts don't
necessarily fire (or assert) in production code.
asserts and tests are not related. Production code which is tested
contains no asserts.

If I read this right, this implies "Production code which is not
tested does contain asserts". Now to distinguish between production
code that is tested and production code that is not tested.

Yes, I've read what XP advocates say: "Write a test that fails and
then write the code to make it pass". This works for unit testing,
or testing your classes (even realizing your classes). Are you sure
of your results when you have interaction with numerous sub-systems,
asynchronously? This said, often when unit testing is done, one does
not communicate with the true implementation, as this only exists (or
may exist) some time after the class has been completed. It stands to
reason that the class should eventually be retested with the true
interfaces, yes (I won't comment on what I've experienced happens in
practice, for I might be stoned :).

Often conditions causing bugs can only arise in a real environment, as
simulated input is not adequate to produce the conditions that make
the asserts fail. This is (or may be) when asserts really become your
friend... I have a classic example (that testing did not show) where
we got a NaN from one of the sub-systems that we communicated with
(at the time not even knowing what a NaN was). The system obviously
crashed at some point (which seemed random), and it was a very busy
system.

Took us some time to find this (and the code was a whole lot cleaner
after the NaN episode).

The above case though should probably be attributed to a lack of
experience.
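
In hindsight, a boundary check of the kind below (a hypothetical
sketch, not our actual code) would have caught the NaN at the point
of entry instead of at the later, seemingly random crash:

    #include <cassert>
    #include <cmath>

    // Value received from an external sub-system; reject NaN at the boundary.
    void on_measurement(double value)
    {
        assert(!std::isnan(value) && "sub-system delivered NaN");
        // ... feed the value into the rest of the system ...
    }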

Regards,

Werner
 

Alf P. Steinbach

* Roland Pibinger:
asserts and tests are not related. Production code which is tested
contains no asserts.

That's a good way to transform easily identifiable bugs into more
nebulous general "instability".

Cheers, & hth.,

- Alf
 

James Kanze

asserts are (manual) code instrumentations in order to find
bugs in your program. asserts are never used in (compiled
into) production code (at least not by people who understand
asserts).

I suppose you wear a lifejacket when in the harbor, and take it
off when you go to sea as well.

No engineer would remove the asserts from code unless the
profiler said it was necessary. The assert facility was, in
fact, carefully designed to allow removing isolated asserts,
without removing them globally, so that you could remove only
the ones the profiler said were necessary, without removing the
others. The assert facility was also carefully designed so that
asserts would be active by default; you have to take explicit
steps to turn them off.
 

Roland Pibinger

I suppose you wear a lifejacket when in the harbor, and take it
off when you go to sea as well.

The 'assert lifejacket' is filled with lead. It sinks (crashes) you at
the first possible moment; assert never rescues anyone.
No engineer would remove the asserts from code unless the
profiler said it was necessary.

Name one publicly known C/C++ Unix- or Windows-program that ships with
asserts, client or server (MS-Word, Firefox, Unix-tools (sed,
grep,...), Apache, ... which??).
The assert facility was, in
fact, carefully designed to allow removing isolated asserts,
without removing them globally, so that you could remove only
the ones the profiler said were necessary, without removing the
others.

Nope, assert is controlled by the global compiler define NDEBUG. It
was carefully designed as a simple but effective tool to find bugs
during the development phase. It wasn't designed for error reporting
in released programs.
The assert facility was also carefully designed so that
asserts would be active by default; you have to take explicit
steps to turn them off.

'assert' is an overloaded term. It seems that you mix assert as in
C/C++, Assert as in DbC, and error and exception handling. In C++ assert
is meant and only useful as a bug detection tool during development.
Released programs may have 'feedback agents' or other crash reporting
mechanisms, but that's an entirely different breed.
 

Roland Pibinger

* Roland Pibinger:

That's a good way to transform easily identifiable bugs into more
nebulous general "instability".

Any real-world example of programs that ship with asserts?
Another argument against asserts in production code is performance:
asserts significantly slow down the program (if they don't you haven't
used enough asserts in your code).
BTW, asserts seem to be a C and C++ only thing. For my work I mostly
use Java and I have neither used nor missed asserts. They are not
necessary in that environment.
 

Alf P. Steinbach

* Roland Pibinger:
Any real-world example of programs that ship with asserts?

Hm, let's see, the program I'm using to type this? Just to take the
first one that comes to mind, since it's right in front of me. For your
convenience I now googled "Thunderbird assert error", clicked on the
first hit, and there you have it, <url:
http://bugs.opensolaris.org/view_bug.do;jsessionid=de960c54a06d20accda05f27018?bug_id=6429104>.

Another argument against asserts in production code is performance:
asserts significantly slow down the program (if they don't you haven't
used enough asserts in your code).

Sorry, that's mostly bullshit, in the technical meaning of bullshit. A
slowdown due to assertions means you're using assertions incorrectly, to
check too complicated stuff. Like checking an invariant of a
complicated data structure by traversing it, for every operation. An
assertion tells what you think must be true at a given point, but there
is almost infinitely much that one thinks must be true, and it's
misguided to try to assert it all: an assertion is for critical stuff
that is relevant at the point of the assertion.

But, that brings up an important point, also mentioned by James Kanze
else-thread (I think).

Namely that there is a runtime cost associated, an engineering
trade-off, and so you really want to have an assertion system in place
where you can shut down the most costly ones. Checking
costly-to-compute invariant of data structure all the time isn't
meaningful after that class or set of classes has been debugged and
tested, but leaving more inexpensive asserts in place is meaningful,
because it can catch remaining logic bugs.
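
One common way to get that kind of control is a separate switch for
the costly checks, independent of NDEBUG; a rough sketch (the macro
name here is made up, not a standard facility):

    #include <algorithm>
    #include <cassert>
    #include <vector>

    // Expensive checks get their own switch, independent of NDEBUG.
    #if defined(ENABLE_EXPENSIVE_CHECKS)
    #  define EXPENSIVE_ASSERT(x) assert(x)
    #else
    #  define EXPENSIVE_ASSERT(x) ((void)0)
    #endif

    class SortedVector
    {
    public:
        void insert(int value)
        {
            v_.insert(std::lower_bound(v_.begin(), v_.end(), value), value);

            // Cheap, always worth keeping: the container must have grown.
            assert(!v_.empty());

            // Costly O(n) traversal: only while the class is still being debugged.
            EXPENSIVE_ASSERT(std::is_sorted(v_.begin(), v_.end()));
        }

    private:
        std::vector<int> v_;
    };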

BTW, asserts seem to be a C and C++ only thing. For my work I mostly
use Java and I have neither used nor missed asserts. They are not
necessary in that environment.

Try to google up "java assert".


Cheers, & hth.,

- Alf
 

James Kanze

The 'assert lifejacket' is filled with lead. It sinks
(crashes) you at the first possible moment; assert never
rescues anyone.

Obviously, you've never worked on a critical system. Or any
robust software. If a program is not working correctly, the
most desirable behavior is almost always that it crash as
rapidly as possible, so that the error is recognized
immediately.
Name one publicly known C/C++ Unix- or Windows-program that
ships with asserts, client or server (MS-Word, Firefox,
Unix-tools (sed, grep,...), Apache, ... which??).

Name one significant process control system or major server
which doesn't have assert's active. The version of Apache I
manage certainly has its asserts active. As do all of the other
servers running where I work.

As for the other programs you mention: they crash often enough
that something is detecting fatal errors. Maybe if Firefox did
use asserts, I could get more information than just segment
violation. (Off hand, it wouldn't surprise me if there wasn't a
single assert in grep or sed. But the versions I've seen
delivered are compiled without defining NDEBUG.)
Nope, assert is controlled by the global compiler define NDEBUG.

I'd suggest that you reread the standard. The assert header is
explicitly specified to not be idempotent; you can include it as
many times as you wish, and each time, it redefines the
assertion behavior according to the current state of NDEBUG.
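
A sketch of the mechanism being described (the standard allows
<cassert>/<assert.h> to be re-included, each inclusion honouring the
NDEBUG setting in effect at that point; the function names are made up):

    #include <cassert>

    void hot_path(int* p)
    {
        // Asserts up to this point are active by default.
        assert(p != nullptr);
        // ...
    }

    // Profiler says the checks below are too expensive: disable just these.
    #define NDEBUG
    #include <cassert>          // re-including redefines assert() as a no-op

    void inner_loop(const int* data, int n)
    {
        for (int i = 0; i < n; ++i)
            assert(data[i] >= 0);   // compiled out under NDEBUG
        // ...
    }

    // Re-enable for the rest of the file.
    #undef NDEBUG
    #include <cassert>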

And I've never seen NDEBUG defined in the invocation line of the
compiler. Only an idiot would do something that stupid. You
define it only when you absolutely need to.
It was carefully designed as a simple but effective tool to
find bugs during the development phase. It wasn't designed for
error reporting in released programs.
'assert' is an overloaded term. It seems that you mix assert
as in C/C++, Assert as in DbC, and error and exception handling.
In C++ assert is meant and only useful as a bug detection tool
during development. Released programs may have 'feedback
agents' or other crash reporting mechanisms, but that's an
entirely different breed.

It depends. When you deliver shrink-wrapped software, you'll
probably want your own custom assertion facility, in order to
control the error message (and where it goes). When you're
writing software for in-house use, or custom software for a
customer, the standard assert is usually largely sufficient.
 

James Kanze

Any real-world example of programs that ship with asserts?

All of the routing software at Deutsche Telekom (and that's one
big network); Deutsche Telekom (and its spin-offs, like T Mobil)
require it. All of the network management programs from
Alcatel.

All of our trading software in the bank where I currently work.
(Funny thing about bankers: they prefer a crash to issuing a buy
order when they meant sell.)
Another argument against asserts in production code is
performance: asserts significantly slow down the program (if
they don't you haven't used enough asserts in your code).

That depends a lot on the program. All of the programs I've
worked on in the last 20 or so years have been I/O bound. And
asserts certainly don't slow up disk accesses.
BTW, asserts seem to be a C and C++ only thing. For my work I
mostly use Java and I have neither used nor missed asserts.
They are not necessary in that environment.

That's because Java can't be used in critical systems, when
reliability is important.
 

James Kanze

[...]
Sorry, that's mostly bullshit, in the technical meaning of
bullshit. A slowdown due to assertions means you're using
assertions incorrectly, to check too complicated stuff.

Actually, it's more of a misleading half-truth. The reason you
don't use assertions to check too complicated stuff is precisely
because they'll slow things down. Imagine a function which
processes a large set of data, with a post condition that the
data are sorted. I'd certainly like to assert that post
condition, since it's part of my contract. I don't, of course,
because verifying that a large array is sorted *is* too
expensive to leave in production code.

What's probably needed is some sort of multi-level assertions,
depending on run-time, so that you can turn off the expensive
asserts, and leave all of the rest in. Ideally, anyway---in
practice, with a little care, I've never had any real problems
with the current situation.
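
Something in the spirit of such multi-level assertions might look like
this (purely illustrative; the macro and the ASSERT_LEVEL variable are
made up here, not an existing facility):

    #include <algorithm>
    #include <cstdlib>
    #include <vector>

    // Hypothetical run-time leveled assert: level 1 = cheap checks,
    // level 2 = expensive ones. The active level is read once from the
    // environment when first needed.
    inline int assert_level()
    {
        static const int level =
            std::getenv("ASSERT_LEVEL") ? std::atoi(std::getenv("ASSERT_LEVEL")) : 1;
        return level;
    }

    #define LEVELED_ASSERT(lvl, cond) \
        do { if ((lvl) <= assert_level() && !(cond)) std::abort(); } while (0)

    void process(std::vector<int>& data)
    {
        LEVELED_ASSERT(1, !data.empty());                             // cheap precondition
        std::sort(data.begin(), data.end());
        LEVELED_ASSERT(2, std::is_sorted(data.begin(), data.end()));  // costly postcondition
    }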
Namely that there is a runtime cost associated, an engineering
trade-off, and so you really want to have an assertion system
in place where you can shut down the most costly ones.
Checking costly-to-compute invariant of data structure all the
time isn't meaningful after that class or set of classes has
been debugged and tested, but leaving more inexpensive asserts
in place is meaningful, because it can catch remaining logic
bugs.

Exactly. The most important assertions in production code are
generally pre-condition checks. Which are generally not too
expensive.

Of course, it also depends on the application. I've worked on
some very critical applications (locomotive brake systems, for
example), where we did verify the invariants after each
operation. If there's the slightest chance of an error, we want
to detect it immediately, and crash, so that the back-up can
take over. And this behavior is considered critical enough that
if it means paying more for a faster processor, you do it
anyway.
 

Roland Pibinger

And I've never seen NDEBUG defined in the invocation line of the
compiler. Only an idiot would do something that stupid. You
define it only when you absolutely need to.

Visual C++ by default defines NDEBUG (/D "NDEBUG") for Release builds
"in the invocation line". I guess it makes no sense to continue that
discussion.
 

James Kanze

Visual C++ by default defines NDEBUG (/D "NDEBUG") for Release builds
"in the invocation line".

You mean, no doubt, the IDE. VC++ (the command cl) doesn't
define much of anything by default; you need four or five
options just to get it to understand enough C++ to accept the
standard headers.

Of course, defaults don't mean anything. I've never seen a
compiler do anything useful "by default". (I use some 20 or 30
options with g++ or Sun CC. I suspect that I'd need just as
many if I were to do any serious work with VC++.)

And the Visual C++ IDE is not designed for professional development,
but rather to support the hobby programmer.
 
