Strange - a simple assignment statement shows an error in VC++ but works in gcc!


David Schwartz

I've seen people insert casts just to shut up a compiler. It was very
clear, upon questioning them, that they had little or no idea what the
cast did, they were just following a suggestion contained in the
compiler's messages. If the conversion had consequences that they did
not understand would occur (such as wrapping on overflow or truncation
of the fractional portion of a floating point value), does such a
conversion count as intentional?
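
(Both consequences are easy to demonstrate. A minimal sketch, assuming
8-bit chars and a typical implementation:)

#include <stdio.h>

int main(void)
{
    unsigned char u = (unsigned char)300; /* wraps modulo 256: u == 44 */
    int i = (int)2.9;                     /* fraction discarded: i == 2 */
    printf("%d %d\n", u, i);              /* prints "44 2" */
    return 0;
}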

In this case, the cast *still* helps. First, it gives them at least
one more opportunity to look at the questionable code -- at the time
when they inserted the cast. Second, it ensures that a code reviewer
will know that a conversion takes place, rather than an implied
conversion which is easily missed.

So even in this case, you are *still* better off with the cast.

The issue is simple -- would you prefer code with subtle, easily missed
consequences, or code that self-documents those consequences? The
latter is almost always better.
In the cases I'm thinking of, the programmers had no idea whether the
cast was safer or not; they were just trying to shut up a compiler
warning. They may have thought that the compiler knew best, but they
were most certainly not vouching personally for the safety of the code.
Their argument was "the compiler told me to do this, so it must be
safe". In other words, they were rejecting personal responsibility,
and were holding the compiler (and by implication, the compiler's
vendor) responsible for generating a diagnostic that implied that the
cast was a good idea. In a couple of those cases, the cast was
definitely NOT safe, for reasons the programmer did not understand.

In that case, it's vital that a reviewer know that this happened. The
problem is the same with or without the cast -- the code does the same
thing that may or may not be safe.

I have yet to hear of any case where you're better off without the
cast. You're just showing other ways the cast might help you.

DS
 

Måns Rullgård

David Schwartz said:
Then the cast still does its job. It highlights to the code reviewer
that there was a cast there and the original programmer may or may not
have ensured that the conversion is safe. In the absence of the cast,
your attention might never have been drawn to that issue.

When reviewing code written by people I don't trust the competence of,
I look at *every* line of code. Cast or no cast makes no difference.
 

jameskuyper

David said:
In this case, the cast *still* helps. ...


In that case, it's vital that a reviewer know that this happened. ...

I was addressing only your assertion that the presence of a cast
always indicates that the corresponding conversion was intentional,
and that its appropriateness was vouched for by the developer. You
further implied that it was ridiculous to even think of there being
any other reason why the cast was written. The cases I summarized are
clear counter-examples to that assertion, and the frequency with which
I've seen such things belies that implication.

The project I work on is constrained by coding standards imposed by
our client which prohibit implicit conversions in delivered code. In
my experience with this prohibition, it makes the code harder to read.
It inhibits useful diagnostics that would otherwise have been
generated when someone makes the mistake of writing a cast that was
intended to pointlessly duplicate an implicit conversion to satisfy
this prohibition, but which actually performs a conversion that can
only occur explicitly. It makes casts ubiquitous, preventing them from
serving as an indication that something unusually dangerous has been
attempted.
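
(A hedged sketch of that last failure mode: a cast intended as routine
decoration that actually licenses a conversion the compiler would
otherwise have rejected:)

long l = 42;
int  a = (int)l;    /* intended: merely duplicates the implicit conversion */
long b = (long)&l;  /* mistyped '&': without the cast this assignment is a
                       constraint violation and the compiler must complain;
                       with it, a pointer is silently converted to an integer */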
 

Keith Thompson

jameskuyper said:
The project I work on is constrained by coding standards imposed by
our client which prohibit implicit conversions in delivered code.

My sympathies.
In
my experience with this prohibition, it makes the code harder to read.
It inhibits useful diagnostics that would otherwise have been
generated when someone makes the mistake of writing a cast that was
intended to pointlessly duplicate an implicit conversion to satisfy
this prohibition, but which actually performs a conversion that can
only occur explicitly. It makes casts ubiquitous, preventing them from
serving as an indication that something unusually dangerous has been
attempted.

And unless you have a special tool, adding casts doesn't guarantee the
absence of implicit conversions anyway. For example:

int x;
long y;

x = (short)y; /* oops */

Even if the compiler might have warned about the narrowing conversion
in the absence of the cast, adding the cast is likely to inhibit the
warning *and* adds an extra incorrect conversion.
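
(For instance, with a warning option such as gcc's -Wconversion -- a
sketch of typical behavior; exact diagnostics vary by compiler:)

int x;
long y = 100000L;

x = y;         /* warning: conversion from 'long' to 'int' may change value */
x = (short)y;  /* no warning: the cast satisfies the compiler, yet with a
                  16-bit short it now also truncates the value to 16 bits */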
 

David Schwartz

When reviewing code written by people I don't trust the competence of,
I look at *every* line of code.  Cast or no cast makes no difference.

Then at least it makes sure that you don't miss that there's a
conversion. That's not much, but it's something.

DS
 

David Schwartz

I was addressing only your assertion that the presence of a cast
always indicates that the corresponding conversion was intentional,
and that its appropriateness was vouched for by the developer.

And it does. If a programmer codes a cast, he intends a conversion. If
a programmer requests a conversion, he is vouching for its safety.
This is just as true as saying that a programmer who writes "i++;" is
vouching that it is safe to add one to 'i'. This is axiomatic stuff.
There is no possible counter-argument.
You
further implied that it was ridiculous to even think of there being
any other reason why the cast was written.

Right, but I'm not claiming that's always the case. The programmer
could have put it there because he thought it was necessary. But it
still serves the job of showing that the programmer knows there's a
conversion and tipping off the reviewer that he needs to review the
conversion.

Simply put, it reduces the possible cases. Without the cast, there is
one additional possibility -- that the programmer had no idea there
was a conversion and therefore never considered its safety. The cast
rules out this case.
The cases I summarized are
clear counter-examples to that assertion, and the frequency with which
I've seen such things belies that implication.

They are not counter-examples. If a programmer writes '(unsigned) j'
he is vouching for the safety of the cast, just as when he writes any
code he is vouching for it. Just because he was mistaken in doing so
doesn't change the fact that he did so.
The project I work on is constrained by coding standards imposed by
our client which prohibit implicit conversions in delivered code. In
my experience with this prohibition, it makes the code harder to read.
It inhibits useful diagnostics that would otherwise have been
generated when someone makes the mistake of writing a cast that was
intended to pointlessly duplicate an implicit conversion to satisfy
this prohibition, but which actually performs a conversion that can
only occur explicitly. It makes casts ubiquitous, preventing them from
serving as an indication that something unusually dangerous has been
attempted.

That's strange, because there really should not be that many
conversions. I agree that prohibiting all implicit conversions does
not strike the right balance. Almost any "never" or "always" rule
will cause significantly sub-standard code in at least some cases.

It's not enough to have a coding rule. You have to understand the
reason for the rule and recognize when following it fails to serve
that reason. In those cases, you should break the rule.

DS
 

David Schwartz

    int x;
    long y;

    x = (short)y; /* oops */

Even if the compiler might have warned about the narrowing conversion
in the absence of the cast, adding the cast is likely to inhibit the
warning *and* adds an extra incorrect conversion.

Right, but this sticks out like a sore thumb.

And there's no automatic rule that will fix this, since the error is
in the logic the programmer used to decide what code to write, not the
written code. If the programmer requested a cast to 'short' and such a
conversion is not safe, that is a basic programmer error that cannot
be fixed by anything but debug and review.

Leaving out the cast here only works by accident.

DS
 

Måns Rullgård

David Schwartz said:
Then at least it makes sure that you don't miss that there's a
conversion. That's not much, but it's something.

You misunderstand. The cast does not in any way change the way I
review the code. I will check the type of every variable and make
sure it is used safely. It would not be a code review otherwise.
 

Måns Rullgård

David Schwartz said:
Right, but this sticks out like a sore thumb.

And there's no automatic rule that will fix this, since the error is
in the logic the programmer used to decide what code to write, not the
written code. If the programmer requested a cast to 'short' and such a
conversion is not safe, that is a basic programmer error that cannot
be fixed by anything but debug and review.

Leaving out the cast here only works by accident.

What if 'x' was originally declared as short and later, perhaps
because the range was found insufficient, changed to int? Without the
cast, the code will still be correct. With the cast, it will now be
incorrect.
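
(A minimal sketch of that scenario, assuming a 16-bit short and a
32-bit int:)

/* originally: the cast merely duplicates the implicit conversion */
short x1;
long  y = 1000L;
x1 = (short)y;  /* fine: 1000 fits in a short */

/* after the variable is widened to int because its range proved
   insufficient (renamed x2 here only to keep the sketch compilable): */
int x2;
y = 100000L;
x2 = (short)y;  /* the stale cast still truncates to 16 bits first */
x2 = y;         /* without the cast, the assignment converts correctly */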
 

David Schwartz

You misunderstand.  The cast does not in any way change the way I
review the code.  I will check the type of every variable and make
sure it is used safely.  It would not be a code review otherwise.

Then if there is no error, it does nothing. If there is an error, it
allows you to tell whether it was due to the programmer not realizing
there was a conversion or not realizing that the conversion was unsafe.
Otherwise, you'd have no idea, though it doesn't matter very much.

I would still argue that it makes the code, at least in some cases,
more maintainable. Not everyone who might modify the code is as
thorough as your code review, and they might miss the conversion.

DS
 

David Schwartz

What if 'x' was originally declared as short and later, perhaps
because the range was found insufficient, changed to int?  Without the
cast, the code will still be correct.  With the cast, it will now be
incorrect.

Either way it only works by accident. If the type of 'x' is changed,
all code that accesses 'x' has to be reviewed to make sure that change
in type doesn't break that code. If it happens not to, that's just an
accident.

But I agree that people could disagree and see this as a downside of
explicit casts. I'd only point out that if you are changing the type
of a variable and are reviewing all the code that accesses it,
explicit casts are much harder to miss than implicit conversions. And
if you don't miss either, then there's no difference.

DS
 

Guest

I've seen people insert casts just to shut up a compiler. It was very
clear, upon questioning them, that they had little or no idea what the
cast did, they were just following a suggestion contained in the
compiler's messages. If the conversion had consequences that they did
not understand would occur (such as wrapping on overflow or truncation
of the fractional portion of a floating point value), does such a
conversion count as intentional?

I've seen this perpetrated:

/* original code */
char *c = 's';
printf ("%s\n", c);

/* 'fixed' code */
char *c = (char*)'s';
printf ("%s\n", c);
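
The warning was pointing at a genuine type error; the cast merely
silenced it while leaving the undefined behavior in place. A guess at
the intended fix (either variant addresses the real problem):

/* actually fixed code */
char *c = "s";   /* a string literal, if a string was meant */
printf ("%s\n", c);

/* or, if a single character was meant */
char ch = 's';
printf ("%c\n", ch);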
 

James Kuyper

David said:
Right, but this sticks out like a sore thumb.

Not when there are several hundred similar lines in the same program. A
policy of making all implicit conversions explicit will produce
precisely that situation.

This is particularly hard to notice when the declarations of x and y are
far away, possibly even in a header file.
 

James Kuyper

David said:
And it does. If a programmer codes a cast, he intends a conversion. If
a programmer requests a conversion, he is vouching for its safety.
This is just as true as saying that a programmer who writes "i++;" is
vouching that it is safe to add one to 'i'. This is axiomatic stuff.
There is no possible counter-argument.

So, a person who writes a cast to shut up a warning, without
understanding that the resulting conversion will have consequences he
wasn't aware of, can be said to have intended that those consequences occur?

A person who would vehemently deny possession of sufficient
understanding to decide whether the code is safe, who would tell you he
put the cast in only because the compiler's diagnostics suggested
putting it in, can still be said to have vouched for the safety of the code?

You use the verbs "intend" and "vouch" in a manner quite inconsistent
with my understanding of their meaning. I can't quite see the usefulness
of defining those terms in any way consistent with the way that you use
them. Having defined them in that way, they no longer support the
conclusions you use them for.
Simply put, it reduces the possible cases. Without the cast, there is
one additional possibility -- that the programmer had no idea there
was a conversion and therefore never considered its safety. The cast
rules out this case.

Every one of the cases I was referring to involved a cast written by a
programmer who had no idea that there was a conversion (or at least,
that it was a conversion with non-trivial consequences), and who
therefore had not in fact considered whether or not it was safe.
They are not counter-examples. If a programmer writes '(unsigned) j'
he is vouching for the safety of the cast, just as when he writes any
code he is vouching for it. Just because he was mistaken in doing so
doesn't change the fact that he did so.

If someone signs a document he does not understand, and it contains
assertions that he would have disagreed with, had he understood them,
does this mean that he now agrees with those assertions? Writing his
signature on the document does not magically change his mind to match
the words of that document.

Similarly, inserting a cast in code does not magically change the
developer's intention to match what will actually happen. A convention of
inserting casts only when you are willing to vouch for their safety does
not magically bring people into conformance with that convention who
wrote their code without being aware of it, or without a desire to
conform to it. I do not defend the competence of these programmers; I
merely point out that they do exist, and code written by such people has
to be dealt with occasionally.
That's strange, because there really should not be that many
conversions.

In the typical data path for my programs, data is read from a file into
objects with typedefs associated with the HDF library like float32. It
is then passed to the SDP Toolkit library which requires that they have
the type PGSt_double (it is not essential that you know what those
libraries are; it is sufficient that you understand that they are third
party libraries that define platform-dependent typedefs for the types
used in the library's interfaces). For some purposes, the data is passed
to C standard library functions which require that it be of type
'double'. When all processing is complete, the data is written to the
output file with the type float32 again.

That's a minimum of three casts for every piece of data flowing through
the program. One of the first things I did when I became responsible for
these programs was to rearrange it so that the minimum was actually
achieved; the original code performed at least twice the minimum number
of conversions. It's not uncommon for half the lines that refer to a
given piece of data to involve converting it from one type to another.
That's not true in the science code, but actual science code constitutes
only about 10% of my programs; the remaining 90% is involved with
pushing data around from one place to another.
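
(A minimal sketch of that data path under the client's explicit-cast
rule; the typedefs imitate the platform-dependent ones the libraries
provide, and sqrt() stands in for an arbitrary piece of processing:)

#include <math.h>

typedef float  float32;      /* HDF-style platform-dependent typedef */
typedef double PGSt_double;  /* SDP Toolkit platform-dependent typedef */

float32 process(float32 raw)
{
    PGSt_double v = (PGSt_double)raw;  /* cast 1: into the Toolkit's type */
    double      s = sqrt((double)v);   /* cast 2: into the C library's type */
    return (float32)s;                 /* cast 3: back to the file's type */
}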
... I agree that prohibiting all implicit conversions does not
strike the right balance. Almost any "never" or
"always" rule will cause significantly sub-standard code in at least
some cases.

It's not enough to have a coding rule. You have to understand the
reason for the rule and recognize when following it fails to serve
that reason. In those cases, you should break the rule.

That would violate our contract with our client. It's not an option.
 

Rainer Weikusat

David Schwartz said:
Then the cast still does its job. It highlights to the code reviewer
that there was a cast there and the original programmer may or may not
have ensured that the conversion is safe.

It 'highlights' that someone wrote a cast which instructed the
compiler to do something it was required to do anyway. The definition
of 'simple assignment' is

In simple assignment (=), the value of the right operand is
converted to the type of the assignment expression and
replaces the value stored in the object designated by the left
operand.
[6.5.16.1|2]

Any uncertainties about that hint at a lack of knowledge somewhere,
whose 'proper organizational remedy' would be instruction and/or
training.
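
(Concretely -- a minimal sketch of the redundancy:)

double d;
int i = 3;

d = i;          /* converted to double by the assignment itself (6.5.16.1|2) */
d = (double)i;  /* the cast requests exactly the conversion the compiler
                   was already required to perform */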
 

David Schwartz

Not when there are several hundred similar lines in the same program. A
policy of making all implicit conversions explicit will produce
precisely that situation.

I agree. That's why I think such a policy would be silly.

In a 450,000 line project that follows the rule I'm advocating, there
are about 1,500 casts that are not strictly needed. IMO, the vast
majority of them make obvious things in the code that would otherwise
not be obvious on casual inspection.
This is particularly hard to notice when the declarations of x and y are
far away, possibly even in a header file.

All the more reason it's important to do something to make it more
noticeable.

DS
 

David Schwartz

So, a person who writes a cast to shut up a warning, without
understanding that the resulting conversion will have consequences he
wasn't aware of, can be said to have intended that those consequences occur?

Yes. You are permitted to assume a person intended the obvious
consequences of his code. If you want to argue that you can't do that,
then you can't make any inferences about intent at all.
A person who would vehemently deny possession of sufficient
understanding to decide whether the code is safe, who would tell you he
put the cast in only because the compiler's diagnostics suggested
putting it in, can still be said to have vouched for the safety of the code?

Yes. Erroneously so, but yes. If you code something explicitly, you
are vouching for its safety.
You use the verbs "intend" and "vouch" in a manner quite inconsistent
with my understanding of their meaning. I can't quite see the usefulness
of defining those terms in any way consistent with the way that you use
them. Having defined them in that way, they no longer support the
conclusions you use them for.

I don't follow you.

If a programmer writes "i++;" can you conclude that he intended to add
one to the value of 'i'? If not, then I would argue that you cannot
infer anything about intent at all.

If a programmer writes "i++;" can you conclude that he is vouching for
adding one to 'i' being correct and safe? If not, then you are arguing
that code can never vouch.

You can certainly make those arguments, but they're not how reasonable
people actually reason about code. They're simply not useful.

In any rational coding environment, programmers are assumed to intend
the obvious consequences of their code. Programmers are assumed to
vouch that the obvious things they've done are safe. They may be
wrong about the consequences being safe or what the consequences are
-- but if they don't intend the obvious consequences, they're simply
so broken they aren't doing what we call programming.
Every one of the cases I was referring to involved a cast written by a
programmer who had no idea that there was a conversion (or at least,
that it was a conversion with non-trivial consequences), and who
therefore had not in fact considered whether or not it was safe.

Right, but they're as absurd as the programmer who writes "i++;" and
doesn't mean to add one to 'i'. They simply don't *mean* anything.
They're bizarre outliers that convey nothing except that stupid people
can do stupid things.

The only thing you can generalize from them is that you can never
trust anything. That's useful as a mantra for code review, but it's no
help in comparing X to Y. It simply doesn't weigh one way or the other
in a discussion of which of two techniques is better.
If someone signs a document he does not understand, and it contains
assertions that he would have disagreed with, had he understood them,
does this mean that he now agrees with those assertions? Writing his
signature on the document does not magically change his mind to match
the words of that document.

It means he now vouches for them. That a person may sign something
without reading it may be an argument against signing at all, but it
certainly doesn't tell us how to write contracts better.

These bizarre "anyone might do anything" arguments do nothing
whatsoever to address the issue we are discussing, which is whether X
is better than Y.
Similarly, inserting a cast in code does not magically change the
developer's intention to match what will actually happen. A convention of
inserting casts only when you are willing to vouch for their safety does
not magically bring people into conformance with that convention who
wrote their code without being aware of it, or without a desire to
conform to it. I do not defend the competence of these programmers; I
merely point out that they do exist, and code written by such people has
to be dealt with occasionally.

Idiots will screw up anything. Again, this doesn't do anything for
this discussion.
In the typical data path for my programs, data is read from a file into
objects with typedefs associated with the HDF library like float32. It
is then passed to the SDP Toolkit library which requires that they have
the type PGSt_double (it is not essential that you know what those
libraries are; it is sufficient that you understand that they are third
party libraries that define platform-dependent typedefs for the types
used in the library's interfaces). For some purposes, the data is passed
to C standard library functions which require that it be of type
'double'. When all processing is complete, the data is written to the
output file with the type float32 again.

That's a minimum of three casts for every piece of data flowing through
the program. One of the first things I did when I became responsible for
these programs was to rearrange it so that the minimum was actually
achieved; the original code performed at least twice the minimum number
of conversions. It's not uncommon for half the lines that refer to a
given piece of data to involve converting it from one type to another.
That's not true in the science code, but actual science code constitutes
only about 10% of my programs; the remaining 90% is involved with
pushing data around from one place to another.

That's an unfortunate situation to be in, but certainly an exceptional
one. In that case, explicit casts are probably quite awful. As I've
said though, I never argued that a cast should be placed any time
there's a conversion. Specifically, I never said they should be placed
on conversions that are always safe, do not generate a warning, and do
not require review because the ranges are the same.
That would violate our contract with our client. It's not an option.

Well, that's unfortunate for you. You're stuck with a sub-optimal
rule. But that really doesn't relate to what I'm arguing about here.

DS
 

Rainer Weikusat

David Schwartz said:
I agree. That's why I think such a policy would be silly.

In a 450,000 line project that follows the rule I'm advocating, there
are about 1,500 casts that are not strictly needed. IMO, the vast
majority of them make obvious things in the code that would otherwise
not be obvious on casual inspection.

Since 'conversion to the type of the assignment expression' is part of
the definition of 'simple assignment' (which includes 'function calls',
cf

If the expression that denotes the called function has a type
that does include a prototype, the arguments are implicitly
converted, as if by assignment, to the types of the
corresponding parameters,
[6.5.2.2|7])

there is no way this conversion could not be 'obvious' without
decoration, except if the person reading the code hasn't bothered
to learn the fundamentals of the language the code is written
in. While this is not entirely unlikely[*], the solution to "but I
don't know how" is "learn it". People writing (or 'auditing') code can
directly (and indirectly) cause much more harm than people confusing
signs or irregular verbs in university or school exams, and this
cavalier attitude of "hell, you don't really expect me to learn how to
do what I am trying to accomplish here" (after all, I have passed all
my math exams and that should really be enough for life) is completely
out of place here.

[*] In Germany, you are not allowed to change a tyre for a fee except
after having spent three years learning about 'repairing cars', final
test in both theory and practice included. Why the hell should someone
with a degree in physics and an ego the size of a moderate asteroid
be allowed to mess around with industrial control software
without a comparable qualification?
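
(To make the quoted rule concrete -- a minimal sketch of a prototyped
call:)

void f(double x);   /* prototype in scope */

void g(void)
{
    f(3);           /* the int argument 3 is converted to double "as if
                       by assignment" -- no cast needed [6.5.2.2|7] */
}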
 

James Kuyper

David said:
All the more reason it's important to do something to make it more
noticeable.

The "this" that I was referring to was the defect introduced either by
writing the incorrect cast or by changing the types of either x or y so
that the cast becomes incorrect. What is the "it" that you're suggesting
should be made more noticeable? What mechanism have you suggested to
make that defect more noticeable? It's less noticeable without the cast
only in the sense that it is non-existent without the cast, which
doesn't sound like a problem to me.
 

James Kuyper

David said:
Yes. You are permitted to assume a person intended the obvious
consequences of his code. If you want to argue that you can't do that,
then you can't make any inferences about intent at all.

What's obvious depends upon the skill level of the author, and any
conclusion you reach should weigh the possibility of intent against
the possibility of a low skill level. In my experience, such code is
perpetrated mainly by people with a skill level low enough that they
did not find the consequences obvious; I don't think I've ever seen such
code written by people for the reason you've given. So you'll have to
forgive me if I favor the "low skill level" explanation when I see such
code.
I don't follow you.

Funny - that's a good summary of what I said.

....
If a programmer writes "i++;" can you conclude that he intended to add
one to the value of 'i'? If not, then I would argue that you cannot
infer anything about intent at all.

Again, that depends upon the skill level. I've seen people write code
who had a sufficiently low skill level that they wrote i++, expecting it
to behave as if it were written as "i+1" or "++i". Therefore, yes, I
would not automatically assume that i++ was intended just because it was
written.
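
(For reference, the three behaviors such a programmer might conflate --
a minimal sketch:)

int i, j;

i = 5; j = i + 1;  /* j == 6, i unchanged: i == 5 */
i = 5; j = ++i;    /* i incremented first: i == 6, j == 6 */
i = 5; j = i++;    /* j gets the old value 5, then i becomes 6 */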
If a programmer writes "i++;" can you conclude that he is vouching for
adding one to 'i' being correct and safe? If not, then you are arguing
that code can never vouch.

Correct. People can vouch. Code can contain comments that vouch. Plain
code only "does"; it never "vouches". Any code, no matter how obvious
its intent may seem, might have been written with a different intent in
mind - only examination of an explicit statement of intent (such as
design documentation) can justify reaching any conclusions about intent.

....
Right, but they're as absurd as the programmer who writes "i++;" and
doesn't mean to add one to 'i'. They simply don't *mean* anything.
They're bizarre outliers that convey nothing except that stupid people
can do stupid things.

Including writing code that I am responsible for the correct behavior of.
I inherited (literally - the author died) responsibility for a fairly
large and undocumented body of code with bizarre features only a little
less silly than the ones being discussed here. It works, for the most
part, though in many cases the fact that it works is quite literally
accidental. I'm responsible for keeping it working. It's a lovely job. :-}

....
Idiots will screw up anything. Again, this doesn't do anything for
this discussion.

The screw-ups of idiots need to be dealt with. And there's no guarantee
that the next piece of code you look at isn't an example of that.

....
Well, that's unfortunate for you. You're stuck with a sub-optimal
rule. But that really doesn't relate to what I'm arguing about here.

The relevance is that you're promoting a similarly sub-optimal rule; and
criticizing those who choose not to follow it.
 
