Strange - a simple assignment statement shows an error in VC++ but works in gcc!

K

Keith Thompson

Ezekiel said:
Don't cry... sometimes the truth hurts. Nobody believed your nonsense
about sub-millisecond cluster time synchronization and nobody believes your
baloney about compilers.

And nobody in comp.lang.c is interested in your argument. Please
consider trimming the newsgroups line.
 
T

The Lost Packet

chrisv said:
KF the trolling POS.

done. I'm sick of the fucking idiot.

--
Disclaimer: No fluffy warm creatures were maimed, dismembered, tortured,
deplumed, discarded, deflowered, dropped, twisted, wrungOut, extended,
respliced, broken, humiliated, irradiated, browbeaten, pickled, deluded,
duped, detained, mishandled, desiccated, bronzed, belittled, coddled,
expelled, deported, imbibed, elected, marginalized, placated,
misrepresented, overworked, underpaid, underappreciated, prepackaged,
overly petted, genetically altered or cloned during the making of this
product, except of course for Bunny and Bear

- IE4 Easter Egg
 
D

Doctor Smith

done. I'm sick of the fucking idiot.

Translation: I got bitch slapped and am now going to run and hide.

Are you sure you are not Liarmutt?

That's usually his MO.
 
K

Kenny McCormack

And nobody in comp.lang.c is interested in your argument. Please
consider trimming the newsgroups line.

I'm interested. Counter-example shown. Your argument: in the toilet.
 
D

David Schwartz

it's down to your compilers - VC++ is a proprietary compiler for C++
from Microsoft, which apart from its own obfuscations is far more strict
than vanilla C compilers such as gcc, so unless your code is /perfect/ it
/will/ throw errors.

Good. I want imperfect code to be noticeable. If my code is not
perfect, I need to find it so I can fix it. A compiler that tolerates
it and hopes for the best -- that's a problem.

What makes code imperfect is that it is ambiguous. No compiler can be
sure to do the right thing with ambiguous code. The best it can do is
tell me at compile time. The second best it can do is reliably fail at
run time. The worst thing it can do is hope it got it right and hide
the problems from me.
gcc is more forgiving of untidy code, hence you
will get generally more functional results using it without having to
drastically rewrite your source.

That's great for people who want "generally more functional results".
But I want *reliable* results.

DS
 
K

Keith Thompson

David Schwartz said:
On Feb 19, 8:51 pm, The Lost Packet <[email protected]>
wrote: [...]
gcc is more forgiving of untidy code, hence you
will get generally more functional results using it without having to
drastically rewrite your source.

That's great for people who want "generally more functional results".
But I want *reliable* results.

Is the warning insufficient for that purpose?

gcc does have a number of options that cause it to produce more
diagnostics, and one that turns warnings into errors. (The latter can
be a problem if gcc warns about something that you don't consider to
be serious.)
 
C

Chris Ahlstrom

After takin' a swig o' grog, Keith Thompson belched out
this bit o' wisdom:
David Schwartz said:
On Feb 19, 8:51 pm, The Lost Packet <[email protected]>
wrote: [...]
gcc is more forgiving of untidy code, hence you
will get generally more functional results using it without having to
drastically rewrite your source.

That's great for people who want "generally more functional results".
But I want *reliable* results.

Is the warning insufficient for that purpose?

gcc does have a number of options that cause it to produce more
diagnostics, and one that turns warnings into errors. (The latter can
be a problem if gcc warns about something that you don't consider to
be serious.)

Even if you turn warnings on full bore on both compilers, each one still
catches things the other misses.

I like using gcc and VC both on the same code.
 
D

David Schwartz

Is the warning insufficient for that purpose?

The warning is fine. I can turn warnings into errors if I want. And if
I ignore a warning, that's my fault.
gcc does have a number of options that cause it to produce more
diagnostics, and one that turns warnings into errors.  (The latter can
be a problem if gcc warns about something that you don't consider to
be serious.)

Yep. I think 'gcc' does a really good job at this.

One thing I've noticed is that the things 'gcc' misses, 'vc++' usually
picks up, and vice versa. If you compile/test code with 'gcc', Intel's
C/C++ compiler, and VC++, you catch a lot more problems and potential
problems than any one of them alone.

DS
 
R

Rainer Weikusat

Keith Thompson said:
David Schwartz said:
On Feb 19, 8:51 pm, The Lost Packet <[email protected]>
wrote: [...]
gcc is more forgiving of untidy code, hence you
will get generally more functional results using it without having to
drastically rewrite your source.

That's great for people who want "generally more functional results".
But I want *reliable* results.

Is the warning insufficient for that purpose?

gcc does have a number of options that cause it to produce more
diagnostics, and one that turns warnings into errors. (The latter can
be a problem if gcc warns about something that you don't consider to
be serious.)

gcc warns about perfectly valid C, eg comparing the values of signed
and unsigned integers or use of && and || for flow-control. Some of
the warnings come close to being ludicrous, eg 'variable may be used
uninitialized' (why not just abort the compilation with 'program could
have errors' every now and then, based on a PRNG?). Usually, 'the
variable' will never be uninitialized and this text actually means
'this code was too complex for the compiler data flow analysis'. OTOH,
it generates a lot of useful warnings, too. But unconditionally
turning them into errors is a bad idea: They are warnings for a
reason.
 
F

Flash Gordon

Rainer said:
Keith Thompson said:
David Schwartz said:
On Feb 19, 8:51 pm, The Lost Packet <[email protected]>
wrote: [...]
gcc is more forgiving of untidy code, hence you
will get generally more functional results using it without having to
drastically rewrite your source.
That's great for people who want "generally more functional results".
But I want *reliable* results.
Is the warning insufficient for that purpose?

gcc does have a number of options that cause it to produce more
diagnostics, and one that turns warnings into errors. (The latter can
be a problem if gcc warns about something that you don't consider to
be serious.)

gcc warns about perfectly valid C, eg comparing the values of signed
and unsigned integers or use of && and || for flow-control.

If you don't like some of the warnings a given compiler produces, read
the manuals and find out how to disable those warnings. There is no
guarantee you can disable them, of course, but with gcc you generally can.
Some of
the warnings come close to being ludicrous, eg 'variable may be used
uninitialized' (why not just abort the compilation with 'program could
have errors' every now and then, based on a PRNG?). Usually, 'the
variable' will never be uninitialized and this text actually means
'this code was too complex for the compiler data flow analysis'. OTOH,
it generates a lot of useful warnings, too.

Whether the warning is spurious or not will very much
depend on the type of code you are writing and how you write it.
Personally I've seen it be correct (or the code complex enough for a
human to have problems proving that the variable is guaranteed to be
initialised) more often than I've seen it be wrong.
But unconditionally
turning them into errors is a bad idea: They are warnings for a
reason.

This all depends. Personally I don't do it, but I can understand why
some people do.
 
E

Ezekiel

Flash Gordon said:
Rainer said:
Keith Thompson said:
On Feb 19, 8:51 pm, The Lost Packet <[email protected]>
wrote:
[...]
gcc is more forgiving of untidy code, hence you
will get generally more functional results using it without having to
drastically rewrite your source.
That's great for people who want "generally more functional results".
But I want *reliable* results.
Is the warning insufficient for that purpose?

gcc does have a number of options that cause it to produce more
diagnostics, and one that turns warnings into errors. (The latter can
be a problem if gcc warns about something that you don't consider to
be serious.)

gcc warns about perfectly valid C, eg comparing the values of signed
and unsigned integers or use of && and || for flow-control.

If you don't like some of the warnings a given compiler produces, read the
manuals and find out how to disable those warnings. There is no guarantee
you can disable them, of course, but with gcc you generally can.
Some of
the warnings come close to being ludicrous, eg 'variable may be used
uninitialized' (why not just abort the compilation with 'program could
have errors' every now and then, based on a PRNG?). Usually, 'the
variable' will never be uninitialized and this text actually means
'this code was too complex for the compiler data flow analysis'. OTOH,
it generates a lot of useful warnings, too.

Whether the warning is spurious or not will very much depend
on the type of code you are writing and how you write it. Personally I've
seen it be correct (or the code complex enough for a human to have
problems proving that the variable is guaranteed to be initialised) more
often than I've seen it be wrong.

The key word here is 'guaranteed' to be initialized. The compiler doesn't
have the same insight as the person writing the code. The programmer may
know that some function or conditional will always get called to initialize
the variable but the compiler cannot know this.

void func(int i_mode)
{
    int somevar;

    if (i_mode == 1)
        do_something(&somevar);
    else if (i_mode == 2)
        do_otherthing(&somevar);

    process_result(somevar);
}

Yeah.. this is crap code to illustrate a point, but there are numerous
more elegant variations of it. The point is that the programmer may know
with a high degree of certainty that 'somevar' will get initialized prior
to process_result(), but the compiler can't possibly assume the same.



This all depends. Personally I don't do it, but I can understand why some
people do.

From my experience we do this for cross platform compatibility reasons. We
support multiple platforms and there are situations where some code
construct gives warnings on some platforms but will result in an error on
another platform. We'll set the compiler to turn the warning into an error
on all platforms to prevent someone from accidentally checking in code that
won't build everywhere.


 
J

James Kuyper

Rainer Weikusat wrote:
....
gcc warns about perfectly valid C, eg comparing the values of signed
and unsigned integers or use of && and || for flow-control. Some of
the warnings come close to being ludicrous, eg 'variable may be used
uninitialized' (why not just abort the compilation with 'program could
have errors' every now and then, based on a PRNG?). Usually, 'the
variable' will never be uninitialized and this text actually means
'this code was too complex for the compiler data flow analysis'.

In my experience, that warning message usually means that someone forgot
to do a needed initialization. However, if the fact that the
initialization is unneeded is sufficiently unclear that the compiler
can't be sure, it's probably going to be unclear to humans, too.
 
R

Rainer Weikusat

Flash Gordon said:
Rainer said:
Keith Thompson said:
On Feb 19, 8:51 pm, The Lost Packet <[email protected]>
wrote:
[...]
gcc is more forgiving of untidy code, hence you
will get generally more functional results using it without having to
drastically rewrite your source.
That's great for people who want "generally more functional results".
But I want *reliable* results.
Is the warning insufficient for that purpose?

gcc does have a number of options that cause it to produce more
diagnostics, and one that turns warnings into errors. (The latter can
be a problem if gcc warns about something that you don't consider to
be serious.)
gcc warns about perfectly valid C, eg comparing the values of signed
and unsigned integers or use of && and || for flow-control.

If you don't like some of the warnings a given compiler produces, read
the manuals and find out how to disable those warnings. There is no
guarantee you can disable them, of course, but with gcc you generally
can.

I've already encountered people who simply use gcc -Wall -W -Werror
and then go shopping for all kinds of weird hacks to smuggle correct
code past the compiler (and had to modify build systems for the same
reason). The point was supposed to be that the 'default selections'
for gcc warnings are not really suitable for treating warnings as
errors, because generally[*], they are intended to complain about
something which is OK rather than to let something slip which might
not be (which I consider to be a sensible policy in this respect).

[*] The exception is that -Wpointer-arith needs to be enabled
if one wants to get a warning when the compiler encounters
arithmetic operations on void *s (which is usually a
'higher-order typo' in code I write).
Whether the warning is spurious or not will very much
depend on the type of code you are writing and how you write
it. Personally I've seen it be correct (or the code complex enough for
a human to have problems proving that the variable is guaranteed to be
initialised) more often than I've seen it be wrong.

In my experience, it has been almost always wrong, the usual reason
being that no sensible default value for some variable exists and
because of this, a value is assigned to it based on 'some condition',
eg, in a switch-case, and it is then later-on used conditionally, with
some code common to all cases in between. And blindly writing a
nonsensical value into the variable in order to get rid of the
warning, as some people occasionally advocate, is something I consider
to be a decidedly bad idea: At best, this accomplishes nothing and at
worst, it causes exactly the same problem that an uninitialized
variable would have caused, too. Code which doesn't work because of
logic errors isn't really an improvement over code which doesn't work
because of using unpredictable values, even despite the former
being conformant to the letter of the C-standard.
 
B

Boon

Chris said:
Even if you turn warnings on full bore on both compilers, each one still
catches things the other misses.

Do you have an example of MSVC catching something GCC misses?

Regards.
 
D

David Schwartz

gcc warns about perfectly valid C, eg comparing the values of signed
and unsigned integers or use of && and || for flow-control. Some of
the warnings come close to being ludicrous, eg 'variable may be used
uninitialized' (why not just abort the compilation with 'program could
have errors' every now and then, based on a PRNG?). Usually, 'the
variable' will never be uninitialized and this text actually means
'this code was too complex for the compiler data flow analysis'. OTOH,
it generates a lot of useful warnings, too. But unconditionally
turning them into errors is a bad idea: They are warnings for a
reason.

Until artificial intelligence reaches the point where compilers can
understand what our code is supposed to do, you will have three
choices:

1) Make the compiler ignore things that might be errors. This will
help to prevent a serious problem from getting buried in a lot of
noise. But it can also hide serious defects.

2) Make the compiler warn you if it sees something that might be
suspicious. This will generate a lot of noise, and you'll get the same
warnings every time you compile. After a while, you'll know which
warnings to ignore, but run the risk of accidentally ignoring a
warning that looked familiar but was actually serious.

3) Design your code so that the compiler will understand it. When you
do things, even perfectly legal things, that the compiler thinks might
be mistakes, indicate to the compiler that you meant to do this. (For
example, put an explicit cast in rather than requiring an implicit
one.)

IMO, 3 is the best choice. It allows you to take new warnings
seriously and often makes your code more straightforward and therefore
easier to understand and maintain.

DS
 
R

Rainer Weikusat

David Schwartz said:
Until artificial intelligence reaches the point where compilers can
understand what our code is supposed to do, you will have three
choices:


[...]

I apologize for completely ignoring this, but I don't want to start yet
another flamewar between 'artist' and 'technician' programmers. IMO,
what has no inherent meaning does not belong in source code. YMMV.
 
U

Ulrich Eckhardt

Ezekiel said:
The compiler doesn't have the same insight as the person writing
the code. The programmer may know that some function or conditional
will always get called to initialize the variable but the compiler
cannot know this.

void func(int i_mode)
{
    int somevar;

    if (i_mode == 1)
        do_something(&somevar);
    else if (i_mode == 2)
        do_otherthing(&somevar);

    process_result(somevar);
}

Yeah.. this is crap code to illustrate a point, but there are numerous
more elegant variations of it. The point is that the programmer may know
with a high degree of certainty that 'somevar' will get initialized prior
to process_result(), but the compiler can't possibly assume the same.

I'm wondering what those more elegant variations of this are. The code above
will yield a warning from a good compiler and that is a good thing. The fix
here is not to initialise 'somevar' with a default value but to actually
document clearly to both the next programmer and the compiler that it can't
be uninitialised:

void func(int i_mode)
{
    assert((i_mode == 1) || (i_mode == 2));

    int somevar;
    if (i_mode == 1)
        do_something(&somevar);
    else
        do_otherthing(&somevar);

    process_result(somevar);
}

Alternatively, add an assert(false) in an additional else-branch. Both
measures will remove the warning and improve code quality, and neither of
them changes the behaviour. That's IMHO a win-win situation.
We support multiple platforms and there are situations where some code
construct gives warnings on some platforms but will result in an error on
another platform. We'll set the compiler to turn the warning into an error
on all platforms to prevent someone from accidentally checking in code that
won't build everywhere.

I prefer treating warnings as guidelines, so that a developer can make an
educated decision about what to do with them. For the mere "it must compile
on other platforms" requirement, we have a nightly build that will tell
everyone in the morning which modules failed to compile. Fixing those
errors then is typically a breeze.

cheers

Uli
 
K

Keith Thompson

David Schwartz said:
Until artificial intelligence reaches the point where compilers can
understand what our code is supposed to do, you will have three
choices:

1) Make the compiler ignore things that might be errors. This will
help to prevent a serious problem from getting buried in a lot of
noise. But it can also hide serious defects.

2) Make the compiler warn you if it sees something that might be
suspicious. This will generate a lot of noise, and you'll get the same
warnings every time you compile. After a while, you'll know which
warnings to ignore, but run the risk of accidentally ignoring a
warning that looked familiar but was actually serious.

3) Design your code so that the compiler will understand it. When you
do things, even perfectly legal things, that the compiler thinks might
be mistakes, indicate to the compiler that you meant to do this. (For
example, put an explicit cast in rather than requiring an implicit
one.)

IMO, 3 is the best choice. It allows you to take new warnings
seriously and often makes your code more straightforward and therefore
easier to understand and maintain.

You were doing great until point 3. Adding a cast to silence a
warning is almost always a bad idea. (Incidentally, there's no such
thing as an "implicit cast". A cast is an operator that specifies a
conversion; you can have an implicit conversion.)

Do you have an example where adding a cast to perfectly legal code
silences a warning?
 
E

Ezekiel

Keith Thompson said:
You were doing great until point 3. Adding a cast to silence a
warning is almost always a bad idea. (Incidentally, there's no such
thing as an "implicit cast". A cast is an operator that specifies a
conversion; you can have an implicit conversion.)

Do you have an example where adding a cast to perfectly legal code
silences a warning?


sun:~>g++ --version
g++ (GCC) 3.4.1
Copyright (C) 2004 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

sun:~>g++ foo.cpp
foo.cpp: In function `int main(int, char**)':
foo.cpp:6: error: invalid conversion from `void*' to `char*'
sun:~>

6: char* buff = malloc(512);
 
J

James Kuyper

Ezekiel wrote:
....
The key word here is 'guaranteed' to be initialized. The compiler doesn't
have the same insight as the person writing the code. The programmer may
know that some function or conditional will always get called to initialize
the variable but the compiler cannot know this.

void func(int i_mode)
{
    int somevar;

    if (i_mode == 1)
        do_something(&somevar);
    else if (i_mode == 2)
        do_otherthing(&somevar);

    process_result(somevar);
}

Yeah.. this is crap code to illustrate a point, but there are numerous
more elegant variations of it. The point is that the programmer may know
with a high degree of certainty that 'somevar' will get initialized prior
to process_result(), but the compiler can't possibly assume the same.

You will need a better example to make your point. Either the second if
is redundant, or initialization of somevar is not guaranteed. If you
remove the second if(), a reasonably sophisticated compiler will
recognize, at least, that somevar has been initialized.
 
