#define with semicolon


Snit

cc stated in post
(e-mail address removed) on 7/15/11
4:12 AM:
It seems as though some people have taken issue with my
characterization of the situation. First off, it wasn't a very good
friend, but actually someone I don't even know. It was a Usenet thing.
That was supposed to be a joke for others reading, but one person was
very upset and called me a liar.

Who was very upset? I merely noted how you were not honest in your claims,
just as you were not honest in your claims about me. But I do not read all
posts... maybe your lying about this, esp. when tied to your other lying,
really did get someone "very upset". Clearly you are addicted to lying.
Heck, just in the last 24 hours you have claimed, of me:

cc:
------
Just so everyone else knows, Snit has admitted he can't tell
if what he's saying is his opinion or a fact, and he also
does not know the definition of words he uses.
-----

Which I never did. You went so far as to lie and attribute "quotes" to me
which I never said. Here are just some examples:

cc:
-----
"I don't know the difference between opinion and fact." - Snit
-----
"I'm a mind reader." - Snit
-----

Of course, as noted, I never said such things. And your fabricating quotes
extends to people outside of Usenet: for example, when you denied there were
any general principles in UI design (of course there are!) and could find no
support for your denial, nor counter the evidence of your error, you created
this quote:

cc:
-----
"There are no general principles in HCI." - John M. Carroll
-----

But, of course, you never were able to provide a source for this quote. You
just made it up - and then insisted he emailed you and denounced his public
claims.

And now you are pulling your trolling and your lies into another forum.
Why? Why not keep your lies about me and about others in COLA... or just
stop posting lies at all?
So no, it wasn't "a very good
friend." Also, they seem to have an issue with the way I presented the
situation. Here is the full post:

"'The semi-colon will be expanded as part of the macro, causing the
printf to fail to compile.'

Correct - but in reality what I actually do is exploit that to make it
intentionally fail!

e.g. I could easily write


if(SMALL) { do something } else { do something else }


That is bad programming - for the most part, I know I would never
write if(SMALL) ... because if I set SMALL to 2,3,4, then everything
is OK when configuring the software, but if accidentally set SMALL to
0 the execution of the if() statement will change and that would have
been an unintentional side effect.

If I accidentally wrote the code with if(SMALL) it will not fail, and the
mistake is especially hard to spot if it is buried in a complex
formula. And there is no warning of impending doom.

So by putting a semicolon in #define SMALL 1; I've made sure that on
compiling it is guaranteed to fail when used out of context."
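
A minimal sketch of the behavior described in the quoted post (the
surrounding program is made up for illustration; only the macro comes from
the post):

    #define SMALL 1;            /* the trailing semicolon is deliberate */

    int main(void)
    {
        int size = SMALL;       /* expands to: int size = 1;;  -- the extra ';'
                                   is an empty statement, so this still compiles */
        if (SMALL) {            /* expands to: if (1;) {  -- a syntax error, so  */
            return 0;           /* the compiler rejects this use of the macro    */
        }
        return size;
    }

The assignment compiles, but the if(SMALL) form is guaranteed to be rejected,
which is the "fail when used out of context" effect being claimed.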

So that's the whole quote (in which I see no difference from what I said
before), so if you feel differently about it being poor coding
practice I would like to hear why again. Also, I'm sorry I jokingly
called someone I don't know my very good friend. Thanks.

Are you also sorry about your lies about me? Just curious.
 

Keith Thompson

Roberto Waltman said:
I am tempted to do that often, because with some compilers this,

while (1) { ... }

generates a warning about the "expression being constant", while your
example is accepted silently.

Then just write

for (;;) { ... }

The EVER macro adds nothing but cuteness (which can be good in some
contexts, but isn't in this one). "for (;;)" is a sufficiently
familiar idiom that it doesn't need to be hidden behind a macro.
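
(For context, the EVER macro being referred to is presumably something along
the lines of the well-known joke definition below; its actual definition does
not appear in this part of the thread.)

    #define EVER ;;             /* presumed definition -- not shown here */

    void spin(void)             /* illustrative only */
    {
        for (EVER) {            /* expands to: for (;;) {  -- identical in
                                   behavior to writing for (;;) directly  */
            /* ... */
        }
    }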

And I suggest that those compilers could stand some improvement
in their error messages. Warning about constant conditions is
reasonable, but "while (1)" should be treated as a special case.

Do those same compilers warn about "do { ... } while (0)", which
is commonly used in macros?
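
(For readers who haven't seen it, the reason do { ... } while (0) shows up in
macros is to make a multi-statement macro expand to a single statement, so
that it composes correctly with if/else. A sketch, with made-up names:)

    #define SWAP_INT(a, b) \
        do { int swap_tmp_ = (a); (a) = (b); (b) = swap_tmp_; } while (0)

    void order(int x, int y)    /* illustrative only */
    {
        if (x > y)
            SWAP_INT(x, y);     /* the do/while (0) wrapper makes the expansion
                                   one statement, so the trailing ';' and the
                                   'else' below both parse as intended */
        else
            y = x;
    }

A compiler that warned about every constant controlling expression would flag
a great deal of perfectly ordinary macro code of this kind, which is
presumably why "while (0)", like "while (1)", deserves special-case treatment.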
 

Keith Thompson

cc said:
It seems as though some people have taken issue with my
characterization of the situation. First off, it wasn't a very good
friend, but actually someone I don't even know. It was a Usenet thing.
That was supposed to be a joke for others reading, but one person was
very upset and called me a liar. So no, it wasn't "a very good
friend." Also, they seem to have an issue with the way I presented the
situation. Here is the full post:

"'The semi-colon will be expanded as part of the macro, causing the
printf to fail to compile.'

Correct - but in reality what I actually do is exploit that to make it
intentionally fail!

e.g. I could easily write


if(SMALL) { do something } else { do something else }
[snip]

I don't recall anyone *here* taking issue with your characterization,
much less calling you a liar. If someone did so elsewhere, I fail
to see why you should bring it here, or why anyone here should care.

But ok, by your own description you misstated your relationship with the
person who made the suggestion. You describe that misstatement as a
joke, but you made it in a context where nobody would have any reason to
think it wasn't serious. I certainly had no reason to doubt that the
person in question is "a very good friend" of yours (and very little
reason to care).

With regard to the technical issue, I'd say that

#define SMALL 1;

is almost certainly a very bad idea. There might be a valid reason
to reject the use of SMALL in some contexts, but that's not the way
to do it. As has been pointed out several times, code that refers to
SMALL assuming it's defined in a more conventional way could either
quietly work as intended, or quietly give incorrect results, or fail
with a syntax error (and a message that's likely to be misleading).
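
Concretely (the function and variable names below are made up; only the macro
is from the thread), the three outcomes look like this:

    #define SMALL 1;

    void f(void)
    {
        int n, m;

        n = SMALL;              /* expands to: n = 1;;       -- quietly works
                                   as intended                                 */
        m = SMALL + 10;         /* expands to: m = 1; + 10;  -- m is quietly 1,
                                   and '+ 10;' is left behind as a do-nothing
                                   statement                                   */
     /* if (SMALL) { ... }         would expand to: if (1;) { ... } -- a syntax
                                   error, with a message pointing at the stray
                                   ';' rather than at the macro                */
    }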

It would help to know exactly what contexts SMALL is *intended*
to be used in, and what contexts are to be discouraged.

I do not mean this to be a criticism of the person who made the
suggestion, since we don't know the context, and I don't particularly
trust your characterization of it.

Apparently the person in question posts as "Snit", and has recently
posted in this thread. If Snit would care to discuss the technical
issues, I'd be interested. Perhaps there are good reasons for the
macro that I haven't thought of -- or perhaps the macro definition
isn't as you've presented it. But if it's going to devolve into
an argument about who lied about whom in some other forum, perhaps
you could keep it in that other forum rather than spreading it here.
 

cc

It seems as though some people have taken issue with my
characterization of the situation. First off, it wasn't a very good
friend, but actually someone I don't even know. It was a Usenet thing.
That was supposed to be a joke for others reading, but one person was
very upset and called me a liar. So no, it wasn't "a very good
friend." Also, they seem to have an issue with the way I presented the
situation. Here is the full post:
"'The semi-colon will be expanded as part of the macro, causing the
printf to fail to compile.'
Correct - but in reality what I actually do is exploit that to make it
intentionally fail!
e.g. I could easily write
  if(SMALL) { do something } else { do something else }

[snip]

I don't recall anyone *here* taking issue with your characterization,
much less calling you a liar.  If someone did so elsewhere, I fail
to see why you should bring it here, or why anyone here should care.

But ok, by your own description you misstated your relationship with the
person who made the suggestion.  You describe that misstatement as a
joke, but you made it in a context where nobody would have any reason to
think it wasn't serious.  I certainly had no reason to doubt that the
person in question is "a very good friend" of yours (and very little
reason to care).

It was an inside joke, and irrelevant to the code anyway.
With regard to the technical issue, I'd say that

    #define SMALL 1;

is almost certainly a very bad idea.  There might be a valid reason
to reject the use of SMALL in some contexts, but that's not the way
to do it.  As has been pointed out several times, code that refers to
SMALL assuming it's defined in a more conventional way could either
quietly work as intended, or quietly give incorrect results, or fail
with a syntax error (and a message that's likely to be misleading).

It would help to know exactly what contexts SMALL is *intended*
to be used in, and what contexts are to be discouraged.

I gave all the info I had. I was looking more for the contexts in which
#define SMALL 1; would be a good idea.
I do not mean this to be a criticism of the person who made the
suggestion, since we don't know the context, and I don't particularly
trust your characterization of it.

Apparently the person in question posts as "Snit", and has recently

Nope, it was someone who posts as "7." Snit is delusional.
posted in this thread.  If Snit would care to discuss the technical
issues, I'd be interested.  Perhaps there are good reasons for the
macro that I haven't thought of -- or perhaps the macro definition
isn't as you've presented it.  But if it's going to devolve into

The macro definition is as I presented it, and I was looking for good
reasons to have the macro that way as well. I told this "7" it sounded
like bad practice, and I just wanted to verify that I wasn't missing
anything either.
an argument about who lied about whom in some other forum, perhaps
you could keep it in that other forum rather than spreading it here.

I thought I asked a legitimate C question (well, a style question at
least), then followed it up with the exact post I had a question about
and a little more context. Although I don't see why anyone should care
whether "7" is my friend or not. I don't know why Snit chose to go nuts
here, but it wasn't my intention. I apologize on behalf of Snit.
 

Snit

Keith Thompson stated in post (e-mail address removed) on 7/15/11 9:23
AM:
cc said:
It seems as though some people have taken issue with my
characterization of the situation. First off, it wasn't a very good
friend, but actually someone I don't even know. It was a Usenet thing.
That was supposed to be a joke for others reading, but one person was
very upset and called me a liar. So no, it wasn't "a very good
friend." Also, they seem to have an issue with the way I presented the
situation. Here is the full post:

"'The semi-colon will be expanded as part of the macro, causing the
printf to fail to compile.'

Correct - but in reality what I actually do is exploit that to make it
intentionally fail!

e.g. I could easily write


if(SMALL) { do something } else { do something else }
[snip]

I don't recall anyone *here* taking issue with your characterization,
much less calling you a liar. If someone did so elsewhere, I fail
to see why you should bring it here, or why anyone here should care.

He insisted people would want to know about his mischaracterization of his
relationship with 7. I told him you folks likely would not - and, at first,
he wanted *me* to tell you about his misrepresentation. I told him the
whole idea was silly.

This really comes from a fairly long set of debates where cc has been busted
making all sorts of claims about me and others. He fabricates quotes when
he knows he has lost a Usenet debate. Some examples:

-----
"I have taken HCI classes on Making GUIs Pretty." - Snit
-----
"I am a mind reader." - Snit
-----
"I admit I have never taken any HCI classes." - Snit
-----
"There are no general principles in HCI." - John M. Carroll
-----
"I don't know the difference between opinion and fact." - Snit
-----

But, whatever.
But ok, by your own description you misstated your relationship with the
person who made the suggestion. You describe that misstatement as a
joke, but you made it in a context where nobody would have any reason to
think it wasn't serious. I certainly had no reason to doubt that the
person in question is "a very good friend" of yours (and very little
reason to care).

Exactly. He lied to you - but about something that does not really matter.
It is weird... why tell the lie at all? But then, once he did, why make an
issue out of it?
With regard to the technical issue, I'd say that

#define SMALL 1;

is almost certainly a very bad idea. There might be a valid reason
to reject the use of SMALL in some contexts, but that's not the way
to do it. As has been pointed out several times, code that refers to
SMALL assuming it's defined in a more conventional way could either
quietly work as intended, or quietly give incorrect results, or fail
with a syntax error (and a message that's likely to be misleading).

It would help to know exactly what contexts SMALL is *intended*
to be used in, and what contexts are to be discouraged.

I do not mean this to be a criticism of the person who made the
suggestion, since we don't know the context, and I don't particularly
trust your characterization of it.

Makes sense.
Apparently the person in question posts as "Snit", and has recently
posted in this thread. If Snit would care to discuss the technical
issues, I'd be interested.

I am not the one who even posted the technical question... and am not a
programmer (do some scripting, but not a programmer).
Perhaps there are good reasons for the macro that I haven't thought of -- or
perhaps the macro definition isn't as you've presented it. But if it's going
to devolve into an argument about who lied about whom in some other forum,
perhaps you could keep it in that other forum rather than spreading it here.

Good thought.
 

Snit

cc stated in post
(e-mail address removed) on 7/15/11
10:37 AM:

....
It was an inside joke, and irrelevant to the code anyway.

Why would you tell an "inside joke" to an "outside" group? That makes no
sense. And, according to you, not only did I call you on it, but someone
became "very upset" with you about it (you have not said who).

....
Nope, it was someone who posts as "7." Snit is delusional.

Why bad-mouth me in multiple groups? I mean, really, can you actually point
to my doing anything wrong or dishonest? I can easily point to you just
flat out fabricating quotes and attributing them to me and others. Heck,
from *today*:

-----
"I have taken HCI classes on Making GUIs Pretty." - Snit
-----
"I am a mind reader." - Snit
-----
"I admit I have never taken any HCI classes." - Snit
-----
"There are no general principles in HCI." - John M. Carroll
-----
"I don't know the difference between opinion and fact." - Snit
-----
"Okay, I admit it. I'm dumb." - Snit
-----
"Hadron and I have been in a relationship for quite some
time now." - Snit
----

About me and about others. All lies. And then you call me "delusional" as
you run around lying about me, Carroll and 7 in a group where they are not
even relevant. Just bizarre.
Although I don't see why anyone should care if "7" is my friend or not.

Nor I... which is why I specifically told you I was not going to bring your
BS debates to this forum (even when you asked me to!) and encouraged you not
to.
I don't know why Snit chose to go nuts here, but it wasn't my intention. I
apologize on behalf of Snit.

You are the one who came to this group and started telling lies about me and
about others. Why?
 

Keith Thompson

Snit said:
cc stated in post
(e-mail address removed) on 7/15/11
10:37 AM:

...

Why would you tell an "inside joke" to an "outside" group? That makes no
sense. And, according to you, not only did I call you on it, but someone
became "very upset" with you about it (you have not said who).

Snit, perhaps you could discuss this with him somewhere else (or
at least refrain from doing so here). We knew nothing of your
involvement in this before you started posting here.

You may well have legitimate grievances with "cc", but unless
they're relevant to the C programming language, anyone here who's
interested in hearing about them can easily go to wherever this
dispute started and read about them there.
 

Snit

Keith Thompson stated in post (e-mail address removed) on 7/15/11 1:26
PM:
Snit, perhaps you could discuss this with him somewhere else (or
at least refrain from doing so here). We knew nothing of your
involvement in this before you started posting here.

I just do not want him spreading his lies about me here. He has a habit of
making up quotes about me and just out and out lying.

What is weird is that our two main debates were about nothing a reasonable
person would find offensive. One, I noted that in UI design there are
well-established guidelines / principles; and two, I noted that I think there
is "too much" abuse of power by the police and others.
You may well have legitimate grievances with "cc", but unless
they're relevant to the C programming language, anyone here who's
interested in hearing about them can easily go to wherever this
dispute started and read about them there.

Frankly I have no idea why cc even brought the silly debate here. I
specifically asked him *not* to.

From COLA:

cc:
-----
Beyond that, I suggest you let everyone in clc know I was
dishonest, and for what reasons.
-----
Snit:
-----
Huh?  Why would I talk about you to other people?  My
goodness, you think the world revolves around you.
-----
cc:
-----
Well if I was being lied to, I would like to know.
-----
Snit:
-----
Who said anyone was lying to you? And, frankly, why would
you care if you were lied to about some people you know
nothing about? But if you are feeling bad about your lying
then *you* should go confess. Why do you want me to do
your dirty work?
 

Snit

cc stated in post
(e-mail address removed) on 7/19/11
11:18 AM:
Interesting forgery. Seems kind of unnecessary though, no?

Based on the headers I think Carroll is trying to get in on the fun... but,
hey, given how you keep forging my name as you fabricate quotes and
attribute them to me, you should have no problem with him forging your name?

Right?

I mean, after all - you clearly have decided forgeries are OK in your book.
 

cc

cc stated in post

Based on the headers I think Carroll is trying to get in on the fun... but,
hey, given how you keep forging my name as you fabricate quotes and
attribute them to me, you should have no problem with him forging your name?

Right?

I mean, after all - you clearly have decided forgeries are OK in your book.

Ohhhh. I see what's going on here. Well I apologize to clc. Clearly
Snit has lost his little mind. I suggest killfiling anything from "cc"
as I imagine there will be more forgeries to follow from Snit. I asked
a C question (again, relatively speaking), and didn't intend on having
a moron follow and post off topic.
 

J. J. Farrell

cc said:
Snit said:
Keith Thompson stated in post (e-mail address removed) on
7/15/11 9:23 AM:
Is it acceptable practice to have a #define with a semicolon in it,
such as:

#define SMALL 1;

I didn't think it was, but a very good friend of mine claims it's
perfectly acceptable if you want to prevent the #define from being
used in an expression like if(SMALL).
It seems as though some people have taken issue with my
characterization of the situation. First off, it wasn't a very good
friend, but actually someone I don't even know. It was a Usenet
thing. That was supposed to be a joke for others reading, but one
person was very upset and called me a liar. So no, it wasn't "a
very good friend." Also, they seem to have an issue with the way I
presented the situation. Here is the full post:

"'The semi-colon will be expanded as part of the macro, causing the
printf to fail to compile.'

Correct - but in reality what I actually do is exploit that to make
it intentionally fail!

e.g. I could easily write


if(SMALL) { do something } else { do something else }
[snip]

I don't recall anyone *here* taking issue with your characterization,
much less calling you a liar. If someone did so elsewhere, I fail
to see why you should bring it here, or why anyone here should care.
He insisted people would want to know about his mischaracterization
of his relationship with 7. I told him you folks likely would not -
and, at first, he wanted *me* to tell you about his
misrepresentation. I told him the whole idea was silly.

This really comes from a fairly long set of debates where cc has been
busted making all sorts of claims about me and others. He fabricates
quotes when he knows he has lost a Usenet debate. Some examples:

Poor Snit. You make up things about people all the time and then cry when
people do it to you.
But, whatever.

Exactly. He lied to you - but about something that does not really
matter. It is weird... why tell the lie at all? But then, once he
did, why make an issue out of it?

Makes sense.

I am not the one who even posted the technical question... and am not
a programmer (do some scripting, but not a programmer).

Good thought.

Would you two ask your nannies to push your prams off somewhere else, please.
 

Snit

J. J. Farrell stated in post [email protected] on 7/19/11 4:53 PM:
cc said:
Snit said:
Keith Thompson stated in post (e-mail address removed) on
7/15/11 9:23 AM:

Is it acceptable practice to have a #define with a semicolon in it,
such as:

#define SMALL 1;

I didn't think it was, but a very good friend of mine claims it's
perfectly acceptable if you want to prevent the #define from being
used in an expression like if(SMALL).
It seems as though some people have taken issue with my
characterization of the situation. First off, it wasn't a very good
friend, but actually someone I don't even know. It was a Usenet
thing. That was supposed to be a joke for others reading, but one
person was very upset and called me a liar. So no, it wasn't "a
very good friend." Also, they seem to have an issue with the way I
presented the situation. Here is the full post:

"'The semi-colon will be expanded as part of the macro, causing the
printf to fail to compile.'

Correct - but in reality what I actually do is exploit that to make
it intentionally fail!

e.g. I could easily write


if(SMALL) { do something } else { do something else }
[snip]

I don't recall anyone *here* taking issue with your characterization,
much less calling you a liar. If someone did so elsewhere, I fail
to see why you should bring it here, or why anyone here should care.
He insisted people would want to know about his mischaracterization
of his relationship with 7. I told him you folks likely would not -
and, at first, he wanted *me* to tell you about his
misrepresentation. I told him the whole idea was silly.

This really comes from a fairly long set of debates where cc has been
busted making all sorts of claims about me and others. He fabricates
quotes when he knows he has lost a Usenet debate. Some examples:

Poor Snit. You make up things about people all the time and then cry when
people do it to you.
But, whatever.

But ok, by your own description you misstated your relationship with
the person who made the suggestion. You describe that misstatement
as a
joke, but you made it in a context where nobody would have any
reason to think it wasn't serious. I certainly had no reason to
doubt that the
person in question is "a very good friend" of yours (and very little
reason to care).
Exactly. He lied to you - but about something that does not really
matter. It is weird... why tell the lie at all? But then, once he
did, why make an issue out of it?

With regard to the technical issue, I'd say that

#define SMALL 1;

is almost certainly a very bad idea. There might be a valid reason
to reject the use of SMALL in some contexts, but that's not the way
to do it. As has been pointed out several times, code that refers to
SMALL assuming it's defined in a more conventional way could either
quietly work as intended, or quietly give incorrect results, or fail
with a syntax error (and a message that's likely to be misleading).

It would help to know exactly what contexts SMALL is *intended*
to be used in, and what contexts are to be discouraged.

I do not mean this to be a criticism of the person who made the
suggestion, since we don't know the context, and I don't particularly
trust your characterization of it.
Makes sense.

Apparently the person in question posts as "Snit", and has recently
posted in this thread. If Snit would care to discuss the technical
issues, I'd be interested.
I am not the one who even posted the technical question... and am not
a programmer (do some scripting, but not a programmer).

Perhaps there are good reasons for the macro that I haven't thought
of -- or perhaps the macro definition isn't as you've presented it.
But if it's going to devolve into an argument about who lied about
whom in some other forum, perhaps you could keep it in that other
forum rather than spreading it here.
Good thought.

Would you two ask your nannies to push your prams off somewhere else, please.

To be fair, this is not likely cc. Based on headers it is probably a guy
named Steve Carroll who follows me around wherever I go. He, like cc, loves
to forge people's identities.

In any case, sorry the COLA madness slipped over to CLC. I just did not
want cc to spread lies about me here that went unanswered. As if anyone in
CLC cares about me one whit. :)
 

Noob

Keith said:
Or, worse, if it's used like this:
voltage = LDO_MAX_VOLT + 1;

Setting the voltage to one VOLT more than the MAX?
Are you trying to blow the whole thing up?! ;-)
 

J. J. Farrell

Noob said:
Setting the voltage to one VOLT more than the MAX?
Are you trying to blow the whole thing up?! ;-)

The point is that this code would set the voltage to the max, NOT a volt
more. Perhaps this could be considered a good technique for safety
critical code ...
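
Spelling out what happens with the kernel definition quoted further down the
thread (the variable name is taken from Keith's example):

    #define LDO_MAX_VOLT 3300;      /* from tps65910.h -- note the trailing ';' */

    voltage = LDO_MAX_VOLT + 1;     /* expands to: voltage = 3300; + 1;
                                       voltage is silently set to 3300, and the
                                       '+ 1;' becomes a statement with no effect
                                       -- no error, just the wrong value        */

The code compiles cleanly, which is exactly what makes the trailing semicolon
dangerous rather than merely noisy.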
 

Phil Carmody

cc said:
That was his example. That was also his explanation of why he did it
(so the compiler would complain if he used it as an expression).

Another example was from the linux kernel.

/usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:
#define LDO_MAX_VOLT 3300;

Patch now in Jiri Kosina's "trivial" tree: commit 497888cf,
"treewide: fix potentially dangerous trailing ';' in #defined values/expressions".

Thanks for reporting it (and for others such as Keith for responding
to it, as I don't read googlegroups posts). Let's hope we don't see
any more of those for a while.

Phil
 

Michael Angelo Ravera

Is it acceptable practice to have a #define with a semicolon in it,
such as:

#define SMALL 1;

I didn't think it was, but a very good friend of mine claims it's
perfectly acceptable if you want to prevent the #define from being
used in an expression like if(SMALL).

Maybe someday I will come up with a good and friendly reason to
include a terminal semicolon in a define, but every time I have
included one to date it has been accidental and ALWAYS extremely
unfriendly.

You haven't SEEN obscure and difficult to diagnose errors such as you
will see if you do this accidentally or use the define of someone who
did so.
 

Todd Carnes

Thanks for reporting it (and for others such as Keith for responding to
it, as I don't read googlegroups posts). Let's hope we don't see any
more of those for a while.

Posts to comp.lang.c are not GoogleGroups posts.
 
