How to convert Infix notation to postfix notation

spinoza1111

trim your posts?

On Nov 4, 2:18 am, Seebs <[email protected]> wrote:
Oh, I'm all for elegance, regardless of who invented it, but -1 [for true]
isn't elegant.
I don't think you know what elegance is. For starters, elegance is
conceptual unity. [...] As it happened, it was best for me, in my limited
time, to use -1 as truthiness, simply to show that this conversion could
be grammar-driven.
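
For readers who want the thread's topic made concrete: a grammar-driven
infix-to-postfix converter can be tiny. The following recursive-descent
sketch is an illustration, not spinoza1111's code; it assumes single-digit
operands, the operators + - * /, parentheses, and no error handling:

#include <stdio.h>

/* Grammar:  expr   := term   { ('+'|'-') term   }
 *           term   := factor { ('*'|'/') factor }
 *           factor := digit | '(' expr ')'
 * Each rule prints its operator *after* its operands: postfix order. */

static const char *p;           /* cursor into the input string */

static void expr(void);

static void factor(void)
{
    if (*p == '(') {
        p++;                    /* consume '(' */
        expr();
        p++;                    /* consume ')' */
    } else {
        putchar(*p++);          /* operand goes straight to the output */
    }
}

static void term(void)
{
    factor();
    while (*p == '*' || *p == '/') {
        char op = *p++;
        factor();
        putchar(op);            /* operator emitted after both operands */
    }
}

static void expr(void)
{
    term();
    while (*p == '+' || *p == '-') {
        char op = *p++;
        term();
        putchar(op);
    }
}

int main(void)
{
    p = "1+2*(3-4)";
    expr();
    putchar('\n');              /* prints 1234-*+ */
    return 0;
}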

you could use a macro and defuse all these arguments

#define TRUE -1

I quite like the grammar-driven approach as an idea.
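
A minimal sketch of that suggestion (and note that C99's <stdbool.h>
settles the argument for you, providing bool, true and false):

#include <stdio.h>

#define TRUE  (-1)              /* the exact value is hidden behind the name */
#define FALSE 0

int main(void)
{
    int flag = TRUE;
    if (flag != FALSE)          /* compare with the macro, never with -1 or 1 */
        printf("flag is set\n");
    return 0;
}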


personally I thought this was one of C's mistakes. They should have
had a boolean type (or alias) from day 1, and null should have been a
language word. Oh, and function prototypes: they should have been in
from the beginning. Lint? Feugh!

I don't think there's any "should" about it. Various languages have made
various decisions. This is why the actual values should be hidden
behind macros (or even language words!). We don't get our knickers in
a twist about the representation of floating point values or character
sets (ok, we /do/; but we shouldn't).
Wow, I'll alert the media. 1==!0? New math indeed.

Well, == isn't very standard mathematical notation to start with. You
are aware that in C the ! operator yields 1 when applied to 0?

x    !x
0    1
1    0
nz   0

where nz is any non-zero value. Hence !!x must yield either 0 or 1.
But of course Peter is begging the question.
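
A throwaway test makes the table concrete (my sketch, nothing beyond the
standard library):

#include <stdio.h>

int main(void)
{
    int values[] = { 0, 1, -1, 42 };
    int i;

    for (i = 0; i < 4; i++)
        printf("x = %3d   !x = %d   !!x = %d\n",
               values[i], !values[i], !!values[i]);
    /* !x is 1 only when x is 0; !!x collapses every non-zero x to exactly 1 */
    return 0;
}
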
You wouldn't say this if you had, in the past, had to debug at machine-code
level, because 1 looks too much like 0 to be a good way to represent
truthiness.

you're a hardware engineer, right? In my experience they're the people
who can't tell a 0 from a 1.

There are many [C] functions and APIs which have standardized on using negative
values to indicate *errors*.  Because of this, readers who are experienced
with C will usually regard a "-1" as an error indicator.  Your theory that it
would make sense for it to be regarded as a particularly idiomatic "true"
is fascinating, but it doesn't match the real world.

although this is a bit of an angels-on-the-head-of-a-pin discussion,
Peter Seebach /is/ correct here. When I read Spinoza's code I was
expecting -1 to be a failure indication. Idioms and cultural norms are
important.
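
The idiom in question, using only the standard library: EOF is a negative
constant (-1 in practice), and a negative return conventionally signals
failure, which is exactly why -1 reads as "error" rather than "true" to
experienced C eyes:

#include <stdio.h>

int main(void)
{
    int c = getchar();
    if (c == EOF) {             /* EOF is negative: the error/end marker */
        fputs("end of input or read error\n", stderr);
        return 1;
    }
    printf("read '%c'\n", c);
    return 0;
}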


you can't have a structured walkthrough in a bunch of Usenet posts. Who's
the Co-ordinator?

I am, because nobody else here has the emotional maturity to be one
except me, and Ben Bacarisse, possibly.
You do actually. What alternative would you offer for "the people who
use a program"? If a user of my program says it is hard to use, I
listen (I don't necessarily fix it, but I do listen). A random
person's comments would carry less weight.

I have, apocryphally, heard the comment "I don't like being called a
'user'; it sounds like I take drugs".


why don't the end-users have a place? Who else can assess usability?

Users can't, because they are plural, and often the "user" is a
manager who has no idea what the people who work for him, and who use the
software in the direct sense, actually do. The most "usable" software is
designed by programmers, not "usability engineers" (Craig's List being an
example), because "usability engineering" is a complete fraud that
makes people into objects.
You invoke "the user" as a
deus ex machina: [...] But literally, that's the person
who, as you say, needs to be innocent of the details!

the user interface and what the system does aren't details
This is just wrong. For one thing, this "user" has to know, not
whether the code is "correct" (for in fact all code is "incorrect" in
the absolute sense that a computer is ultimately a finite state
automaton) but its limitations, and in this case and many others, this
"user" (I prefer "caller")

caller and user are not synonyms
would rather read the code.

users don't typically read code. Though the user of an infix to
postfix library is not a very typical user...

IT HAS NO USER in this silly sense, of a fat man with money whose lack
of taste and conventionality is simply a way of ensuring that
programmers' insights are not reflected in the final product. This may
be all for the better, since most programmers don't have any insights
worth preserving (the lack of insight in most posts here is an
example): but neither, in our society, do their managers.

What matters is whether it meets its contract. This is the
documentation which states what the routine proposes to do.
[...] I had enough familiarity [...] to be able to describe the culture of C,
including its case sensitivity and use of ASCII in place of EBCDIC.

the C standard does not specify the use of ASCII

However, ASCII dominates actual implementations.
 
Dik T. Winter

> However, the distinction between declaration and definition, like
> sequence points, is not a reputable construct. It's a patch which
> corrects the blunders and limitations of the past.

In what way was the forward declaration in Pascal (which serves the same
purpose) the correction of a blunder and limitation of the past? And as
far as I know, the first Pascal compiler could not even read programs
on paper tape, because the machine it was running on did not have a
paper-tape reader.
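
For readers following along, the distinction being defended, in miniature
(a sketch, not from the thread):

int twice(int x);               /* declaration: gives the compiler the type */

int main(void)
{
    return twice(21) - 42;      /* callable here because the type is known */
}

int twice(int x)                /* definition: supplies the code */
{
    return x * 2;
}
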
> You need to actually learn a radically different programming language.

Would also serve you well.
 
Seebs

I think what bothers you clowns is that working a few hours on my
commute, I can craft a solution that works (with commonsense clerical
changes on your system) in a language I don't like and which I haven't
used for almost twenty years, whereas you could not do this, not in a
million years, in an unfamiliar language. I'd dare ya to do the
problem in C Sharp: you can get C Sharp free. But you won't try
because you're incompetent.

No, I won't try because I have no interest at all in C Sharp. (And actually,
I don't think I can functionally get it free, because I don't have a Windows
machine on which to get it...) There are plenty of languages I don't know
in which I could presumably do this.

I love how you brag about how expert you were in C, how you taught it, and
so on, and then complain that it's an unfamiliar language, when the mistakes
you make would have been just as wrong twenty years ago.

-s
 
Ben Pfaff

spinoza1111 said:
This is a very strange coding style.

That is an unconstructive comment. 99.999% of code (esp C) doesn't
work and sucks stylistically. [...]

I don't see how that's an excuse for inventing a new style that
sucks :)
 
Seebs

He has a point actually. There's no real need these days for function
prototypes, for functions which are defined in the same file at least.

The language could probably be changed to special case functions defined
in the same translation unit, but it really would put the "marginal" back
in "marginal benefit".
How many development computers these days are only capable of running a
single-pass compiler?

Probably none. But rewriting existing compilers would be a fair bit of
hassle, and I'm not sure there's much benefit there. It's also trickier
to define the language semantics, because "has been seen or will be seen
later" is annoying.

Hmm.
The problem is having to maintain something defined in two different places:
the declaration, and the definition, which must match.

Agreed.

Which is why Spinny's solution is so damn funny to me -- he carefully
#defines the function declaration (sans ;), then uses it twice...

... and writes it manually in the comment, saving a grand total of nothing.
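
The pattern being laughed at, reconstructed (the names are invented; this
illustrates the idea, not the original code):

/* the declaration text lives in one place... */
#define FROB_DECL int frob(int x)

FROB_DECL;                      /* ...used once as the prototype... */

FROB_DECL                       /* ...and once, without the ';', as the definition */
{
    return x + 1;
}

int main(void)
{
    return frob(41) - 42;       /* 0 */
}
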
Anyway, aren't IDEs/programming editors now sophisticated enough to take
care of creating prototype declarations?

I'd guess that they are. I don't think I've even consciously noted
interactions with prototypes in quite some time. Hmm. I dunno; it might
be sorta neat to have the compiler be expected to look ahead, but...

I guess the problem is that, at that point, he'd just come back and yell
at us about how it's inconsistent to require prototypes some times and
not others. :)

-s
 
Seebs

- An extra pass is added to the translation, which 'gathers all prototypes'.
(Or some other way to remove the requirement that functions are declared
before use.)
Hmm.

- A new '#import' directive is recognized which scans the given (.c) file
for all function declarations.
That way, existing code should still work, but you can then also have code
where you remove all declarations that aren't definitions, and then simply
#import all the .c files from which you're calling functions.
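
To make the proposal concrete (hypothetical syntax: this #import is the
directive under discussion, not Objective-C's include-once variant, and
none of it is standard C):

/* util.c -- an ordinary translation unit, with no header written for it */
int add(int a, int b)
{
    return a + b;
}

/* main.c -- #import would scan util.c for declarations only */
#import "util.c"

int main(void)
{
    return add(2, 2) - 4;       /* add() callable with no hand-written prototype */
}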

#import presumably has to follow #include and other #import statements,
should it recurse? Would there be #import-guards?

There's some serious room for crazy here, thinking about it. What happens
if I #import another file, but don't #include it, and it has some
#defines. Do I pick up those #defines?

I think I actually prefer separated headers to that, at least given some
of the historical quirks (like macros).

-s
 
Seebs

The person who knows them is better; they can use them because
they have them at hand to use in their mental armory whereas the
person who can't remember them does not.

What I've found is that people who know them all, but couldn't derive
them, are less able to use them than people who don't know them, but can
derive them -- because the ability to derive them implies a much deeper
understanding of what they're for and how they work.

Of course, someone who could derive them but doesn't have to has an edge,
but it's not usually a very large edge.

-s
 
Keith Thompson

Nick Keighley said:
when would my version break?

Seebs saved me the trouble of coming up with an example.

The point, of course, is that following an "always parenthesize"
rule for macro definitions is a lot easier (both for the author
and for anyone reading the code) than figuring out the non-obvious
cases where you can get away without them.

My personal rules for this:

For any macro intended to be used in expression context, the
entire macro definition and each reference to an argument should
be parenthesized. The latter can be omitted if the expansion is
a single token. (This can be relaxed if the macro deliberately
subverts the syntax, something that is rarely a good idea.)

My favorite example (the following is valid, but *not* good code):

#include <stdio.h>

#define SIX 1+5
#define NINE 8+1

int main(void)
{
    printf("%d * %d = %d\n", SIX, NINE, SIX * NINE);
    return 0;
}
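
For contrast: unparenthesized, SIX * NINE expands to 1+5*8+1, which is 42;
parenthesized, the macros behave the way their names promise:

#include <stdio.h>

#define SIX  (1+5)
#define NINE (8+1)

int main(void)
{
    printf("%d * %d = %d\n", SIX, NINE, SIX * NINE);    /* 6 * 9 = 54 */
    return 0;
}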
 
Kenny McCormack

No, no, he had a single sentence in there.


I think he just needs to get an AOL account so it'll be more comfortable
and familiar.

Not motivated by status at all, are we?

Nope, nope, nope.

Hint, hint: Every time Heathfield/Seebs slams someone (which they do,
many times, in every single post), they are (obviously) being motivated
by status considerations. Heathfield, to his credit, doesn't bother to
deny it, but Seebs's repeated denials are getting tiring.
 
Phil Carmody

Keith Thompson said:
My favorite example (the following is valid, but *not* good code):

#include <stdio.h>

#define SIX 1+5
#define NINE 8+1

int main(void)
{
    printf("%d * %d = %d\n", SIX, NINE, SIX * NINE);
    return 0;
}

Truly marvelous - and new to me! Thanks for posting that, Keith.

Phil
 
Phil Carmody

Nick Keighley said:
there's nothing wrong with confining yourself to Windows-based
computers. What is wrong is to extrapolate your experience to the
entire world.

As a raving *nix-alike bigot, I wish that they really would confine
themselves to Windows-based computers. Their packets would never get
beyond their own hub/router/modem, and certainly not propagate at all
far in the world at large.

Phil
 
bartc

Seebs said:
What I've found is that people who know them all, but couldn't derive
them, are less able to use them than people who don't know them, but can
derive them -- because the ability to derive them implies a much deeper
understanding of what they're for and how they work.

Of course, someone who could derive them but doesn't have to has an edge,
but it's not usually a very large edge.

I could never remember formulae like the area of a circle, but I could just
about derive them using bits of calculus.

Did that give me an edge? Er, no; other than that feat, my maths is
hopeless.

But I think there's a place for both types of people. (Asimov's short story
"Profession" has a good take on this.)
 
Willem

Seebs wrote:
) #import presumably has to follow #include and other #import statements,
) should it recurse? Would there be #import-guards?

It should not recurse, I think. It should only grab the declarations from
the named file.

) There's some serious room for crazy here, thinking about it. What happens
) if I #import another file, but don't #include it, and it has some
) #defines. Do I pick up those #defines?

Well, no. All you get are the declarations, basically.
Like I said, you #import the .c file, not the .h file.

) I think I actually prefer separated headers to that, at least given some
) of the historical quirks (like macros).

You can still use #include for that. #import would be purely for those
things that are required to be able to link against the named file.

Although I see what you mean. Perhaps it could also pick up the variable,
struct and enum declarations (not static, and at file scope, of course)?


SaSW, Willem
--
Disclaimer: I am in no way responsible for any of the statements
made in the above text. For all I know I might be
drugged or something..
No I'm not paranoid. You all think I'm paranoid, don't you !
#EOT
 
bartc

Willem said:
Seebs wrote:
) #import presumably has to follow #include and other #import statements,
) should it recurse? Would there be #import-guards?

It should not recurse, I think. It should only grab the declarations from
the named file.

#import should be nestable, but smart enough not to #import the same file
again.
) There's some serious room for crazy here, thinking about it. What happens
) if I #import another file, but don't #include it, and it has some
) #defines. Do I pick up those #defines?

Well, no. All you get are the declarations, basically.
Like I said, you #import the .c file, not the .h file.

) I think I actually prefer separated headers to that, at least given some
) of the historical quirks (like macros).

You can still use #include for that. #import would be purely for those
things that are required to be able to link against the named file.

Although I see what you mean. Perhaps it could also pick up the variable,
struct and enum declarations (not static, and at file scope, of course)?

The sort of entities you may want to share across files are function defs,
variables, #defines, enums, structs and typedefs.

But I don't think it is clear, when scanning the imported .c file, whether
some of these are local or global (i.e. exported), because there is no
current C syntax to indicate the scope.

(Functions and file-scope variables are scoped, but the syntax doesn't make
it stand out: they have external linkage, i.e. are exported, by default,
unless marked static.)
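
That rule in code (a plain translation unit; it compiles with cc -c, and
only the non-static names are visible to the linker):

static int counter;             /* internal linkage: this file only */
int total;                      /* external linkage: exported by default */

static int helper(void)         /* internal: an #import would have to skip it */
{
    return counter;
}

int api(void)                   /* external: part of the file's interface */
{
    return helper() + total;
}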

Also, when the full C source is not available (like libraries), #import may
have to work instead on a specially provided file containing only the
exports; although in this case a regular header can be used when working
with a library unaware of #import.

The other trouble is, typical C header files always seem to be such a mess
of macros, typedefs and conditional code that it would be difficult to
reconcile all that with the straightforward model of these things assumed by
the proposed #import statement.
 
Seebs

It should not recurse, I think. It should only grab the declarations from
the named file.

What if the named file refers to types it got from another file?
Well, no. All you get are the declarations, basically.
Like I said, you #import the .c file, not the .h file.

And what if the declarations depend on the defines?
Although I see what you mean. Perhaps it could also pick up the variable,
struct and enum declarations (not static and in file-scope of course) ?

Yeah, but that starts getting finicky.

static enum foo { foo_a, foo_b };
extern int whoops(enum foo bar);

Obviously, we should pick up "whoops". What is its prototype?

-s
 
Ben Pfaff

Seebs said:
Yeah, but that starts getting finicky.

static enum foo { foo_a, foo_b };
extern int whoops(enum foo bar);

Obviously, we should pick up "whoops". What is its prototype?

It makes no sense to apply a storage-class-specifier such as
"static" to a type declaration.
 
spinoza1111

Not motivated by status at all, are you??

He's made a grand total of about one useful observation during this
code review. He's not as motivated by status as Heathfield, but he is
a major time-waster. Whereas every time I see Ben posting I know
there's raw meat: accurate bug reports and technical observations
lacking only in a certain maturity.
 
spinoza1111

The person who knows them is better; they can use them because
they have them at hand to use in their mental armory whereas the
person who can't remember them does not.

I think that Seebs had something there. I was at a talk by John Conway
(the inventor of the Game of Life) at Princeton, and he
confessed that he forgets theorems and has to re-prove them when he
needs them as lemmas.
 
