How to convert Infix notation to postfix notation


spinoza1111

It's odd.  He seems to like my posting style.  The irony is that I got
it from you.  You once said that the only way to reply to spinoza1111
was to snip everything that was not topical and to respond only to
what was left.  I found it successful (yes I slipped once and talked
about grammars for maybe two messages).

Ben, don't use Heathfield as a role model. He is dishonest and a
bully. You're the better man. He is Flashman, and you are Tom Brown.
Anyway, I think a move to Facebook has been mooted.  I think that is an
excellent idea.

Shall we start at my Facebook site (Edward Nilges in Hong Kong)?

If that is acceptable, all who are interested should send me a
Facebook Friend request. If enough people reply from this group I will
notify this group that the discussion has moved.

Alternatively, someone could set up a "fans of Edward Nilges" site for
this discussion, or a more neutral site such as "for the discussion of
C code".
 

Ian Collins

Richard said:
Oh, I keep forgetting about good old Sun cc. Do you think Nilges is
using that, then? I mean, really?

Well he could, it is a free compiler that runs on a free OS that can be
hosted on a free virtualisation solution!

Then he might realise there are case sensitive filesystems out there.
 

spinoza1111

Richard said:
Well he could, it is a free compiler that runs on a free OS that can be
hosted on a free virtualisation solution!

Then he might realise there are case sensitive filesystems out there.

I was the one to point out that there are case sensitive filesystems
"out there". When working on the case insensitive IBM 1401 mainframe
and reading about Algol circa 1973, I realized that IBM mainframes
were case insensitive but that other systems aren't. I have in fact
coded quite a lot of workarounds to this problem at multi-vendor
locales including Bell-Northern Research and Princeton.

Whereas Seebach could be uncharitably understood as not understanding
that in the early 1990s there were case insensitive file systems "out
there" whether this suits his autistic learning style or not, and
charging Schildt with an error because of Seebach's own ignorance.

Case sensitivity, versus insensitivity, is a convention. Neither is
arguably better as long as we all agree. Case sensitivity, it is true,
provides "more" file names and thus to some more "bang" for the
proverbial "buck" (note the Volksiche thinking), but it also provides
"more" opportunities for "errors", because it's poor practice in my
book to have two file names that differ only in case. Case
insensitivity reflects ordinary usage of nouns much more accurately.
Arguably the inventors of case sensitivity were less SENSITIVE to
words and less willing to use language clearly.

Note also that these inventors worked in non-IBM and non-Microsoft
installations often on the taxpayer's dime while being themselves
libertarian and right-wing in politics: while (for example) enjoying
all sorts of taxpayer largesse at places like Stanford Research
Institute, NASA, and so forth, these people also were middle class
homeowners. In California, they formed a key part of support for Prop
13, which has completely destroyed the country's finest public
education system.

Their technical decisions, such as the design of unix and C, show not
a little passive-aggressiveness. While for example C was designed as a
riposte to big-science, big-systems thinking, it also stole without
attribution key aspects of that thinking including the confusion
between gotchas and useful features. This mirrors, in the small, the
fact that, in the large, "libertarians" seek consistently to destroy
government support for people who aren't white males while preserving,
at all costs, a technical elite as welfare for white males, where
the oddities of C constitute a sort of jobs machine.

But case sensitivity works and provides certain opportunities such as
the ability to give a preprocessor macro and the corresponding
function the "same" name.

Please ignore Heathfield. He is a thug and a bully who sits on his fat
ass and engages third parties in conversation about the "mistakes" of
people with more technical education, scientific knowledge, and
general culture than he has where his conversation attempts to magnify
others' putative errors through dull repetition. He doesn't know how
to actually code C and appears instead to be some sort of failed
software manager, who uses managerial techniques of enabling the worst
in others to spread rumors and gossip. He has a track record of
emotionally manipulating people including Programmer Dude (Chris
Torek), Randy Thompson and now Peter Seebach by projecting what Lear,
in the Shakespeare play, calls "the great image of Authority", the dog
in office, and giving or withholding his approval, which is
meaningless.

The worst of it is that USA programmers have been so deprived of
autonomy by their employers that the skill and knowledge must be, in
their dreamlike world, in the apparatus and not in the man. This
creates the weird logic that infers from almost any "mistake" to "lack
of knowledge". Thus Herb made a vanishingly minor "mistake": Herb used
upper case characters in an #included file identifier.

I pointed out that this was understandable given case insensitivity as
opposed to the case sensitive world of unix C. This is obvious to
anyone who's downloaded, for example, Sun software to Windows even
today.

But in a nasty, dysfunctional, business office, which Heathfield is
simulating, to defend "the person who made a mistake while claiming to
know" is a transitive operation: the putative shame covers the person
who defends the first individual, so now the Official Story is that
Nilges doesn't know that unix systems are case sensitive, which he
does.

It is shameful that people should crawl in here from "programming"
jobs in which instead of programming they are trusted only to pass
"issues" on to some other drone...and instead of creating a space of
enlightened and fair discourse about real programming, should seek to
replicate their work situations, dominated as those work situations
are by non-programming thugs who deliberately set programmers against
each other so as not to be confronted with demands for economic
equity.
 

spinoza1111

spinoza1111 wrote:

<over 250 lines of uncommented material snipped because Nilges is too
stupid or too malevolent to snip it himself - Hanlon's Razor applies>



That's your problem, not mine. Until you fix the problem, I'm going to
assume you're not interested in correct code and therefore I'm not
going to address the other problems in your code.


gcc, actually.


Of course they do.


As usual, you're wrong.


I can. I just thought it was too obvious to explain. Clearly it isn't
too obvious to explain, so I'll explain. You are passing string
literals to a function that takes char *. That's an inherently
dangerous situation, because you must not change string literals, but
a function that takes char * makes no promise not to change the
string pointed to. The correct fix is to make testCase() take const
char * for its first two parameters.
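
To make the danger concrete, here is a minimal sketch (the function
name is made up): the char * parameter lets the callee write through
the pointer, and writing to a string literal is undefined behaviour.

void shout(char *s)
{
    s[0] = 'H';          /* the compiler accepts this...            */
}

int main(void)
{
    shout("hello");      /* ...but modifying a string literal is
                            undefined behaviour at run time         */
    return 0;
}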

You fucking asshole, why do I have to wade through your libel for
this? OK, we KNOW that const makes code not designed to change the
parameter simpler. It's handled better, jerk face, in C Sharp using
value and reference calling. You have contributed a genuine
improvement, but you've not identified a mistake. You've merely
reminded me of a facility which makes for better code and which on
some compilers generates warnings when not used. You are restored to
the conversation but I must ask you again to merely make technical
remarks if you want to be taken seriously.

You will be "Fat Bastard" in the Change Record for this and other
contributions.
That isn't known. They just haven't mentioned them.


Rubbish. It's because of your char. And this particular warning can be
overlooked on this particular occasion - you're accepting a char and
passing a char, and the only reason gcc is warning you is to do with
some relatively esoteric promotion rules that don't affect this
particular code, but the fact that it's harmless here is hard to
detect automatically so gcc is playing safe and telling you just in
case.

This correction is rejected, then, Fat Bastard.
And if the moon were made of green cheese, Neil Armstrong could poison
mice for the rest of his days.

I must ask you again to make only technical observations in this
thread.
void testCase(const char *Infix,
              const char *Expected,
              int MallocMax)

except that MallocMax should really be size_t, but that is a little
premature for now. I've removed the silly Hungarian because you said
"good programming practice". Removing silly Hungarian would certainly
be good programming practice for you.

No issue here. Please stop wasting your time or I will return you to
nonparticipant status.
 In most cases, the


No, it's not standard practice, it's stupid unwise unsafe bad
brain-damaged not-good practice. If you fix it, fine. If you don't,
there's no need to examine your code further.

Stop babbling. Stop repeating yourself to make your case. We're not
impressed.
Then you will have no difficulty in finding an example that will stand
up to independent investigation. Your failure to do so is therefore
significant.


That is a good indicator that your knowledge is flawed.


HA! You dare to say that? I'm seriously impressed, given your utter
failure to produce any evidence whatsoever to support any of your
wild claims.

HA! I DARE to say it, Fat Bastard!


No chance. Post here, read here.
Pussy.

Honesty works fine right here. If you want to set up a "lousy C coders
who are too stupid to learn C properly and too embarrassed to admit
mistakes" group on Facebook, feel free. But you probably want to find
a more deceitful group name than that, obviously, or people aren't
going to join.

Ben Bacarisse has said he thinks it's a good idea, but he may knuckle
under to your threat to damage his reputation online and deny this.
Actually, they might. Such is the world.


You'd better get right onto that, then - set yourself up as admin, so
that you have the power to gag or expel anyone who dares to tell you
the truth.

Damn right, asshole. It's better to do that than fill this thread with
spam.
If customised technical feedback is spamming, your topsy-turvy world
just got a little bit topsier and possibly a touch turvier as well.

Your signal-to-noise ratio is much too low, and you need to SHUT UP
about other people. You have no standing as a judge of technical
competence.
 

Ian Collins

spinoza1111 wrote:

Please ignore Heathfield. He is a thug and a bully who sits on his fat
ass....

How do you know he sits on a fat donkey, have you seen it?

<further ranting snipped>
 

Nick

Seebs said:
I don't think so... But as an interesting data point, I wrote a non-trivial
hunk of code using C99 features freely, and it works fine in gcc+glibc.
So apparently it's "good enough" these days.

I've taken that approach. I happily use C99 features that I like and
can make use of (// comments, VLAs from time to time (for workspaces in
functions mainly) snprintf and so on). I've taken the view that enough
compilers support these well enough, and they are valuable additions to
the language, that I'm prepared to accept the restrictions in
portability I get out of them. After all, at least they are clearly
documented in the standard.

I take the point about // comments being a pain when copied-and-pasted
to news.
 

Nick

spinoza1111 said:
Case sensitivity, versus insensitivity, is a convention. Neither is
arguably better as long as we all agree. Case sensitivity, it is true,
provides "more" file names and thus to some more "bang" for the
proverbial "buck" (note the Volksiche thinking), but it also provides
"more" opportunities for "errors", because it's poor practice in my
book to have two file names that differ only in case. Case
insensitivity reflects ordinary usage of nouns much more accurately.
Arguably the inventors of case sensitivity were less SENSITIVE to
words and less willing to use language clearly.

One of the problems with case insensitivity is what happens if you allow
non-ascii letters into file names?

Here's a directory:
/var/www/canal/source/temp$ ls
SS ß

Should a case insensitive system treat those two files the same?
 

spinoza1111

In <[email protected]>, spinoza1111 wrote:


When you post abusive material *and* a claim of libel in a single
sentence, it shows that you are truly desperate to avoid the issue.


I can't guarantee that I'm parsing your sentence correctly but, if my
interpretation is correct, it seems that you're making your usual
attempt to cover up your ignorance by pretending that you knew all
the time. It grows old after a while.




It wasn't a correction. It was an explanation of a diagnostic message,
which included the point that it is reasonable not to change the code
as a result. Learn to read.



I have neither the desire nor the power to damage anyone's reputation
online. Ben, although perfectly capable of damaging his own
reputation, has chosen instead to be a sensible chap. He thinks it's
a good idea to move the discussion to Facebook. I have no problem
with that.




Right, but it's not a problem I have a problem with.

<nonsense snipped>

--
Richard Heathfield <http://www.cpax.org.uk>
Email: -http://www. +rjh@
"Usenet is a strange place" - dmr 29 July 1999
Sig line vacant - apply within

This question is for the group including Fat Bastard (Richard
Heathfield), who has been readmitted owing to his const comment.
Seebach is also readmitted conditional on good behavior because of his
knowledge of C standards (not "the" standard).

Fat Bastard has pointed out that string parameters should be const
except where we propose to modify them. I agree because as is, given
universal by value call, these parameters could be accidentally
modified by inadvertent changes during development or maintenance. C
code needs belts, suspenders, condoms and parachutes galore because C
is an infantile disorder.

I therefore in the next release shall go through all function headers,
which are conveniently located as macro definitions at the front of
the code and located in one place thanks to the Nilges method of
procedure definition, and make all strings const save where the intent
is to modify them. Because of my scheme, all I have to do is select
all the code of the function header section, change char * to const
char *, and then review the changes to make sure I haven't changed a
function whose contract is to change the string.

But, for the reasons outlined below, I will first try to compile the
code to see if const generates a diagnostic as it should in MS C.

Let's be sure I know how this works.

Non const strings as formal and actual parameters result in passing
the address of the first char.

But how does const work? If the string is copied, this is an unbounded
performance and correctness hit. Is it worth it? I don't think the
string is copied. Instead, the compiler "just" prevents the string
from being modified.
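
A quick check of that belief (a minimal sketch; the names are made up):
the callee sees the same address the caller had, so nothing is copied
but the pointer itself.

#include <stdio.h>

static void probe(const char *s, const char *original)
{
    printf("same storage? %s\n", s == original ? "yes" : "no");
}

int main(void)
{
    char text[] = "abc";
    probe(text, text);   /* prints "yes": only the address travels */
    return 0;
}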

Michael L Scott, in PROGRAMMING LANGUAGE PRAGMATICS (p 111) appears to
call const values "elaboration time" and says they must be on the
stack. It would appear to me that the compiler will have in all cases
to detect code that changes the string addressed on the stack and this
sounds hard in C given aliasing. Knowing C, or more precisely
recalling it from a former life as a memory of a Rimbaudian *enfer*,
wouldn't it be possible to circumvent in some evil way the intent of
const?

And if this is true why am I bothering with this language? Oh yes, to
show that "knowing C" is just, tautologously, "knowing C". It's like
being a law copier in Melville's office in Bartleby the Scrivener
(where some of the most "productive" law clerks, unlike Bartleby, were
actually illiterate) or a lawyer in Bleak House. It's like being a lab
tech, in itself. It's being Ygor, not the mad scientist.

Fat Bastard and Seebach in particular need in this discussion to
consider themselves lab techs like Ygor. Down, Ygor, down!

The open question is whether const actually prevents what Fat Bastard
thinks it prevents. I think it does but I can't figure out how.
 

spinoza1111

One of the problems with case insensitivity is what happens if you allow
non-ascii letters into file names?

Here's a directory:
/var/www/canal/source/temp$ ls
SS  ß

Should a case insensitive system treat those two files the same?

No, but note that BOTH case sensitivity and case insensitivity have
problems as regards internationalization!

Case sensitivity doesn't apply to languages without case. Languages
without letters (symbols for sounds) don't have case. Oops, wait a
minute. What about Chinese, used for writing by billions of people?

Case insensitivity, however, fails as well, as in your example. More
generally it provides an intrinsically smaller universe of file and
variable names and now that we are all globalized and stuff, that's a
problem.

But note this. The issue, like many in C, is no longer a fit with the
modern world. When C was developed all computers used western symbols,
but today, Ugly Patch upon Ugly Patch has been needed to make C work
in the non-western world.

The question as to whether you're insensitive to case is meaningless
to a Chinese person and therefore by making it an issue you are
confusing things. The fact is that copies of Schildt were in heavy use
where I worked in Shenzhen, where much of what he says is "wrong" in
the Chinese context.

More generally: the Chinese developers I worked with never said that a
book or other resource was wrong when, owing to the language issue, it
was literally wrong. The fact is that books from the West are priced
almost out of reach so that it's better to have a book even with
"errors" than to experiment blindly. Seebach has a strange, autistic
and almost retarded idea of what it means to write a book or teach if
he thinks it's appropriate to bully a teacher for errors. In so doing,
he's like the Hitler Youth or the students of the Chinese Cultural
Revolution and note that the Chinese wised up.

A confucian respect for teachers, in fact, involves respectfully
pointing out errors they make by asking questions. Seebach resembles a
Hitler Youth or "Elder Brother" of the Cultural Revolution, because in
both cases those thugs claimed that the teachers were "wrong" without
being able to teach themselves. The result in Germany was the
Holocaust, and the result in China was an entire generation of people
my age who were cheated out of an education!
 

Seebs

except where we propose to modify them. I agree because as is, given
universal by value call, these parameters could be accidentally
modified by inadvertent changes during development or maintenance. C
code needs belts, suspenders, condoms and parachutes galore because C
is an infantile disorder.

If you hate it so much, maybe you should just go use something else?
I therefore in the next release shall go through all function headers,
which are conveniently located as macro definitions at the front of
the code and located in one place thanks to the Nilges method of
procedure definition, and make all strings const save where the intent
is to modify them. Because of my scheme, all I have to do is select
all the code of the function header section, change char * to const
char *, and then review the changes to make sure I haven't changed a
function whose contract is to change the string.

Boy, that sure would be a lot easier if the function definition had the
contract right next to the code.
Non const strings as formal and actual parameters result in passing
the address of the first char.
But how does const work?
Woah.

If the string is copied, this is an unbounded
performance and correctness hit.

Not to mention a gaping, vast, violation of the spec.
Is it worth it? I don't think the
string is copied. Instead, the compiler "just" prevents the string
from being modified.

Why, yes.

There's no difference in representation, just in what you're allowed to
try to do using the pointer you get.
Michael L Scott, in PROGRAMMING LANGUAGE PRAGMATICS (p 111) appears to
call const values "elaboration time" and says they must be on the
stack.

I have no idea what he thinks he means by that. You might be looking at
a C++ book, perhaps, because "const" has extra semantics in C++ that it
lacks in C.

Mostly, though, it seems to me that you are not understanding the distinction
between the pointer and the pointee.

When you declare a "const char *p", you are declaring, not that p is a
constant, but that you cannot modify the things pointed to by p.
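
A minimal sketch of that distinction (illustrative only; the variable
names are made up):

#include <stdio.h>

int main(void)
{
    char buffer[] = "abc";

    const char *p = buffer;   /* pointee is const, pointer is not */
    char * const q = buffer;  /* pointer is const, pointee is not */

    p = "xyz";                /* fine: p itself may be reassigned  */
    /* p[0] = 'x'; */         /* rejected: cannot write through p  */

    q[0] = 'X';               /* fine: the characters may change   */
    /* q = buffer; */         /* rejected: q itself is const       */

    printf("%s %s\n", p, q);
    return 0;
}
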
It would appear to me that the compiler will have in all cases
to detect code that changes the string addressed on the stack and this
sounds hard in C given aliasing.

You've been thrown for a loop again by your insistence on using this
"stack" concept where it doesn't apply.

What is passed as an argument is an address. Not the contents of the
string, but the address of the string. The address passed into the
function is, since C is a pass-by-value language, a "copy" -- of the
address. It points to the same set of characters that were available
elsewhere.

What changes, though, is that because that pointer has a const qualifier
on it, the compiler refuses to accept attempts to modify it.

So:

void truncate(const char *s) {
    s[0] = '\0';
}

If you try to do this, the compiler will (assuming it's working) refuse,
because you're trying to modify something pointed to by s, but the function's
contract says it doesn't modify the stuff pointed to by s.
Knowing C, or more precisely
recalling it from a former life as a memory of a Rimbaudian *enfer*,
wouldn't it be possible to circumvent in some evil way the intent of
const?

Why, certainly.

But if you do, it's undefined behavior, and the compiler can whack you
for it.
And if this is true why am I bothering with this language?

No one can say.
The open question is whether const actually prevents what Fat Bastard
thinks it prevents. I think it does but I can't figure out how.

What it prevents is generating, without actively and intentionally
circumventing the language spec, code which modifies the string to which
you passed a const-qualified pointer. That means that, if you do this,
and then you accidentally write code which tries to modify an argument,
the compiler catches the possible error.

The relevance here is to a historical quirk of C, which I don't think anyone
denies is a wart:

* It's undefined behavior to modify the contents of a string literal.
* But they're not const-qualified.

Some compilers have a helpful option which allows you to request them to
treat string literals as though they were const-qualified. (They aren't,
normally, because a fair amount of existing code would break if they
were changed.)

So imagine that you have two functions:

int length_of_string(const char *s) {
    const char *end;
    for (end = s; *end; ++end)
        ;
    return end - s;
}

void truncate_string(char *s) {
    s[0] = '\0';
}

If you use the "string literals are constant" option of a compiler, and
you write:

truncate_string("hello");

you'll get a diagnostic telling you that you can't do that, because
truncate_string can't be called with a const-qualified string. And that
will save you the trouble of wondering why, at runtime, things blew up
when you tried to modify an unmodifiable chunk of data.
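
gcc's -Wwrite-strings is one example of such an option (check your own
compiler's documentation for an equivalent); a minimal sketch, assuming
a gcc-style toolchain:

/* compile with: gcc -Wwrite-strings example.c (file name made up) */
int main(void)
{
    char *s = "hello";   /* with -Wwrite-strings, gcc warns here:
                            the literal behaves as const char[6]   */
    (void)s;
    return 0;
}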

No one disputes that this stuff is a bit wonky, and I suspect that if
C had been designed from scratch with "const" fully supported and well
understood, this would likely have gone differently -- most significantly,
string literals would always have been const qualified.

-s
 

Flash Gordon

Malcolm said:
Const isn't inherently a bad idea. The problem is that it can't simply be
tacked onto a language that was originally designed without it.

Consider, we have a bunch of string literals. We want to sort them. So we
write a little comparison function to qsort

int compstringsliteral(const void *e1, const void *e2)
{
   char **s1 = e1;
   char **s2 = e2;

   return strcmp(*s1, *s2);
}

It works as long as C has fast and loose constness rules. Tighten things up
so that a const can never have its constness removed, and suddenly you've
tied yourself in knots with syntax and even function design.

In this case, only because YOU have made a mistake. Had you correctly
defined the pointers then, since strcmp takes pointers to const char, it
would work without problem:

   char const * const * const s1 = e1;
   char const * const * const s2 = e2;

I'll let you work out which of the instances of const can safely be
dropped ;-)

Having said that, I agree that const in C is not too good because, as you
say, it was bolted on rather than designed in from the beginning. It's
just not as bad as you make out!
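
For anyone following along at home, here is one compilable spelling of
the corrected comparison function in use (a minimal sketch; other
placements of const also work):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static int compstringsliteral(const void *e1, const void *e2)
{
    char const * const *s1 = e1;
    char const * const *s2 = e2;
    return strcmp(*s1, *s2);
}

int main(void)
{
    const char *words[] = { "pear", "apple", "orange" };
    size_t i;

    qsort(words, sizeof words / sizeof words[0], sizeof words[0],
          compstringsliteral);
    for (i = 0; i < sizeof words / sizeof words[0]; i++)
        printf("%s\n", words[i]);
    return 0;
}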
 

spinoza1111

If you hate it so much, maybe you should just go use something else?

Because I am demonstrating that C is like a bad slide rule: still
usable because of its triviality.
Boy, that sure would be a lot easier if the function definition had the
contract right next to the code.

No, it wouldn't. Because then I'd have to change it in the function
index at the beginning of the code.
Not to mention a gaping, vast, violation of the spec.
Ooooooooooo

Why, yes.

There's no difference in representation, just in what you're allowed to
try to do using the pointer you get.

But this can be overcome. We shall overcome. Aliasing means that you
can misuse "the pointer you get".
I have no idea what he thinks he means by that.  You might be looking at
a C++ book, perhaps, because "const" has extra semantics in C++ that it
lacks in C.

You need to do your homework before you make idiotic mistakes that are
far easier to correct online than C programming errors. You needed
here to look at the book on Amazon. It is not a "C++ book". It is a
comparative study of programming languages and p 111 is about C, not C++.
Mostly, though, it seems to me that you are not understanding the distinction
between the pointer and the pointee.

When you declare a "const char *p", you are declaring, not that p is a
constant, but that you cannot modify the things pointed to by p.

What's to prevent me? I do not believe that the compiler is fully able
to prevent me doing this. Because of aliasing the behavior of C code
cannot be determined in full from source.

#include "stdio.h"

void a(const char *p)
{
char *q = p;
*q = ' ';
}

int main()
{
char *p = "1"; a(p);
printf("%c\n", p);
return 0;
}

This code crashes at "*q = ' '" but the compiler fails to detect the
problem. Perhaps there is a compiler that prevents the assignment of
a const pointer to a non-const but in general the compiler cannot
detect all instances of this mistake.

You've been thrown for a loop again by your insistence on using this
"stack" concept where it doesn't apply.

No, you're ignorant of basic CS in which the stack is a basic
explanatory mechanism, because C runtime cannot be implemented without
a stack. Please do NOT step outside your competence.

Or perhaps you'd like to make a real fool of yourself and post a
Vicious Little Tirade about Michael L. Scott and PROGRAMMING LANGUAGE
PRAGMATICS? They'll have a laugh at Morgan Kaufmann when they receive
your errata, going after Scott for daring to speak of stacks!

You appear to me to be unread. This is because it is COMMON PRACTICE
to speak of THE STACK as a GIVEN when talking about runtime, because
runtime languages are at a minimum of the Chomsky type that requires
the stack.
What is passed as an argument is an address.  Not the contents of the
string, but the address of the string.  The address passed into the
function is, since C is a pass-by-value language, a "copy" -- of the
address.  It points to the same set of characters that were available
elsewhere.

Please don't waste our time with what we know.

What changes, though, is that because that pointer has a const qualifier
on it, the compiler refuses to accept attempts to modify it.

This refusal is laughably easy to circumvent as I have shown.

Therefore it is folly to expect const to do your thinking for you, as
Fat Bastard seems to think it can.

Contrast this C Sharp code:

using System;

class Program
{
    static void Main(string[] args)
    {
        char s = 'a';
        a(s);
        Console.WriteLine(s);
        return;
    }
    static void a(char s)
    {
        s = ' ';
    }
}

It prints 'a' safely because s is a value parameter and C Sharp
follows the same rule as C (one I think is not a good idea): value
parameters can be modified ON THE STACK (yes the stack). It is not
correct code, but it is safe code.

Correct code is Turing-impossible, but safe code isn't. C has no
concept of what it would be to be safe because the Von Neumann
structure of the machine is visible in C. There is no reason why this
should be the case except in low-level OS functions restricted in
kernel OS design to a small percentage of code. It is vanity,
incompetence and the desire to cover-up real mistakes that drives the
use of C elsewhere.

Managed C Sharp code is in fact provably checked by the compiler for
invalid references to memory as part of the ECMA standard. Your
"standard" normalizes deviance.
So:

        void truncate(const char *s) {
                s[0] = '\0';
        }

If you try to do this, the compiler will (assuming it's working) refuse,
because you're trying to modify something pointed to by s, but the function's
contract says it doesn't modify the stuff pointed to by s.

All I have to do to circumvent it is copy s into a non-const pointer.
OK, does your compiler catch this or not? Perhaps MS should but
doesn't.
Why, certainly.

But if you do, it's undefined behavior, and the compiler can whack you
for it.

Meaning you don't care to define it. Whereas C Sharp defines what
happens if you modify a value parameter, and in C Sharp copying a
string is a sensible and safe operation, not copying a pointer.

Part of maturity in CS and programming is becoming undeluded about
glamorous features that seem to dweebs, naifs and tyros to be "cool".
One of these is the "pointer" which is meaningful only in a von
Neumann architecture and in sensible thinking about computation needs
to be replaced by the concept of a "name".


No one can say.


What it prevents is generating, without actively and intentionally
circumventing the language spec, code which modifies the string to which
you passed a const-qualified pointer.  That means that, if you do this,
and then you accidentally write code which tries to modify an argument,
the compiler catches the possible error.

But not all such errors, as I have shown you.
The relevance here is to a historical quirk of C, which I don't think anyone
denies is a wart:

The mistake (an American one) is to think that concern with removing
warts is somehow dispensable, secondary, feminine, but here's what
Dijkstra said:

"Elegance is not a dispensable luxury but a factor that decides between
success and failure."
* It's undefined behavior to modify the contents of a string literal.
* But they're not const-qualified.

Some compilers have a helpful option which allows you to request them to
treat string literals as though they were const-qualified.  (They aren't,
normally, because a fair amount of existing code would break if they
were changed.)

The existence of these options, which radically change the language at
compile time, means that C never was standardized in any meaningful
sense, is not standardized today, and never will be standardized.

These options seem to the untutored to be "power" but we have learned
that "power" is weakness when it makes it in fact impossible to safely
describe a language. Any C book will contain "errors" simply because
of C's "flexibility" and these options.

So imagine that you have two functions:

        int length_of_string(const char *s) {
                const char *end;
                for (end = s; *end; ++end)
                        ;
                return end - s;
        }

        void truncate_string(char *s) {
                s[0] = '\0';
        }

If you use the "string literals are constant" option of a compiler, and
you write:

        truncate_string("hello");

you'll get a diagnostic telling you that you can't do that, because
truncate_string can't be called with a const-qualified string.  And that
will save you the trouble of wondering why, at runtime, things blew up
when you tried to modify an unmodifiable chunk of data.

No one disputes that this stuff is a bit wonky, and I suspect that if
C had been designed from scratch with "const" fully supported and well
understood, this would likely have gone differently -- most significantly,
string literals would always have been const qualified.

"Knowing mistakes" is not science. It's a form of non-critical
descriptive sociology on a par with stamp collecting and porn.

"Celebrating mistakes" is therefore pretty sleazy and no occupation
for a gentleman.

"Destroying reputations based on the knowledge of mistakes" is
criminal.

Schildt's errors in describing "all possible variants of C", which
wasn't his goal, are nought beside the dishonesty of C programmers,
especially when they use this language where it is not absolutely
necessary. A managed C Sharp program that compiles without errors or
warnings is guaranteed within a six-sigma probability not to cause
memory leaks (except in scenarios I shall describe). This guarantee
cannot be made of C.

It is TRUE of C Sharp that it contains methods that call Windows
systems functions when the library is available. And certain of these
functions will indeed cause memory leaks. For example, I can call
MS-DOS commands by way of a shell and these commands can call C programs
and erase hard disks.

But insofar as I do this, the package delivered cannot be ethically
described as pure .Net no-bullshit C Sharp. Instead it is a Windows
application...that cannot be ported to a non-Windows platform without
change.

But if we contrast C we find that what its mavens call "warts" are in
fact serious errors, so serious that C cannot be sensibly described
with the accuracy with which we can describe Java or C Sharp
applications. It was
therefore vicious folly for you to single out Herb Schildt. The
problem is that C is a corrupted programming language and that you
need to retrain in object oriented languages instead of harassing
Schildt or me. You might get a real programming job if you do so.
 

spinoza1111

Const isn't inherently a bad idea. The problem is that it can't simply be
tacked onto a language that was originally designed without it.

When will C programmers realize that elegance isn't lipstick on a pig?
Consider, we have a bunch of string literals. We want to sort them. So we
write a little comparison function to qsort

int compstringsliteral(const void *e1, const void *e2)
{
   char **s1 = e1;
   char **s2 = e2;

   return strcmp(*s1, *s2);

}

It works as long as C has fast and loose constness rules. Tighten things up
so that a const can never have its constness removed, and suddenly you've
tied yourself in knots with syntax and even function design.

My point exactly. Fat Bastard thinks it's an "error" not to use
const...or, more probably he, like Peter Seebach, prefers destroying
reputations to coding great software, and therefore equates not using
a feature which doesn't accomplish much with breaking up Salvation
Army meetings or killing small and defenseless animals, not because he
cares about great software, but because he needs to pad an anti-resume: a
document that seeks to build his own shaky self-esteem and bad
reputation by destroying a better man.
 

Moi

On Sun, 08 Nov 2009 08:57:55 -0800, spinoza1111 wrote:

[snip]
#include "stdio.h"

void a(const char *p)
{
    char *q = p;
    *q = ' ';
}

int main()
{
    char *p = "1"; a(p);
    printf("%c\n", *p);
    return 0;
}

This code crashes at "*q = ' '" but the compiler fails to detect the
problem. Perhaps there is a compiler that prevents the assignment of a
const pointer to a non-const but in general the compiler cannot detect
all instances of this mistake.

Maybe you should change compiler.
gcc gives a warning even without the -Wall flag:

const.c:5: warning: initialization discards qualifiers from pointer target type


[bigsnip]

AvK
 

Robert Latest

Richard Heathfield wrote:

[stuff]

I'm back on c.l.c after a few years of Python-induced abstinence, but
now I'm writing a few tools that need to be run stand-alone on a pretty
bare-bones platform (Win32, incidentally).

Anyway, glad to see you're still here, but I start wondering out of which zoo
this spinoza knucklehead escaped.

All the best,
robert
 

Nick Keighley

SO, Mr Hofstadter - you have /two/ automatic sockpuppets! And you
managed to sneak them both onto the C99 committee, too!

can they eat their own dog food?
 

spinoza1111

Richard Heathfield wrote:

[stuff]

I'm back on c.l.c after a few years of Python-induced abstinence, but
now I'm writing a few tools that need to be run stand-alone on a pretty
bare-bones platform (Win32, incidentally).

Anyway, glad to see you're still here, but I start wondering out of which zoo
this spinoza knucklehead escaped.

Do your homework instead of reading one or two posts. Heathfield is
not a C authority nor an authority on programming. Instead, he's some
sort of manager who enables and spreads lies and gossip about
competent people which through repetition become the "truth" for the
Real Knuckleheads.

This is a discussion of a series of solutions to the problem of infix
to Polish notation. Heathfield has made in fact a grand total of one
minor technical contribution to that discussion. This is not a
discussion of personalities. Unless you have something to contribute
to the topic, please leave.
 

spinoza1111

On Sun, 08 Nov 2009 08:57:55 -0800, spinoza1111 wrote:

[snip]




#include "stdio.h"
void a(const char *p)
{
    char *q = p;
    *q = ' ';
}
int main()
{
    char *p = "1";      a(p);
    printf("%c\n", p);
    return 0;
}
This code crashes at "*q = ' '" but the compiler fails to detect the
problem.  Perhaps there is a compiler that prevents the assignment of a
const pointer to a non-const but in general the compiler cannot detect
all instances of this mistake.

Maybe you should change compiler.
gcc gives a warning even without the -Wall flag:

const.c:5: warning: initialization discards qualifiers from pointer target type

OK, some compilers generate a warning. In many cases those warnings
can be turned off. Whereas in Java and in .Net the status of a
noncompliant program is a single data point which is known after each
compile. You can still run the code, but both Java and .Net speak
"with one voice" about your code. It's "unmanaged" in .Net and this
property is a part of the language definition.

Whereas in C, the actual language in use is C(n) where n is some
number that is in general unknown.

I'm not saying that isn't a good warning, and I wish MS C had it: it
might. And certainly a workgroup can agree amongst itself to use a
fixed makefile in which options are standardized.

But note that in the usual case, the workgroup is "managed" by a
personality like Richard Heathfield, an enabler, who makes programmers
"productive" by forcing them to "compete" with each other. This
creates pressure to circumvent the agreed on standards at all times
because the job is making the manager look good.

C allows this process to be concealed: to some extent, Java and .Net
don't.
[bigsnip]

AvK
 

spinoza1111

      char const * const * const s1 = e1;
      char **s2 = e2;
      char const * const * const s2 = e2;





In this case, only because YOU have made a mistake. Had you correctly
defined the pointers then, since strcmp takes pointers to const char, it
would work without problem.

Malcolm of course made no mistake, since he was presenting example
code in a discussion of how C handles a pathological case. However, it
seems that entirely too many posters here are recovering from
dysfunctional and lower middle class family systems in which a father,
himself treated like shit on the job, took pleasure in humiliating his
sons by pointing out various mistakes during their quality time.
Heathfield et al. replicate this father in a continual attempt to
transform examples of code submitted here in good faith into
"mistakes"...in an industry famous for real mistakes that are the
fault of management, and not programmers.

I'll let you work out which of the instances of const can safely be
dropped ;-)

Having said that, I agree that const in C is not too good because, as you
say, it was bolted on rather than designed in from the beginning. It's
just not as bad as you make out!

OK, so Fat Bastard (Richard Heathfield) is wrong again. He said it was
a mistake not to use it in calling tester() with literal strings: yet
it's almost useless and as Malcolm points out, Ritchie has spoken
against it.

If I used it thoroughly and as intended, I would append const to
NEARLY ALL of the formal parameters in the code under discussion,
since as a competent programmer I don't define a lot of formal
parameters meant to be modified: my functions are for the most part
tightly bound functions with read only inputs and one output returned
as the function value.

Nonetheless I shall do so. Note that because C is so inchoate a
language, actually a set of conflicting languages, this would be
"good" C only in some playbooks and not in others.

This is why it was unethical, and quite possibly actionable libel, for
Peter Seebach to enable the attacks on Schildt. The attacks presume
that there's only one proper C, which is something that programmers
whose skills are rusty and don't want to learn modern languages cling
to.
 

spinoza1111

spinoza1111 wrote:


A diagnostic message, and the wit to read it.


The language provides tools to allow the programmer to circumvent the
rules if that's what he or she really wants, but makes no guarantees

I take issue with this childish use of language: "what the programmer
'wants'. What the user 'wants'". In applications programming,
correctness and maintainability are what's needed, not satisfaction of
desires.

The "credit crisis" alone reveals that what many "rocket science" C
programmers in finance "wanted" was to make their boss happy by
concealing toxic securities: some with no name and address of the
original debtor, and others linked to others in a loop.

Your sloppy language is a coverup of what's really going on: the use
of C as opposed to safe languages in business programming in order to
conceal fraud. I note that you have no record as an operating system
programmer or a programmer of compilers, where the use of C might make
sense.

I have to ask you: if you worked for a series of "banks and insurance
companies", what the HELL were you doing if you used such an unsafe
language to program financial instruments?
 
