spinoza1111
trim your posts?
On Nov 4, 2:18 am, Seebs <[email protected]> wrote:

Oh, I'm all for elegance, regardless of who invented it, but -1 [for true]
isn't elegant.

I don't think you know what elegance is. For starters, elegance is
conceptual unity. [...] As it happened, [-1 for true] was best for me in my
limited time, simply to show that this conversion could be grammar-driven,
to use -1 as truthiness.
you could use a macro and defuse all these arguments
#define TRUE -1
I quite like the grammar-driven approach as an idea.
personally I thought this was one of C's mistakes. They should have
had a boolean type (or alias) from day 1, and null should have been a
language word. Oh, and function prototypes: they should have been in
from the beginning. Lint? feugh!
I don't think there's any "should" about it. Various languages have made
various decisions. This is why the actual values should be hidden
behind macros (or even language words!). We don't get our knickers in
a twist about the representation of floating-point values or character
sets (ok, we /do/; but we shouldn't).
Wow, I'll alert the media. 1==!0? New math indeed.
well == isn't very standard mathematical notation to start with. You
are aware that in C the ! operator yields 1 when applied to 0?
   x    !x
   0     1
   1     0
  nz     0
where nz is any non-zero value. Hence !!x must yield either 0 or 1.
But of course Peter is begging the question.
You wouldn't say this if you had had to debug at machine-code level in
the past, because 1 looks too much like 0 to be a good way to represent
truthiness.
you're a hardware engineer, right? In my experience they're the people
who can't tell a 0 from a 1.
There are many [C] functions and APIs which have standardized on using negative
values to indicate *errors*. Because of this, readers who are experienced
with C will usually regard a "-1" as an error indicator. Your theory that it
would make sense for it to be regarded as a particularly idiomatic "true"
is fascinating, but it doesn't match the real world.
although this is a bit of an angels-on-the-head-of-a-pin discussion
Peter Seebach /is/ correct here. When I read Spinoza's code I was
expecting -1 to be a failure indication. Idioms and cultural norms are
important.
you can't have a structured walkthru in a bunch of usenet posts. Who's
the Co-ordinator?
I am, because nobody else here has the emotional maturity to be one
except me, and Ben Bacarisse, possibly.
You do actually. What alternative would you offer for "the people who
use a program"? If a user of my program says it is hard to use I
listen (I don't necessarily fix it but I do listen). If a random
person made comments he'd get less weight added to his comments.
I have, apocryphally, heard the comment "I don't like being called a
'user'; it sounds like I take drugs".
why don't the end-users have a place? Who else can assess usability?
Users can't because they are plural, and often the "user" is a
manager who has no idea what the people who work for him, and who use
the software in the direct sense, actually do. The most "usable" software is designed
by programmers, not "usability engineers" (Craig's List being an
example), because "usability engineering" is a complete fraud that
makes people into objects.
You invoke "the user" as a
deus ex machina: [...] But literally, that's the person
who as you say needs to be innocent of the details!
the user interface and what the system does isn't a detail
This is just wrong. For one thing, this "user" has to know, not
whether the code is "correct" (for in fact all code is "incorrect" in
the absolute sense that a computer is ultimately a finite state
automaton) but its limitations, and in this case and many others, this
"user" (I prefer "caller")
caller and user are not synonyms
would rather read the code.
users don't typically read code. Though the user of an infix to
postfix library is not a very typical user...
IT HAS NO USER in this silly sense, of a fat man with money whose lack
of taste and conventionality is simply a way for ensuring that
programmers' insights are not reflected in the final product. This may
be all for the better, since most programmers don't have any insights
worth preserving (the lack of insight in most posts here is an
example): but neither, in our society, do their managers.
What matters is whether it meets its contract. This is the
documentation which states what the routine proposes to do.
[...] I had enough familiarity [...] to be able to describe the culture of C,
including its case sensitivity and use of ASCII in place of EBCDIC.
the C standard does not specify the use of ASCII
However, ASCII dominates actual implementations.