spinoza1111 said:
On Thu, 05 Nov 2009 20:52:43 -0800, spinoza1111 wrote:
[snipped]
#define ADDFACTOR \
    int addFactor(char *strInfix, \
                  char *strPolish, \
                  int *intPtrIndex, \
                  int intEnd, \
                  int intMaxPolishLength)
ADDFACTOR;
[snipped]
// --------------------------------------------------------------- //
// Parse add factor
//
// int addFactor(char *strInfix,
//               char *strPolish,
//               int *intPtrIndex,
//               int intEnd,
//               int intMaxPolishLength)
//
// addFactor = mulFactor [ *|/ mulFactor ]
ADDFACTOR
{
    char chrMulOp = ' ';
    int intStartIndex = 0;
    if (!mulFactor(strInfix,
                   strPolish,
                   intPtrIndex,
                   intEnd,
                   intMaxPolishLength))
[snipped]
> The strange thing about this coding style is that it
> will save you *no* keystrokes, since you actually duplicate
> the function prototype in the comment block just above the
> function definition. Whenever you have to change the function's
> arguments you will have to change the (redundant) comment as well.
Yes. This, however, gives the best result for the program reader.
> I disagree. In fact I think the more usual style helps the reader at
If you mean by "the more usual style" putting trivia before important
functions, I reject this. It's more important to me that main() not
come last than that it return a value.
If on the other hand you mean a list of prototypes followed by the
functions in which the function prototype is re-coded, I reject this.
It is far, far worse than an extra comment, since mistakes in
correspondence cause the program not to compile at all, or to compile
with errors. Mistakes in the comment cause at worst a mistake in
apprehension, which is correctable by changing the comment.
My way is clearly the best. The problem here is that like "Levine the
Genius Tailor" in Gerald Weinberg's *The Psychology of Computer
Programming*, people have been so crippled by the defects of C that
they fetishize, reify and rationalize those defects, making a normal
human response to those defects (including coding great "pseudo-code"
which C cannot handle) itself a defect, and "normalizing" their own
deviance. Their being mentally crippled into people who write code
that is objectively disorganized (trivia before important things)
becomes in their own eyes a virtue.
As it is, C programs start not with a bang but a whimper. A program
should start with "front matter" including an announcement *sans
honte* which tells the reader what it does: but C programs in the real
world start in most cases with trivia which represents a waste of
time for the maintainer.
Programming languages were NOT invented to create a class of
normalized deviants who have discovered that their learning disorder
is overlooked or a benefit in some job. They were invented so that
people could say what they meant. Therefore a programming language in
which the central issues are what main returns or in which trivia must
precede important code is an abomination.
> only a small cost to the writer. Your macros might help a casual
> reader, but someone who is, say, reviewing the code or, more
> importantly, tracking down a bug, needs to know the actual types, not
> those in the comment. The comment might even lead a reader astray if
> there is a typo in the macro.
In that case he need only split the screen and view the macro
definition versus the code.
The global autism of programming, however, is sure to make for
programmers who, when they see code not laid out "their" way, will
declare without due diligence that their way is the only sensible way
of arranging code, no matter how deviant and that "everybody" does
things their fucked up way unless they are "incompetent", where
"incompetence" is actually the name of what Adorno called the secret
contour of their weakness.
This generates the absurdity of appealing to the nonprogramming "user"
in discussing what is or is not readable code, for the nonprogramming
"user" doesn't read code...by definition.
The absurdity is generated because, as in the case of my coworker at
Princeton, the programmer feels himself in a closed system. There's no
way to "prove", apart from empirical sociological research which no one
programmer is qualified to do, that predefining declarations as macros
is "more readable". It becomes his word against another's, so the
*deus ex machina* is invoked.
Since programming shops are staffed (either in reality or in the
*mythos* which creates the reality as is narrated by its participants)
by deviants who are by the terms of their employment subaltern, the
meaning of the symbol "user" is necessarily someone external to the
otherwise global capitalist system...an absent Lacanian phallus, an
absent father, or what Zizek calls "big Other".
But unlike Big Brother in Orwell's 1984, who's not an actual character
but is represented, not incarnated, by the midlevel Party functionary
O'Brien, the big Other today has to be incarnate in order to maintain
the illusion and the control. He's Donald Trump, or Rupert Murdoch.
He's the "father I never had", but also the person who says "you're
fired, for you have revealed the secret contour of your incompetence
and weakness to me". This persona is internalized, but it emerges as
the Decider in these sorts of "programming style" issues, which
programmers are simply not qualified to answer if those issues are
matters of empirical, sociological reality.
Years ago, I gave a talk at the old "Data Processing Management
Association" in Peoria on the subject of how to simulate, with Go To,
the structured constructs of Bohm and Jacopini, and in that talk I
said that what most discourse neglected about the issue of
"readability and maintainability" was that it's silly to speak of
either without knowing much about the so-called "user" (the reader of
your code, who's not a user but another programmer): brutally, if he
won't read it, it's unreadable, as far as we know and in this case.
I'd already seen people labor hard to comment and format their code,
only to be viciously attacked by software-ignorant managers for that
great crime against Holy Private Property, "wasting the company's
resources"...where the company's resources happen to be not JUST your
time but also what Marx called your very power to labor, which, if the
company is on its toes from the fiduciary point of view of the
majority stockholders, is taken completely...leaving the employee with
nary a jot or tittle of the time-resource to do anything outside of
his remit.
The remit is never quite defined, for that would give away the game,
which is power, and power is the destruction of other people's
autonomy.
This is why wounded spirits drag themselves in here not to be friendly
or to act in solidarity but to demonstrate that they are like Tony
Soprano, the chosen one who always pleases the dead Father. The
problem is that this dreamworld always demands the sacrifice of other
people's reputations to shore up the self-image of a person like
Heathfield and Seebach.
Heathfield is literally incapable of discussion at a level abstract
enough to remove any imputations about the competence of other people,
and
Seebach was unable to address the genuine issue of C portability
across the MS and unix great divide. For Seebach to do this would have
required him to become actually familiar with Microsoft systems but
unlike me he is unable to step outside his autistic comfort zone.
I'm willing to program in a language I dislike to prove my point, but
Seebach failed to research adequately why many of Herb's
recommendations work on Microsoft and not on unix. He preferred to
create a disorganized list of trivia which became the source for
nearly all the claims about Herb's competence because it flattered the
vanity of unix losers...I mean users.