Are these the best set of bitset macros in the world or what!?


Nick Keighley

Oh, so you are in general a defamer and above the law? Look into it,
bitch. Shut up. Or say what you want to say. I'm not going to "sue" you.
Stop getting away with it. Get it off your mind (what you wanted to say
and not in a place with the threat of "law").

you are Edward Nil-ges and I claim the 20 UKP prize!
 

Ersek, Laszlo

It never occurs to you that anyone could disagree with you for some
reason other than being an idiot, does it?

*plonk*

(This week has been awesome, I've plonked four people over its course.)

lacos
 

wolfgang kern

"io_x" posted an old story about Big Endian issues to ALA:

<...>

X86 CPUs have it in the right order:
the LSB is found at the LOWer address, which is more effective
for all calculations (bit(n) == 2^n and the carry-over goes to
the logically consecutive next address).

Big Endianness may help in displaying numeric figures in
human-readable format, but it needs 'some' detours for ADD, MUL..DIV
when multibyte-sized figures join in.

A relic from those times seems to have survived in the now rarely
used monochrome graphics modes found even on the latest cards:
the top-left dot is at bit 7 of address byte 0, which is not very
convenient for programming and conversions on x86 anyway.

It's just that humans write numbers the wrong way round; perhaps
merchants(?) once decided to show their biggest digit first. :)

__
wolfgang
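
[A minimal C sketch, not from the original post, of the layout described above; it assumes a C99 compiler. On a little-endian x86 host the LSB 0xEF lands at offset 0; on a big-endian host the MSB 0xDE does.]

    /* Sketch: where each byte of a 32-bit value lives in memory.
       Little-endian (x86): offset 0 holds the LSB 0xEF.
       Big-endian:          offset 0 holds the MSB 0xDE. */
    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        uint32_t v = 0xDEADBEEF;
        const unsigned char *p = (const unsigned char *)&v;

        for (size_t i = 0; i < sizeof v; i++)
            printf("offset %zu: 0x%02X\n", i, (unsigned)p[i]);
        return 0;
    }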
 

Robert Redelmeier

In alt.lang.asm wolfgang kern said:
"io_x" posted an old story about Big Endian issues to ALA:

X86 CPUs have it in the right order:
the LSB is found at the LOWer address, which is more effective
for all calculations (bit(n) == 2^n and the carry-over goes to
the logically consecutive next address).

Big Endianness may help in displaying numeric figures in
human-readable format, but it needs 'some' detours for ADD, MUL..DIV
when multibyte-sized figures join in.

In fairness, big-endianness is how human languages represent
numbers and lexicographic order. An alphabetic sort is much
easier big-endian. BCD was very important (COBOL).

As for humans reading dumps, one trick I do is print the
bytes right-to-left for hexdump and left to right for ASCII
with the address in the middle:

8086 0000 DEAD BEEF-2827 2625 2423 2221 00408000 ABCDEFGH........


-- Robert
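
[A rough C sketch of the mirrored dump format described above; it is not Robert's code, and the helper name mirror_dump_row and the sample row are invented for illustration. Hex bytes are printed from the highest offset down, then the row address, then the ASCII in normal order.]

    /* Print one 16-byte row "mirrored": hex right-to-left, address in
       the middle, ASCII left-to-right. */
    #include <stdio.h>
    #include <ctype.h>
    #include <stddef.h>

    static void mirror_dump_row(unsigned long addr,
                                const unsigned char *row, size_t n)
    {
        for (size_t i = n; i-- > 0; ) {        /* hex, high offset first */
            printf("%02X", (unsigned)row[i]);
            if (i != 0 && i % 2 == 0)
                putchar(' ');                  /* group into 16-bit words */
        }
        printf(" %08lX ", addr);               /* address in the middle */
        for (size_t i = 0; i < n; i++)         /* ASCII, normal order */
            putchar(isprint(row[i]) ? row[i] : '.');
        putchar('\n');
    }

    int main(void)
    {
        unsigned char row[16] = "ABCDEFGH";    /* rest is zero-filled */
        mirror_dump_row(0x00408000UL, row, sizeof row);
        return 0;
    }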
 

Brian

Nick said:
We aren't going to agree here but I have seen cases where it really
mattered.

I know. Duh. But it is the exceptional case IMO. (Read, don't we all wish
we were ISVs like MS).
With embedded systems the system used for initial test is
often different from the final target system.

I wouldn't know.
Sometimes the target
just doesn't exist when software development starts.

Now that I understand. No need to queue everything. Duh, fundamentals.
Sometimes you
don't want to do your first test run on real hardware (anti-lock
brakes, fly-by-wire).

Simulation is fun! (no pun intended).
Sometimes it's just plain easier to debug a
version that actually has a screen, a keyboard and a hard disk.

You had a point? You are preaching to the choir? The group loves ya man.
(LOL).
I've seen non-portable code fail when the compiler was upgraded.

And I trimmed my toenails today.
In
theory a change in optimisation could trigger a change in behaviour.

Are you high?
I've seen large applications ported between quite incompatible
platforms. This was mostly done by good encapsulation and required
some code changes. 90% + of the code didn't notice a thing.
Yawn.


A comms application ran on both a large unix box and a Z80-based bit
of embedded hardware. The embedded version was far less capable but
they shared a lot of code. Most of the debugging and testing could be
done on the unix box.

Who needs sleeping pills or the nightly world news program? Dude, you
have a niche! (hehe).
 

Brian

luserXtrog said:
Oh, so you are in general a defamer and above the law? Look into it,
bitch. Shut up. Or say what you want to say. I'm not going to "sue"
you. Stop getting away with it. Get it off your mind (what you
wanted to say and not in a place with the threat of "law").

I don't know which is funnier! Statler/Waldorf or Fozzie Bear?
FYI, ICYC, here is the entire message [Message-ID: <i6t0kv$gdr
[email protected]>] in which the
abbreviation occurred. If you read all the words, you may find,
between two double-quote characters ('\"'), a phrase, the initials
of which match the abbreviation in question.

This rather drives home the point of avoiding unnecessary layers
of abstraction.

You are high.
Eric said:
On 9/15/2010 11:06 PM, Brian wrote:
Barry Schwarz wrote:
[...]
And what happens if you want to test the sign bit? You might
avoid syntax errors and undefined behavior with
#define bit(n) (1u<< (n))

While I don't see the need, some might even want 1lu or 1llu.

OK on 1u. But the compiler wouldn't really change to signed if an
unsigned was passed (guaranteed with use of the bitsetX defines
below), would it?

You're *still* not seeing the problem, are you? All right,
let's go through it step by step:

- What is the type of `1'?

I would assume it would be "coerced" to the type of n, yes?

No.

I knew that. Pointing out the design flaws of C is hardly difficult.
Since you don't know those rules, your opinion of them is
worthless.
Hehe.


You've demonstrated that you don't know C, that you're a novice
who makes novice errors -- and renders uninformed novice opinions.

I got your novice right here bitch.
No answer?

Is it done yet? What's the holdup? Pattering your pud instead of working.
Dock that man's pay by half. You are now on half pay.
No answer?

How many people do you know that answer to bittwits?
No answer? All these questions are too hard for you?

Bring it little boy.
(Side note: I was wrong here. It's not the annotations' phrase,
but the holy text of the Eighth Commandment itself.)


Years ago when I was young and stupid, my own BNITU was

#define until(x) while(!(x))

... but I soon outgrew that puerile stage.

You are proof that property taxes are a crime against humanity.
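
[A small compilable sketch, assuming a 32-bit int, of the point Eric was driving at in the quoted exchange: the constant 1 has type int no matter what n is, so shifting it into the sign bit is undefined behaviour, whereas 1u keeps the arithmetic unsigned.]

    /* Sketch (assumes 32-bit int): why bit(n) uses 1u rather than 1.
       The operand 1 is an int and is NOT converted to the type of n,
       so 1 << 31 shifts into the sign bit -- undefined behaviour.
       1u << 31 is ordinary unsigned arithmetic. */
    #include <stdio.h>

    #define bit(n) (1u << (n))

    int main(void)
    {
        unsigned flags = 0;

        flags |= bit(31);            /* well defined: unsigned shift */
        /* flags |= 1 << 31; */      /* undefined with a 32-bit int  */

        if (flags & bit(31))
            printf("top bit set: 0x%08X\n", flags);
        return 0;
    }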
 

Brian

Shao said:
Sorry, it's a link with various and entertaining categories for
people. You said you were going to start categorizing people, so I
thought you might be entertained.

"who knew!". I suggest you keep posting, because you are totally apart
from the doldrum of the group. That said, I'm sure the sight is what you
said, but "I have ADD", and this ng is so much more fun.
 

luserXtrog

I don't know which is funnier! Statler/Waldorf or Fozzie Bear?
FYI, ICYC, here is the entire message [Message-ID: <i6t0kv$gdr
[email protected]>] in which the
abbreviation occurred. If you read all the words, you may find,
between two double-quote characters ('\"'), a phrase, the initials
of which match the abbreviation in question.
This rather drives home the point of avoiding unnecessary layers
of abstraction.

You are high.


That's low. Ad Hominem, anyone?

I failed to sufficiently attribute that the remainder of my message
was a quote. The author of the portion below was, IIRC, Eric
Sosman.

But as I keep trying to say, I agree with everything you appear
to be about, except your words. I get the distinct impression
that you are not fnord reading all the words.

Eric Sosman wrote:
On 9/15/2010 11:06 PM, Brian wrote:
Barry Schwarz wrote:
[...]
And what happens if you want to test the sign bit?  You might
avoid syntax errors and undefined behavior with
     #define bit(n) (1u << (n))
While I don't see the need, some might even want 1lu or 1llu.
OK on 1u. But the compiler wouldn't really change to signed if an
unsigned was passed (guaranteed with use of the bitsetX defines
below), would it?
     You're *still* not seeing the problem, are you? All right,
let's go through it step by step:
     - What is the type of `1'?
I would assume it would be "coerced" to the type of n, yes?
     No.

I knew that. Pointing out the design flaws of C is hardly difficult.

Since I'm here, I invite myself to comment.
I thank myself, I'm happy to oblige.

You appear to be making emotional judgments with undue rapidity.


Well may you laugh now, but he who laughs loudest cannot hear the
next joke.

I got your novice right here bitch.
..




Is it done yet? What's the holdup? Pattering your pud instead of working.
Dock that man's pay by half. You are now on half pay.

Now who's high?

How many people do you know that answer to bittwits?

Um, grammar check?

Bring it little boy.

It would appear to the heckling spectator that it has been brought.
Several times now.


These aren't the droids you're looking for.

You are proof that property taxes are a crime against humanity.

So is "you" me? Or Eric Sosman?
 

wolfgang kern

Robert Redelmeier replied:
In alt.lang.asm wolfgang kern wrote in part:
In fairness, big-endianness is how human languages represent
numbers and lexicographic order.

I found several variants of Big Endian:
* bit 7 as the LSBit, but bytes in Little-Endian order (monochrome graphics)
* reversed ASCII pairs in 16-bit words (OEM strings in HD-identify)
* four-byte upside-down storage (that's what BSWAP might be good for)
I assume we are talking about the latter here.
An alphabetic sort is much easier big-endian.

Meant to sort Unicode characters on top ... ? :)
OK, the advantage is that 4/8/16 ASCII characters can be compared at
once. This gain is easily lost in output and calculation routines.
BCD was very important (COBOL).

The two, as yet rarely used, x87 BCD instructions work on ten bytes in
Little-Endian order.
As for humans reading dumps, one trick I do is print the
bytes right-to-left for hexdump and left to right for ASCII
with the address in the middle:
8086 0000 DEAD BEEF-2827 2625 2423 2221 00408000 ABCDEFGH........

I prefer and use the old classic (it's a single line) format:
[address|+ 0 1 2 3 4 5 6 7 8 9 a b c d e f| ASCII]
00408000 44 45 41 44 42 45 45 46-c0 22 23 34 35 41 42 43  DEADBEEF+"#45ABC

but I also added a 16/32/64-bit pointer view to my hex dump, besides
a few options for text interpretation, i.e.:
[address|+ 0 4 8 c| ASCII]
00408000 44414544 46454542 342322c0 43424135 DEADBEEF."#45ABC

And because humans also read hex numbers with the MSD leftmost, my
display routines work upside down and start at the end ;)
I use only one output routine for both HEX and packed BCD.
__
wolfgang
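
[A portable C sketch, not wolfgang's code, of the 'four-byte upside-down storage' case, i.e. the byte reversal that BSWAP performs.]

    /* Sketch: reverse the byte order of a 32-bit value in plain C --
       the same transformation the x86 BSWAP instruction performs. */
    #include <stdio.h>
    #include <inttypes.h>

    static uint32_t bswap32(uint32_t x)
    {
        return  (x >> 24)
             | ((x >>  8) & 0x0000FF00u)
             | ((x <<  8) & 0x00FF0000u)
             |  (x << 24);
    }

    int main(void)
    {
        uint32_t v = UINT32_C(0xDEADBEEF);
        printf("%08" PRIX32 " -> %08" PRIX32 "\n", v, bswap32(v));
        /* prints: DEADBEEF -> EFBEADDE */
        return 0;
    }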
 

Brian

luserXtrog said:
luserXtrog said:
Vincenzo Mercuri wrote:
Brian wrote:
I looked up BNITU on the web and came up with nothing. What does
that stand for?
Brian Now Is Trolling Usenet
Oh, so you are in general a defamer and above the law? Look into
it, bitch. Shut up. Or say what you want to say. I'm not going to
"sue" you. Stop getting away with it. Get it off your mind (what
you wanted to say and not in a place with the threat of "law").
I don't know which is funnier! Statler/Waldorf or Fozzie Bear?
FYI, ICYC, here is the entire message [Message-ID: <i6t0kv$gdr
[email protected]>] in which the
abbreviation occurred. If you read all the words, you may find,
between two double-quote characters ('\"'), a phrase, the initials
of which match the abbreviation in question.
This rather drives home the point of avoiding unnecessary layers
of abstraction.

You are high.


That's low. Ad Hominem, anyone?

I failed to sufficiently attribute that the remainder of my message
was a quote. The author of the portion below was, IIRC, Eric
Sosman.

But as I keep trying to say, I agree with everything you appear
to be about, except your words. I get the distinct impression
that you are not fnord reading all the words.

Maybe you should stand down and get the **** out of my face.
 

Marcin Grzegorczyk

Brian said:
[...] I
AM trying to get my codebase to a place where I will be able to bring in
contractors to do the stuff I don't care to do (or that I won't live long
enough to figure out, it's all about time). Of course they will have to
be able to adapt to house standards and leave ISO at the door (along with
signing non-disclosure agreement and committing to "work done for hire").

The experience I've had so far coding in a team, as well as the
prevalent tone of responses to your initial post on this newsgroup, make
me predict that most of those contractors will not agree with your ideas
about "cleaning up the ugly C syntax" and will not use your macros
consistently. And then the code will become much more difficult to
understand and debug than it would have been without those macros. This
is a risk you'd better be aware of.
 

Seebs

Brian said:
[...] I
AM trying to get my codebase to a place where I will be able to bring in
contractors to do the stuff I don't care to do (or that I won't live long
enough to figure out, it's all about time). Of course they will have to
be able to adapt to house standards and leave ISO at the door (along with
signing non-disclosure agreement and committing to "work done for hire").
The experience I've had so far coding in a team, as well as the
prevalent tone of responses to your initial post on this newsgroup, make
me predict that most of those contractors will not agree with your ideas
about "cleaning up the ugly C syntax" and will not use your macros
consistently. And then the code will become much more difficult to
understand and debug than it would have been without those macros. This
is a risk you'd better be aware of.

I just want to know: if he's not willing to work with other people or reuse
anyone else's code, how does he expect to make enough money to hire contractors?

BTW, those following along with my fascination with pathological narcissism
on the Internet may find it interesting to consider the implications of
"Brian"'s obsession with ways in which he can discard competing authorities
and have full ownership of and control of other people's time and creativity.

-s
 

Peter Nilsson

Eric Sosman said:
     Years ago when I was young and stupid, my own BNITU was

        #define until(x) while(!(x))

... but I soon outgrew that puerile stage.

Or perhaps social referencing and peer pressure had an impact. ;)
I don't see that until() macro as puerile or an abuse of the
pre-processor.

One could argue <iso646.h> is puerile. My only criticism is
that they are macros in a header, not built-in 'alternative
representations' as they are in C++.
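
[A throwaway example showing the header under discussion in use; <iso646.h> defines and, or, not, not_eq, bitand, bitor, xor, compl and the *_eq forms as ordinary macros for the corresponding operators.]

    /* The <iso646.h> spellings in use; each comment shows the
       conventional equivalent. */
    #include <iso646.h>
    #include <stdio.h>

    int main(void)
    {
        int a = 5, b = 0;
        unsigned m = 0x0Fu bitor 0x30u;   /* 0x0Fu | 0x30u */

        if (a and not b)                  /* a && !b */
            printf("a set, b clear, m = 0x%02X\n", m);
        if (a not_eq b)                   /* a != b */
            printf("a differs from b\n");
        return 0;
    }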
 

Eric Sosman

Or perhaps social referencing and peer pressure had an impact. ;)

In the fullness of time, I thought better of it myself. Truly.
I don't see that until() macro as puerile or an abuse of the
pre-processor.

One could argue <iso646.h> is puerile.

Seconded.
 

Ben Pfaff

Eric Sosman said:
Seconded.

Do you think that the alternate spellings in <iso646.h> are
puerile compared to trigraphs? According to the
Rationale, that is the reason that <iso646.h> exists:

It [AMD1] also adds a library header, <iso646.h>, that
defines a number of macros that expand to still other
tokens which are less readable when spelled with trigraphs.
 

Eric Sosman

Eric Sosman said:
Seconded.

Do you think that the alternate spellings in <iso646.h> are
puerile compared to trigraphs? According to the
Rationale, that is the reason that <iso646.h> exists:

It [AMD1] also adds a library header, <iso646.h>, that
defines a number of macros that expand to still other
tokens which are less readable when spelled with trigraphs.

It does not feel right to me -- it has never felt right to me --
that a language Standard should fret so mightily over the way the
source is encoded. This is a Standard, after all, that carefully
says as little as possible about the external form of the source
code: All we know is that it can be divided into "lines" (by some
unspecified means) and that there are ways to represent a certain
set of required characters (again, by unspecified means). And then
this same laissez-faire Standard turns around and niggles over ways
to re-encode characters and character combinations, on the grounds
that some I/O devices may have difficulty with them ... The caption
should be "What's wrong with this picture?"

Trigraphs, <iso646.h>, and digraphs all have the appearance of
barnacles whose only function is to weigh down the language and make
it lumpier. IMHO, C would be better off if all three had never been
invented -- but that's just MHO.
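
[To make the objection concrete, a small sample assuming a compiler run in a pre-C23 standards mode (e.g. gcc -std=c99), since trigraphs are often disabled by default and were removed in C23: the same test spelled plainly, with trigraphs, and with digraphs plus <iso646.h>.]

    /* Trigraphs are rewritten in translation phase 1:
       ??( -> [   ??) -> ]   ??! -> |   ??< -> {   ??> -> }
       Digraphs <: :> <% %> %: are ordinary tokens for [ ] { } #. */
    #include <iso646.h>
    #include <stdio.h>

    int main(void)
    {
        int a[2] = { 1, 0 };

        if (a[0] || a[1])                  /* plain spelling        */
            puts("plain");
        if (a??(0??) ??!??! a??(1??))      /* trigraphs             */
            puts("trigraphs");
        if (a<:0:> or a<:1:>)              /* digraphs + <iso646.h> */
            puts("digraphs");
        return 0;
    }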
 

Nick Keighley

Do you think that the alternate spellings in <iso646.h> are
puerile compared to trigraphs?  According to the
Rationale, that is the reason that <iso646.h> exists:
       It [AMD1] also adds a library header, <iso646.h>, that
       defines a number of macros that expand to still other
       tokens which are less readable when spelled with trigraphs.

     It does not feel right to me -- it has never felt right to me --
that a language Standard should fret so mightily over the way the
source is encoded.  This is a Standard, after all, that carefully
says as little as possible about the external form of the source
code: All we know is that it can be divided into "lines" (by some
unspecified means) and that there are ways to represent a certain
set of required characters (again, by unspecified means).  And then
this same laissez-faire Standard turns around and niggles over ways
to re-encode characters and character combinations, on the grounds
that some I/O devices may have difficulty with them ...  The caption
should be "What's wrong with this picture?"

     Trigraphs, <iso646.h>, and digraphs all have the appearance of
barnacles whose only function is to weigh down the language and make
it lumpier.  IMHO, C would be better off if all three had never been
invented -- but that's just MHO.

It's OK for those of us with US (or near-US) keyboards to look down our
noses at trigraphs and <iso646.h>, but many Europeans (and the rest of
the world!) have many characters important to C programmers missing
from their keyboards. I believe the Danes in particular held out for
trigraphs. Of course life presumably became better with PCs, where we
can map whatever we like to, say, the function keys.
 
