Sizes of Integer Types

Flash Gordon

pete wrote, On 13/09/07 23:43:
That's right.
It's defined in terms of "little or no modification".

ISO/IEC 2382-1:1993

01.04.06
portability (of a program)
The capability of a program to be executed on various types
of data processing systems without converting the program
to a different language and with little or no modification.

Note that nowhere does it say *all* data processing systems. This is not
for pete, who obviously understands this, but for those who think that
you are throwing portability out the window if your SW won't run on all
implementations past, present and future.
 
Flash Gordon

Richard Heathfield wrote, On 14/09/07 01:32:
Flash Gordon said:



It shouldn't actually matter, should it?

It saves accusing people of thinking something that they don't think
sometimes :)
There is a tendency on Usenet (and, I think, in life in general) to
divide the world into teams, such that if I agree with X about Y, then
it is assumed by (some) others that all of my opinions about Y are the
same as X's opinions about Y. But this is simply not the case, and
there is no reason why it should be.

I often deal with people where I agree with them on some things but not
on others.
I agree with Kelsey that intN_t types are completely and utterly useless
(wait, wait!).

Well, there I disagree, as I'm sure you noticed. However, as shown below
you are being civilised about it and accepting that others may have good
reasons.
I also agree with Kelsey that it's no big deal quoting
from a draft in clc.

As do I. Especially the post-C99 drafts, which are effectively the
current standard because they are the text of the published standard with
the TCs applied. So, modulo errors in implementing the TCs, they should
accurately represent the current standard.
But I am less bullish than Kelsey about other
people using intN_t. Personally, I think it's a mistake to be led down
that path, but hey, maybe - just *maybe* - there's a real use for these
little critters that I haven't thought of, yeah? And some people whose
opinion I respect have suggested that they do in fact find these types
useful. So, even though they seem completely useless *to me*, and even
though I think they are ugly and unnecessary warts, I'm not about to
insist that they're dropped from the language. If they are useful to my
friends, let them stay, warts and all.

A perfectly reasonable attitude. Actually, I would be happy if the
signed versions did not mandate two's complement.

Fixed width types are, IMHO, the perfect types for memory mapped HW,
since there the HW defines the register as being exactly N bits wide. I
was doing this before
I was on this group and learned the correct way of doing lots of things,
so the code was not as portable as I could easily make it now, but doing
anything other than relying on a 16 bit integer type would have been a
step too far in terms of the work required to implement the system. It
*could* be done, but the extra effort would not be justified by the benefit.
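
For instance, something along these lines (a minimal sketch; the address,
the register name and the bit layout are invented purely for illustration):

#include <stdint.h>

/* Hypothetical memory-mapped 16-bit status register. The address and the
   meaning of bit 0 are made up for this example; the point is that the
   hardware fixes the width at 16 bits, and uint16_t says so directly. */
#define STATUS_REG (*(volatile uint16_t *)0x40001000u)

int device_ready(void)
{
    return (STATUS_REG & 0x0001u) != 0;  /* bit 0: "ready" flag */
}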

I also see them as useful (but not as important) for interfaces which
will be writing binary data and the specification says the data will be
exactly N bits wide (the routine actually doing the read/write still has
to handle endianness, obviously).
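
For example, a write routine of this sort (a sketch only; the function name
and the little-endian wire format are assumptions, not from any real spec):

#include <stdint.h>
#include <stdio.h>

/* Write a 32-bit value in little-endian byte order regardless of the
   host's endianness, as a hypothetical file format might require. */
int write_u32_le(FILE *f, uint32_t v)
{
    unsigned char buf[4];
    buf[0] = (unsigned char)(v & 0xFFu);
    buf[1] = (unsigned char)((v >> 8) & 0xFFu);
    buf[2] = (unsigned char)((v >> 16) & 0xFFu);
    buf[3] = (unsigned char)((v >> 24) & 0xFFu);
    return fwrite(buf, 1, sizeof buf, f) == sizeof buf ? 0 : -1;
}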

I can also see why a lot of people would find absolutely no use for
fixed width types.

I also think that for many uses the fast types could be even better.
IMHO a lot of the time one wants a type with a minimum range but wants
it as fast as possible. So if you want a minimum of 32 bits and as fast
as possible, long is not really expressing your requirements, since it
*could* be 64 bits on a 32 bit architecture.
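
Something like this states the requirement directly (illustrative only; the
function itself is made up):

#include <stdint.h>

/* "At least 32 bits, as fast as possible", said in the type itself:
   int_fast32_t may be 32 or 64 bits wide, whatever the implementation
   considers fastest. */
int_fast32_t sum_first_n(int_fast32_t n)
{
    int_fast32_t total = 0;
    for (int_fast32_t i = 1; i <= n; i++)
        total += i;
    return total;
}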
 
Charlie Gordon

Keith Thompson said:
If I recall correctly, some Cray vector machines have padding bits in
some of their predefined integer types. I haven't had a chance to
play with Cray's newer systems, so I don't know whether this is still
applicable; the systems I'm familiar with are mostly obsolete.

Cray's current generation systems (XT3) and next generation (XT4) use AMD
64-bit Opteron multi-core chips (up to 30,000 of them!).
 
Charlie Gordon

jacob navia said:
Well, this confirms the proposition that there are no

Alas no, it merely does not disprove the conjecture that there are none.

I am calling for examples, and this is not a valid one.
But surely the C99 folks must have had more than a few in mind when they
mulled the Standard.
 
Richard Heathfield

christian.bau said:
Lots of thanks for that link.

Now a quiz question: How is page 1 (that is the page after page xiv)
different from all the other pages?

In finest rec.puzzles tradition, I've included some "spoiler space", so
that those who would rather try to find the answer themselves can do so
by simply not reading this reply any further.


s p o i l e r   s p a c e
s p o i l e r   s p a c e
s p o i l e r   s p a c e   . . . A N Y . . .
s p o i l e r   s p a c e   . . . M O M E N T . . .
s p o i l e r   s p a c e   . . . N O W . . .

It doesn't contain a mis-spelling of "September".
 
Peter J. Holzer

jacob navia said:
Keith said:
"Al Balmer" <[email protected]> a écrit dans le message de
(e-mail address removed)...
]
Such biests are disappearing fast at this time,
They are?
Can you name counter-examples?

Aside from the DS9000 and some surviving Unisys mainframes, no one
came up with real-life examples of contemporary architectures with
non-two's-complement arithmetic, padding bits, trap representations and
similar obsolete crap.
[...]
I am calling for examples, and this is not a valid one.
But surely the C99 folks must have had more than a few in mind when they
mulled the Standard.


Keep in mind that this was about 10 years ago. The Unisys mainframes you
mentioned above were still very much in use in the late '90s (the last
model was introduced in 1997), but most of them have probably been
replaced by now or will be when the first compilers for C09 appear, so
they probably won't influence the committee much now. (Speaking of C09:
Will there be a C09 and if so, what can we expect of it?)

hp
 
Kelsey Bjarnason

[snips]

from a draft in clc. But I am less bullish than Kelsey about other
people using intN_t. Personally, I think it's a mistake to be led down
that path, but hey, maybe - just *maybe* - there's a real use for these
little critters that I haven't thought of, yeah?

Yeah. Problem is, the closest anyone's come, thus far, to describing such
a situation is to save themselves writing a typedef and a #if block or
two, which, IMO, hardly constitutes an aching need to include new types in
the standard, unless those new types carry some other, significant benefit.

Based on what's been said, they are at best a "gee whiz" feature, and IMO
the language really doesn't benefit from gee whiz features. This may
explain, in part, not so much the lack of conforming compilers, but more
the lack of people screaming about the lack of conforming compilers.
 
Kelsey Bjarnason

[snips]

Fixed width types are, IMHO, the perfect types for memory mapped HW,
since there the HW defines the register as being exactly N bits wide.

Agreed. They are.
I also see them as useful (but not as important) for interfaces which
will be writing binary data and the specification says the data will be
exactly N bits wide (the routine actually doing the read/write still has
to handle endianness, obviously).

Again, agreed.

Here, however, is the question: before the new int types, were you
prevented from writing such code? Or did writing such code impose a
significant burden on you?

Let's emphasize that a bit: was the code of this sort, for the last 30
years or so, such a burden to write that it actually needed the standard
to be modified to include the new types? Bearing in mind the cost and
effort of actually writing new things into the standard?

So far the answer to that, based on what's been offered here, has been a
resounding "no", yet that's about the only justification thus offered for
these types.

Yes, there are benefits. Do those benefits outweigh the costs? Not as
far as anyone here has even begun to hint at.
 
Keith Thompson

Kelsey Bjarnason said:
[snips]
from a draft in clc. But I am less bullish than Kelsey about other
people using intN_t. Personally, I think it's a mistake to be led down
that path, but hey, maybe - just *maybe* - there's a real use for these
little critters that I haven't thought of, yeah?

Yeah. Problem is, the closest anyone's come, thus far, to describing such
a situation is to save themselves writing a typedef and a #if block or
two, which, IMO, hardly constitutes an aching need to include new types in
the standard, unless those new types carry some other, significant benefit.

<stdint.h> introduces no new types. It introduces typedefs, which are
aliases for existing types.

They exist for convenience. Yes, you can define your own typedefs if
you need them -- and your typedefs are likely to conflict with
somebody else's typedefs created for (nearly) the same purposes.

Another potential problem with creating your own typedefs is a lack of
generality. For example, someone (not necessarily you) might write:

#if defined __PLATFORM1__ || defined __platform2__
typedef unsigned int int32;
#elif defined _Platform3 || defined __PLATFORM4
typedef unsigned long int32;
#else
#error "Haven't tried this platform yet"
#endif

which breaks when you want to port the code to Platform 5, or when a
new release of Platform 4 makes unsigned long 64 bits wide.
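
Whereas with <stdint.h> the same intent can be stated in one place and the
implementation keeps it right (assuming, of course, that the target
provides the type at all):

#include <stdint.h>

/* One line, and no platform list to maintain: uint32_t either exists and
   is exactly 32 bits, or the header omits it and the code fails to
   compile instead of silently misbehaving. */
typedef uint32_t my_u32;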

Based on what's been said, they are at best a "gee whiz" feature, and IMO
the language really doesn't benefit from gee whiz features. This may
explain, in part, not so much the lack of conforming compilers, but more
the lack of people screaming about the lack of conforming compilers.

How many people here have said that they'd use the new typedefs in the
right circumstances?
 
Kelsey Bjarnason

[snips]

How many people here have said that they'd use the new typedefs in the
right circumstances?

So far, a couple who _also_ note they'd only do this on implementations
where they knew, ahead of time, exact-sized types of the right size
already exist (thus largely negating the utility of said types in the
first place) and myself (and perhaps another one or two) who have pointed
out that had such types actually been made useful, we'd use 'em.
 
Ian Collins

Kelsey said:
[snips]

How many people here have said that they'd use the new typedefs in the
right circumstances?

So far, a couple who _also_ note they'd only do this on implementations
where they knew, ahead of time, exact-sized types of the right size
already exist (thus largely negating the utility of said types in the
first place) and myself (and perhaps another one or two) who have pointed
out that had such types actually been made useful, we'd use 'em.

They can only be useful on a platform that supports them. Code that
requires fixed size types is also only useful on a platform that
supports them. The two go hand in hand.
 
Peter J. Holzer

Yeah. Problem is, the closest anyone's come, thus far, to describing such
a situation is to save themselves writing a typedef and a #if block or
two, which, IMO, hardly constitutes an aching need to include new types in
the standard, unless those new types carry some other, significant benefit.

The significant benefit is imho that these types actually document a few
aspects of their type.

If a programmer uses "int", what do you know that he expects?

* At least 16 bits?

* At least 32 bits? (non-portable, but not unreasonable)

* Exactly 16 or exactly 32 bits? (even less portable)

* No specific size but "large enough", on the assumption that the
type will be larger on bigger/faster computers?

* The fastest integer type?

* ...

If he uses "int_least32_t" you know that he wants the smallest integer
type with at least 32 bits. You could express that before with

#include <limits.h>

#if SCHAR_MAX >= 2147483647
typedef signed char my_type;
#elif SHRT_MAX >= 2147483647
typedef short my_type;
#elif INT_MAX >= 2147483647
typedef int my_type;
#else
typedef long my_type;
#endif

But that's not very readable.
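
The <stdint.h> spelling of the same requirement is a one-liner (shown here
just for comparison):

#include <stdint.h>

/* Same requirement as the #if chain above: the smallest signed type with
   at least 32 bits, selected by the implementation. */
typedef int_least32_t my_type;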

Also having a fixed, extensible scheme for defining integer sizes
removes the need for kludges like "long long". I do hope that when the
committee sees the need for standardizing a 128 bit type, they won't
introduce long long long, but simply make int_least128_t and
int_fast128_t mandatory.

Based on what's been said, they are at best a "gee whiz" feature, and IMO
the language really doesn't benefit from gee whiz features. This may
explain, in part, not so much the lack of conforming compilers, but more
the lack of people screaming about the lack of conforming compilers.

I very much doubt that <stdint.h> is a significant obstacle for
achieving C99 conformance. It's just a bunch of typedefs and defines,
which can probably be written in an hour or so. So while it may not be a
huge leap forward in the evolution of C, it is very cheap to implement.
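
Indeed, the heart of it is little more than this kind of thing (a sketch
only; the widths shown assume a typical ILP32 target, and a real header
must of course match its own implementation):

/* Fragment of a hypothetical <stdint.h> for an ILP32 platform. */
typedef signed char        int8_t;
typedef unsigned char      uint8_t;
typedef short              int16_t;
typedef unsigned short     uint16_t;
typedef int                int32_t;
typedef unsigned int       uint32_t;
typedef long long          int64_t;
typedef unsigned long long uint64_t;

#define INT32_MIN  (-2147483647 - 1)
#define INT32_MAX  2147483647
#define UINT32_MAX 4294967295u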

(Personally, I much preferred the proposal to allow types like "signed
least int:18 x" for a variable which needs a range of at least +/- 100000,
since int_least18_t is unlikely to exist.)

hp
 
Richard

Peter J. Holzer said:
The significant benefit is imho that these types actually document a few
aspects of their type.

And with that you hit the nail on the head.

When I read something "like" (hypothetical example)

8BITS my8Bits;

I kind of get a *real* idea about the data structures/elements being
talked about. It seems to me that the sole reason (or 99% of the reason)
this group exists is for people to lecture people on not "assuming"
certain things about data element size. But guess what? In the real
world programmers do just that. And the person debugging and maintaining
that code can get a big leg up in the understanding stakes by having
that made explicit in the code.
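
A made-up comparison of the two styles (the names here are invented):

#include <stdint.h>

uint8_t       frame_flags;   /* intent is visible: exactly 8 bits (where provided) */
unsigned char frame_flags2;  /* at least 8 bits; the intended width is anyone's guess */
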
If a programmer uses "int", what do you know that he expects?

* At least 16 bits?

* At least 32 bits? (non-portable, but not unreasonable)

* Exactly 16 or exactly 32 bits? (even less portable)

* No specific size but "large enough", on the assumption that the
type will be larger on bigger/faster computers?

* The fastest integer type?

* ...

If he uses "int_least32_t" you know that he wants the smallest integer
type with at least 32 bits. You could express that before with

#include <limits.h>

#if SCHAR_MAX >= 2147483647
typedef signed char my_type;
#elif SHRT_MAX >= 2147483647
typedef short my_type;
#elif INT_MAX >= 2147483647
typedef int my_type;
#else
typedef long my_type;
#endif

But that's not very readable.

Also having a fixed, extensible scheme for defining integer sizes
removes the need for kludges like "long long". I do hope that when the
committee sees the need for standardizing a 128 bit type, they won't
introduce long long long, but simply make int_least128_t and
int_fast128_t mandatory.

Based on what's been said, they are at best a "gee whiz" feature, and IMO
the language really doesn't benefit from gee whiz features. This may
explain, in part, not so much the lack of conforming compilers, but more
the lack of people screaming about the lack of conforming compilers.

I very much doubt that <stdint.h> is a significant obstacle for
achieving C99 conformance. It's just a bunch of typedefs and defines,
which can probably be written in an hour or so. So while it may not be a
huge leap forward in the evolution of C, it is very cheap to implement.

(Personally, I much preferred the proposal to allow types like "signed
least int:18 x" for a variable which needs a range of at least +/- 100000,
since int_least18_t is unlikely to exist.)

hp
 
Ian Collins

Peter said:
The significant benefit is imho that these types actually document a few
aspects of their type.
Another benefit I forgot to mention earlier is that they help disambiguate
legacy library functions, which is possibly one reason they are required by
POSIX.

System library functions that previously assumed a 32 bit int or long
(the networking function htonl, for example) only make sense on a 64 bit
platform when their parameters are redefined using a fixed-width type
such as int32_t.
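
For example (the prototype is the POSIX one from <arpa/inet.h>; the little
wrapper function is just for illustration):

#include <stdint.h>
#include <arpa/inet.h>   /* POSIX: uint32_t htonl(uint32_t hostlong); */

/* On an LP64 system 'unsigned long' is 64 bits, so a prototype written in
   terms of 'long' would no longer describe a 32-bit network quantity; the
   fixed-width parameter keeps the interface unambiguous. */
uint32_t to_network_order(uint32_t host_value)
{
    return htonl(host_value);
}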
 
Philip Potter

Kelsey said:
[snips]

How many people here have said that they'd use the new typedefs in the
right circumstances?

So far, a couple who _also_ note they'd only do this on implementations
where they knew, ahead of time, exact-sized types of the right size
already exist (thus largely negating the utility of said types in the
first place)

If you believe that, then you misidentify the utility of the types.

The types were never meant to be general fixed-width types available on
every platform. That is not their purpose, and the fact that they aren't
available on all platforms does not negate their utility, because their
utility was never meant to be providing fixed-width types portably to every
platform.

Phil
 
Kelsey Bjarnason

Kelsey said:
[snips]

How many people here have said that they'd use the new typedefs in the
right circumstances?

So far, a couple who _also_ note they'd only do this on implementations
where they knew, ahead of time, exact-sized types of the right size
already exist (thus largely negating the utility of said types in the
first place) and myself (and perhaps another one or two) who have pointed
out that had such types actually been made useful, we'd use 'em.

They can only be useful on a platform that supports them. Code that
requires fixed size types is also only useful on a platform that
supports them. The two go hand in hand.

Why is it so barking difficult for some folks to grasp that there is a
difference between "needs" and "would benefit from"?

Code which *needs* fixed-sized types may only work on implementations
where such sizes exist natively.

Code which could _benefit from_ such types could run anywhere such types
existed - even if not native to the system.

ISTR I've said this now about six times.
 
Al Balmer

Arithmetic can be relied upon more readily than representation.

My point is that computers are used to do other things than
arithmetic. Masking a 16-bit register to 8 bits does not make it an
8-bit register.
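
A tiny sketch of that point (names and values invented here):

#include <stdint.h>

/* Masking restricts the value, not the object: 'reg' is still a 16-bit
   object after the mask; only its value range has changed. */
void mask_example(void)
{
    uint16_t reg = 0xABCDu;
    uint16_t low = reg & 0x00FFu;   /* fits in 8 bits, but the type is still 16 bits */
    (void)low;
}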
 
Al Balmer

You can't really do that portably. If you need bit addressability to your
hardware, you are beyond the scope of the C language.

Unless you have the stdint types. That's what we're talking about.
Can you name counter-examples?
There are many wonders in the world of embedded computing.
 
