Flash Gordon
Ian Collins wrote, On 13/09/07 07:38:
Agreed.
However, many 'programmers' assume int is exactly 32 bits and use it as
such, thus making their code non-portable. So Chuck's argument could also
be used to argue that int should be removed from the language with
absolutely NO change apart from replacing "those fixed width types" with
"int". The same applies to lots of the rest of the language.
All of my C these days is either embedded or drivers, where size does
matter.
Over the past 20-odd years of embedded development I must have seen
dozens of different typedefs or defines for fixed width types, including
such abominations as WORD. So I was really pleased to see the language
standard define a standard naming convention that enables us to consign
this mishmash to history.
My use of such types would have been when I was doing embedded stuff as
well. The kit was all interfaced using a 16-bit serial bus (1553B, if you
are interested) and the data we were interested in was all specified as
being sent as 16-bit scaled numbers, which gives more than sufficient
range and precision (for example, even now screen resolutions are below
65536x65536). I'm sure it is easier to interface a 16-bit peripheral
chip (the 1553B interface chip) to a 16-bit processor than to a 24- or
32-bit processor. You have power consumption AND heat dissipation limits
to meet, so you need to use "small" processors. Hence the chances of the
code ever being run on something that does NOT have a 16-bit integer
type are as near to none as makes no odds. However, being able to
compile/run some of the code which, for simplicity, requires a 16-bit
integer on a PC, Sun workstation or similar is useful, and using uint16_t
simplifies that.