Syntax for union parameter


Rick C. Hodgin

Written out more than once. In the program I tried, it had them in the
source code of the test program. They are also used elsewhere, so they
must be written out in at least one more place.

I wrote SHA-1 to be a stand-alone program if _TEST_ME is defined, and
an #include file if _TEST_ME is not defined. The reason you see the
duplication is for when the program is compiled as a stand-alone. I
did this because I offered up that code into the public domain, as I
received it. Were it part of my library, it would use the common
definition. The same is true for some other stand-alone utility
programs I wrote.
Yes. Someone who knows all that needs to be known to get the definitions
right. And they do that once for all the thousands of developers who
use the definitions.

Now I've installed some cross-compilers. And rather than simply running
xcc or ycc or zcc to get it to compile something with its built-in
features, now I'm worrying about whether or not my include paths are
set up correctly for each, because xcc needs \xcc\include\ and ycc needs
\ycc\include\ and zcc needs \zcc\include\, and so on. The stdint.h in
the wrong place causes the code to fail. Wouldn't happen if the
definition were inside of xcc, ycc, and zcc natively.
If you think this is the same as just writing out your own wherever they
are needed, well, I must bite my tongue.

Chomp away.
Your estimation would be wrong, then.

Sorry to hear that.
You've never heard of Fortran, Pascal, C++, Ada, Haskell or Python?
Or did you just not know enough about them to know what the various
standards say about the fundamental types?

I consider C and C++ to be basically the same in this regard, so they
are already accounted for. I've never used Fortran. I've used Pascal,
but don't remember variable sizes (only used it in college because that's
what they offered). Never used Ada, Haskell, or Python (apart from
downloading programs which were already written which did something, and
then just running those, never editing).

Best regards,
Rick C. Hodgin
 

Rick C. Hodgin

If your compiler is installed anywhere near correctly you shouldn't need
to mess with the include paths for system files. #include <stdlib.h>
should be all you need to do.

If your compiler is not installed correctly all bets are off for any
language.

On my system I have Microsoft's C Compiler 6.0, Visual C++, and MinGW
(the GCC compiler toolchain on Windows) installed. They are all installed
properly; they all have include directories, lib directories, and so on.

When I go to my C:\>

Best regards,
Rick C. Hodgin
 

Rick C. Hodgin

When I go to my C:\> prompt ... where are my environment variables pointing?
If I just type "gcc foo.c" where will it pull its include files from? Do I
now need to set up the include file locations with every compile line?

There are a lot of ducks to line up in a row to get C code to compile. I'm
going to try to change that (at the sacrifice of a little bit of speed).

Best regards,
Rick C. Hodgin
 

Keith Thompson

BartC said:
Why are they optional then? If I leave out stdint.h, I get: "error: unknown
type name 'int32_t'". Doesn't look like it's an essential, fundamental part
of the language!

And if you leave out <stdio.h>, you can't refer to type FILE. Some
features are defined in the core language, others in the standard
library. The details of which goes where are not necessarily always
entirely logical. A lot of the reasons are historical, including
a strong desire to avoid having new standards break old code.

I don't recall anyone claiming that C is the epitome of clean
language and library design. It isn't.

<stdint.h> is not optional for any implementation claiming
to conform to C99 or later. And it's not hard to provide
an equivalent for pre-C99 compilers; see, for example,
<http://www.quut.com/c/q8/index.html>. (Except that you can't
provide int64_t on pre-C99 compilers that don't have a 64-bit
integer type.)

As for int32_t being optional, that's just because the C standard
doesn't require all implementations to support 32-bit 2's-complement
integers. Systems that do support them must define int32_t --
and systems that don't can still have conforming C implementations.
I might use 'int32_t' for example, to guarantee a certain bitwidth, when
using 'int' would be just too vague. What's odd is then finding out that
int32_t is defined in terms of int anyway!

What's odd about that? It might be defined as int on one compiler, and as
long int on another. The whole point is that you, as a programmer,
don't have to care how it's defined.
It's less common for essential primitive types to be defined in a standard
library.

The essential primitive types in C are char, short, int, long,
long long, et al. They happen to be of sizes that can vary from one
implementation to another. This kind of flexibility was absolutely
necessary when C was first defined; see the table of sizes near the
beginning of K&R1. Redefining the fundamental types in a later
standard would have broken existing code. Defining fixed-size
types in a new standard header was the most reasonable approach.
(It would have been nice if it had been done in C89 rather than in
C99, but it's a bit late to complain about that now.)
Not really. It's just another entry in a symbol table which happens to mean
the same as the equivalent int. However having them in the compiler *would*
simplify (1) the distribution by not needing stdint.h, and (2) a million
user programs which don't need to explicitly include stdint.h.

C implementations already must distribute a number of header files;
one more isn't that big a deal. Similarly, typical programs already
have multiple #include directives. The added cost is non-zero,
but IMHO trivial.
Except they wouldn't; they would use s32 or u32 (or i32 and u32 in my case).
'int32_t' et al are just too much visual clutter in a language which already
has plenty.

What exactly would you have wanted the C standard committee to do?

Would you have wanted the C standard to define {s,u}{8,16,32,64}
as keywords, while keeping the existing char, short, int, long,
and long long as predefined types? Would u8 and unsigned char be
different names for the same type, or distinct types? Would those
specific types be mandatory for all implementations? Would similar
types such as u9 be permitted?

And since existing code, written before the introduction of
<stdint.h>, would often define its own typedefs for fixed-size
types, how would you deal with existing code that defined those
names itself?

Adding the fixed-size types in a new standard header was the best way
to avoid breaking existing code.

If you're defining your own new language, you have the luxury
of defining its type system as cleanly as you like. (Plenty of
such languages have been defined over the years; few have been as
successful as C.) If you're updating an existing language that's
been around since the 1970s, you need to work with what you've got.
 

Keith Thompson

Rick C. Hodgin said:
Duplicated in what way? With regards to stdint.h? Visual Studio 2008
doesn't come with stdint.h. And, from what I've read, even Visual
Studio 2010, which does come with stdint.h, uses u_int32_t, rather than
uint32_t, for unsigned, so a duplicate typedef is required for code
there as well. And, I'll admit that Microsoft's C99 support is lacking,
so no surprise there.

I don't know where you read that, but it's not true (as you presumably
could have checked for yourself). I just checked with my copy of
Microsoft Visual C++ 2010 Express (which includes a C compiler), and it
defines uint32_t, not u_int32_t, in <stdint.h>.

Are you unable to rely on having VS 2010 or later? If so, we can
offer advice on how to work around the shortcomings of VS 2008 (in
fact, we already have). If not, then you already have <stdint.h>.
Someone had to check the ones in stdint.h. And I would estimate also
that any self-respecting developer would check those. In fact, when
I run configure scripts on Linux source files to build some version of
an application, I almost always see "checking to see integer size" and
other similar messages during the build script.

Of course someone had to check the contents of stdint.h, namely the
authors of the implementation. Someone also had to check the rest
of the standard library, and that the compiler works correctly.
Implementers do this so you don't have to.

The GNU autotools (which is where most configure scripts come
from) are able to work with pre-C99, and probably even pre-ANSI,
compilers.
People know these things vary like hairdos ... so they must always
test for them.

If you want to spend time checking that int32_t is really a 32-bit
signed integer type, go ahead. Personally, I don't bother; it's such an
unlikely problem that it's not worth my time to worry about it. If
int32_t were built into the core language (with any non-clunky name you
like), there would be no more or less reason to worry about it being
defined incorrectly.
I've never had another language where fundamental data types are of
variable size. From assembly through Java, they are a known size.
Only in the land of C, the home of the faster integer for those
crucial "for (i=0; i<10; i++)" loops, do we find them varying in
size.

I think Java was the *first* language I encountered (other than
assembly) that defined fixed sizes for the fundamental types.
And of course assembly isn't a single language; not just the operand
sizes, but the entire language, varies drastically from one system
to another. I've worked with plenty of languages that don't define
fixed sizes (Pascal and Ada, for example).

Perhaps defining fixed-size 8, 16, 32, and 64-bit types is the wave
of the future. C did not originally define such types, but it has,
for all practical purposes, since 1999. (We know you don't like
how it was done; you can stop repeating that point if you like.)
 

Ben Bacarisse

Rick C. Hodgin said:
I wrote SHA-1 to be a stand-alone program if _TEST_ME is defined, and
an #include file if _TEST_ME is not defined. The reason you see the
duplication is for when the program is compiled as a stand-alone. I
did this because I offered up that code into the public domain, as I
received it. Were it part of my library, it would use the common
definition. The same is true for some other stand-alone utility
programs I wrote.

And with _TEST_ME (a reserved identifier) defined it does not compile
because it's missing the types. Someone else will have to type them in.
That's duplicated even if you, oddly, force others to do the
duplication.

I investigated a bit more and the originals from which you made your
copy used the standard types, so you at least knew of their existence
all along. Your contribution was to remove all reference to them and
substitute your own typedefs -- the ones that need duplicating. If your
compiler is too old to have stdint.h, you should simply have provided
the required definitions.
Now I've installed some cross-compilers. And rather than simply running
xcc or ycc or zcc to get it to compile something with its built in
features, now I'm worrying about whether or not my include paths are
setup correctly for each, because xcc needs \xcc\include\ and ycc needs
\ycc\include\ and zcc needs \zcc\include\, and so on. The stdint.h in
the wrong place causes the code to fail. Wouldn't happen if the
definition were inside of xcc, ycc, and zcc natively.

How do you know that? A compiler with built-in types might very well
still need files in the right place to work properly -- especially a
cross compiler. Anyway, the argument from a broken compiler install is
a very weak one. What will RDC do to prevent malfunction when it's
not properly installed -- as a cross compiler? Straws and grasping come
to mind.
Chomp away.

I take it that means you don't contest the point. Or maybe you are just
getting bored with being right all the time?
Sorry to hear that.

You are sorry that I have said something wrong or you are sorry that I
said something right, or are you just sorry that I said anything at all?
If you don't want to comment on something, it's simpler just to cut it
from your reply.
I consider C and C++ to be basically the same in this regard, so they
are already accounted for.

OK, take C++ out and add, I don't know..., ML, Simula, Ruby, Eiffel
(and, for Robert, Algol). The exact number of cases is hardly the
issue. Your experience is rather limited and that means you should be
cautious about making claims about languages "from assembly through
Java" as if that were some wide range.

<snip>
 

Rick C. Hodgin

I don't know where you read that, but it's not true (as you presumably
could have checked for yourself). I just checked with my copy of
Microsoft Visual C++ 2010 Express (which includes a C compiler), and it
defines uint32_t, not u_int32_t, in <stdint.h>.

I do not use Visual Studio 2010 or later, but only Visual Studio 2008,
and Visual Studio 2003. I own personal copies of those applications,
and I am using them for all development. I would switch to VS2003
exclusively, but it is lacking the Code Preview Window that VS2008 has.

I read it online when I tried to include stdint.h and it wasn't found.
I searched for "Visual Studio stdint.h" and found that it was only
included in VS2010 and later, and that there were alternatives.
Someone on Stack Overflow (or other similar site) posted a question
about u_int32_t being defined, but not uint32_t. I did not test it,
and just assumed it was the case.
Are you unable to rely on having VS 2010 or later? If so, we can
offer advice on how to work around the shortcomings of VS 2008 (in
fact, we already have). If not, then you already have <stdint.h>.

I pointed out in a previous message the URL from someone who has made a
stdint.h designed for VS2008 and earlier. I do not plan on using it
as I already have my own typedefs in place.

Best regards,
Rick C. Hodgin
 

Rick C. Hodgin

And with _TEST_ME (a reserved identifier) defined it does not compile
because it's missing the types. Someone else will have to type them in.
That's duplicated even if you, oddly, force others to do the
duplication.

The #ifdef block after _TEST_ME defines those types. They are the types
for Microsoft's C/C++ compilers.
I investigated a bit more and the originals from which you made your
copy used the standard types, so you at least knew of their existence
all along. Your contribution was to remove all reference to them and
substitute your own typedefs -- the ones that need duplicating. If your
compiler is too old to have stdint.h you should simply have provided the
required definitions.

I did not realize those types came from stdint.h, but did recognize they
were a particular size. I assumed someone was using a particular size
type on purpose. To keep it consistent with the rest of my code, I
converted those references to my naming conventions, and ran the tests.
How do you know that? A compiler with built-in types might very well
still need files in the right place to work properly -- especially a
cross compiler. Anyway, the argument from a broken compiler install is
a very weak one.

That isn't my argument. None of the compiler installs are broken. They
are all correct. However, because there are so many locations with the
different include files used for their version, forgetting to point to
the correct location could result in code that compiles correctly, yet
is errant in operation.
What will RDC do to prevent malfunction when it's not properly
installed -- as a cross compiler? Straws and grasping come
to mind.

RDC uses a notably different ABI than other platforms. It is based on
the model I designed for Exodus. It will include everything it supports
as built-in, first-class citizens. When referenced in code there will
not be a need to include any header files, or anything else. What you
get with the standard libraries will be available, and when used in
source code those components will be included and linked at launch.
I take it that means you don't contest the point. Or maybe you are
just getting bored with being right all the time?

There are obvious advantages to using stdint.h, though I would still
define my own typedefs atop those clunky names so as to make them into
a usable form. But it's not required. I was able to accomplish the
same thing without knowing anything about C99 or the existence of
int32_t and so on. I came up with a natural conclusion because the
lack of such a feature in C is so overtly obvious. Knowing about
stdint.h would've saved me time, but not much over the years because
the time it took me to investigate what size each was could be
measured in minutes.
You are sorry that I have said something wrong or you are sorry that I
said something right, or are you just sorry that I said anything at all?
If you don't want to comment on something, it's simpler just to cut it
from your reply.

I am sorry to hear that developers would not take the time to check
something like that, at least the first time through, and especially
when migrating to a new system, or a new compiler.
OK, take C++ out and add, I don't know..., ML, Simula, Ruby, Eiffel
(and, for Robert, Algol). The exact number of cases is hardly the
issue. Your experience is rather limited and that means you should be
cautious about making claims about languages "from assembly through
Java" as if that were some wide range.

It is a wide range. Assembly is the machine level. Java is a virtual
machine, originally a fully interpreted language. The whole scope, from
the hardware end to the entirely-software end, has fixed sizes.

My scope in computer languages is limited. I did nearly all of my
development in assembly, xbase, and C/C++, while also writing my own
compilers, interpreters, operating system, in assembly and C. I have
always had a particular vision in mind for what a language should be.
I came to that vision by examining the x86 CPU and asking myself the
question, "Knowing what I know about its design, what would a computer
language designed to run on that hardware need to look like?" And I
went from there. It's the same question I asked myself about how to
build my operating system. I didn't look at existing standards, or
current designs. I looked at the hardware, and then went up from there.

Over the years, and migrating into understanding Itanium later, and ARM
more recently, I am exceedingly glad to see that the design I had for
Exodus, and for my compiler, are exactly in line with what is required
at the machine level. It provides high-level abilities through the
C-like language, yet is low-level enough, close to the machine, that
many features which are hidden away from C source code today can be
employed.

Time will tell. I am one man alone working on these projects. It will
be God Himself who allows me to complete them, as I also have a full-time
job, a two-hour per day commute, and family time which comes before my
work on Visual FreePro, the RDC compiler framework, the RDC programming
language, the IDE, debugger, and virtual machine. It's a lot of work for
one man. I keep asking people to help me, but because I am writing this
for Jesus Christ, and not to get rich or for some other non-Jesus Christ-
based reason, nobody is willing to help me. So, as I say, it will be God
alone who grants me the ability to complete this project.

It's interesting that about two hours ago I said this prayer to myself:
Dear God, I desire to complete these projects. I desire to complete the
work I have started. But beyond that, I desire to serve you with my life.
I would rather forgo everything I have started with these projects and
walk away knowing in my heart that I am serving you with my life, than
to pursue these projects knowing that I am walking away from you with my
life.

My heart is focused on serving God. And I desire it more than anything
else. Even more than completing this Village Freedom Project I began
back in July, 2012 with the current version of Visual FreePro I am
pursuing (virtual machine, RDC compiler framework, integrated IDE,
debugger, and plugin framework).

Best regards,
Rick C. Hodgin
 

Geoff

When I go to my C:\> prompt ... where are my environment variables pointing?

What? Where is your glorious IDE?

You shouldn't be in the root to compile or develop anything. You
should be in the document tree for the kit you are using. For Visual
Studio 2010 that tree is C:\Users\<Username>\Documents\Visual Studio
2010\Projects\<projectdir>.

I have news for you: VC 6.0 and MinGW are not cross compilers; they are
hosted environments.

VC 6.0 keeps its own environment variables separate from other MS
tools like the SDK, and Visual Studio 2008 doesn't use environment
variables for its paths; they are in the IDE settings. The only EVs VS
20xx cares about are the ones pointing to the binaries, and if you
install each version correctly, using the defaults, they are each in a
unique path. The only time you needed to worry about those paths was
when mixing VS 6.0 with the Platform SDK, but that is pre-200x Visual
Studio.

If you are using Win7/64 you can't run VS 6.0 on it except in the
Windows XP VM and the EVs will be isolated there.
If I just type "gcc foo.c" where will it pull its include files from? Do I
now need to setup the include file locations with every compile line?

It pulls them from its own unique path. AFAIK there is no native GCC
for Windows; MinGW is a virtual environment with paths separate
from Windows, and the MinGW window is not the cmd prompt window.
There are a lot of ducks to line up in a row to get C code to compile. I'm
going to try to change that (at the sacrifice of a little bit of speed).

Yawn.
 

Ian Collins

Rick said:
My scope in computer languages is limited. I did nearly all of my
development in assembly, xbase, and C/C++, while also writing my own
compilers, interpreters, operating system, in assembly and C. I have
always had a particular vision in mind for what a language should be.
I came to that vision by examining the x86 CPU and asking myself the
question, "Knowing what I know about its design, what would a computer
language designed to run on that hardware need to look like?" And I
went from there. It's the same question I asked myself about how to
build my operating system. I didn't look at existing standards, or
current designs. I looked at the hardware, and then went up from there.

Well there you go, your design goals are pretty much orthogonal to the
design goals of C and C++. Imagine how successful C would have been if
it had been designed to run exclusively on a PDP-7?

The undeniable success of C comes from it not being constrained to any
particular hardware or fixed size data types.
 

Rick C. Hodgin

When I go to my C:\> prompt ... where are my environment variables pointing?

What? Where is your glorious IDE?
LOL!

You shouldn't be in the root to compile or develop anything. You
should be in the document tree for the kit you are using. For Visual
Studio 2010 that tree is C:\Users\<Username>\Documents\Visual Studio
2010\Projects\<projectdir>.

LOL! LOL! And again I say LOL! :)
I have news for you: VC 6.0 and MinGW are not cross compilers; they
are hosted environments.

LOL! LOL! LOL!
VC 6.0 keeps its own environment variables separate from other MS
tools like the SDK, and Visual Studio 2008 doesn't use environment
variables for its paths; they are in the IDE settings. The only EVs VS
20xx cares about are the ones pointing to the binaries, and if you
install each version correctly, using the defaults, they are each in a
unique path. The only time you needed to worry about those paths was
when mixing VS 6.0 with the Platform SDK, but that is pre-200x Visual
Studio.

That "whooshing" sound you heard when replying to my posts, Geoff, was
my point going right past you. :) LOL! Oh my goodness. Too funny. :)
If you are using Win7/64 you can't run VS 6.0 on it except in the
Windows XP VM and the EVs will be isolated there.

LOL! LOL! LOL! LOL! I'm dying over here. I'm dying. LOL! LOL!

FWIW, I run Windows 2000 Professional, or Windows Server 2003 for all of
my personal development, using Visual Studio 2003, or Visual Studio 2008.
I use Windows 7/64 and Visual Studio 2008 at my job.
It pulls them from its own unique path. AFAIK there is no native GCC
for Windows; MinGW is a virtual environment with paths separate
from Windows, and the MinGW window is not the cmd prompt window.

That's not MinGW. That's CYGWIN. MinGW is a native GCC for Windows. I
run GCC from \mingw\bin\gcc.exe. If it loads some virtual environment
after loading gcc.exe (using it as some kind of Windows stub) ... that is
beyond my knowledge. However, I do invoke gcc.exe directly from the C:\>
prompt (no matter what actual directory I might be in), and it generates
code which runs in Windows itself (invoked from the C:\> prompt (no matter
what actual directory I might be in)). Again, if it uses that .exe as
some kind of stub to load a virtual environment ... that's a separate
issue.

I know with CYGWIN I've had issues in the past. It was the GCC tool I
always used on Windows. It had its own linux-like environment with bash,
and ran in its own virtual area. That is definitely not the case with
MinGW's GCC. I was able to execute those tools alongside Microsoft's
cl.exe compiler, and link the generated .obj files, as per the code I
posted in the Non-constant constant thread. I had to use the COFF
output format from GCC, which worked correctly with Microsoft's linker.

Oh ... I thank you for this post, Geoff. I haven't laughed that much
in a while. It is good to laugh like that. :)

Best regards,
Rick C. Hodgin
 

Rick C. Hodgin

Java has never been an interpreted language. The original
implementations of the JVM all strictly interpreted the byte codes

Byte codes are interpreted. They are not native operations and exist
inside the virtual machine program.
... as opposed to more modern JVMs that just-in-time compile many of the
byte codes and execute them in that form (although mostly, if not all,
JVMs fall back on byte code interpretation in some cases). But the
language has always been compiled (to byte codes).
Yup.


Don't you think that learning what others have done before, and why,
and how well it worked (or didn't), might improve your ability to
design a new language?

No. I've tried to learn other languages at various times. It's why I
have not moved on to use other languages. They have things which, because
of my knowledge of assembly and how it is possible to compute data there,
are lacking, obtuse, or backward.

My goals have always been to bring forward a language which makes sense.
And I pray with RDC to do exactly that. :)

Best regards,
Rick C. Hodgin
 

David Brown

Why are they optional then? If I leave out stdint.h, I get: "error: unknown
type name 'int32_t'". Doesn't look like it's an essential, fundamental part
of the language!

If you leave out the definition of "main", you will also get errors when
compiling and linking. Does that mean "main" is not an essential part
of the language?

The types defined in <stdint.h> are like any other part of the
standards-defined library - the standard says what /must/ be
implemented, and what /may/ be implemented (with rules about how any
particular thing must work if it /is/ implemented). But it does not
mandate that the programmer /must/ use them! So just like in virtually
every other programming language, you need to declare that you are using
certain features, such as by writing #include <stdint.h>.

I could well understand if you thought that it would be nice to have
certain common header files included automatically. However, people
would disagree about which headers to consider special in this way, and
it is easier to be consistent if they are all treated like any other
header file.

I might use 'int32_t' for example, to guarantee a certain bitwidth, when
using 'int' would be just too vague. What's odd is then finding out that
int32_t is defined in terms of int anyway!

If you want a fixed bitwidth, then using int32_t is the best way to do
it. Why do you care how it is implemented? The implementation is,
It's less common for essential primitive types to be defined in a standard
library.

It is in fact very common for "essential" types to be part of the
standard library. It makes almost no difference to the language or its
use whether it is "int" or "int32_t" that is the "primitive" type, and
it is merely a matter of history that has determined that "int" is the
native, "primitive" type here.
Not really. It's just another entry in a symbol table which happens to mean
the same as the equivalent int. However having them in the compiler *would*
simplify (1) the distribution by not needing stdint.h, and (2) a million
user programs which don't need to explicitly include stdint.h.

<stdint.h> contains quite a number of different types and constants.

(1) Distributing an extra header file is far easier for the implementer,
and far more maintainable, than putting these types directly in the
compiler.

(2) A million user programs don't include <stdint.h>, because fixed size
types are not actually needed in most code. Some kinds of code make
heavy use of them (I use them a lot), but most *nix and Windows C code
does not need them. So having them in the compiler would mean
unnecessary entries in the compiler's symbol tables (native types need
to be there too) - not that it would make a measurable difference.
Except they wouldn't; they would use s32 or u32 (or i32 and u32 in my case).
'int32_t' et al are just too much visual clutter in a language which
already has plenty.

There are many ways to name your fixed size integers - a great many
people, including me, dislike such short names as "s32". I would
probably get used to them if they were the standard, but I don't like
them and would never have picked them.

I might have picked "int32", "uint32", etc., without the "_t" suffix.
But I certainly don't mind it or find it "visual clutter". It's a
matter of personal preference, of course.
 

Rick C. Hodgin

Well there you go, your design goals are pretty much orthogonal to the
design goals of C and C++. Imagine how successful C would have been if
it had been designed to run exclusively on a PDP-7?

I cannot speak to that point since it didn't happen that way. I can speak
to the point that the C language itself is very expressive of low-level
abilities in a high-enough-level of understanding to make it usable by
people. I think that feature in and of itself would've allowed it to
cross over to other CPUs, even if the standards were indicative of some
particular architectures which may not be like the others. I think we
would've simply had D a lot sooner, where people modified those standards
to meet the needs of other CPUs.

Maybe I should call my language E.
The undeniable success of C comes from it not being constrained to any
particular hardware or fixed size data types.

Perhaps. We will never know if that is true because we cannot sample
data from the other timelines. :)

Best regards,
Rick C. Hodgin
 

Ian Collins

Rick said:
I cannot speak to that point since it didn't happen that way. I can speak
to the point that the C language itself is very expressive of low-level
abilities in a high-enough-level of understanding to make it usable by
people. I think that feature in and of itself would've allowed it to
cross over to other CPUs, even if the standards were indicative of some
particular architectures which may not be like the others.

Well how about a specific example?

You are keen to specify calling conventions, so consider what happens
when you base that specification on your knowledge of a relatively
register-poor architecture (x86) and impose it on a register-rich architecture
(SPARC or AMD64). Neither of those push to the stack if there are
enough registers to hold parameters (which in the case of SPARC, can be
a large number). Even within the small x86/AMD64 family, optimal
calling conventions are significantly different. If you want to see for
yourself, compile

#include <stdio.h>

int f( int, int );

int main(void)
{
    printf("%d\n", f(3,4) );
    return 0;
}

for 32-bit and 64-bit targets and compare the assembly.
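With gcc, the comparison suggested above might be made along these lines (a sketch; it assumes a multilib-capable gcc, where -m32 needs 32-bit support installed, and that the snippet is saved as call.c):

```shell
# Emit assembly for both targets so the argument-passing can be compared.
gcc -O2 -m32 -S call.c -o call32.s
gcc -O2 -m64 -S call.c -o call64.s
# 32-bit x86 passes 3 and 4 on the stack; x86-64 puts them in registers.
diff call32.s call64.s || true
```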
I think we
would've simply had D a lot sooner, where people modified those standards
to meet the needs of other CPUs.

In other words, fragmentation of the language.
 
D

David Brown

The #ifdef block after _TEST_ME defines those types. They are the types
for Microsoft's C/C++ compilers.


I did not realize those types came from stdint.h, but did recognize they
were a particular size. I assumed someone was using a particular size
type on purpose. To keep it consistent with the rest of my code, I
converted those references to my naming conventions, and ran the tests.

You really should take some courses on basic C, or stick to using tools
that you understand (if there are any, other than the vapourware RDC).
Taking perfectly good code, and then mangling it because you don't know
C, are not using a C compiler, and are incapable of even doing a simple
google on "int32_t", is not good development work.
That isn't my argument. None of the compiler installs are broken. They
are all correct. However, because there are so many locations with the
different include files used for their version, forgetting to point to
the correct location could result in code that compiles correctly, yet
is errant in operation.

/You/ don't have to do any pointing (unless you want to do something
weird - and it is certainly possible to use different libraries with any
given compiler. But I think such advanced usage is out of your depth).
Based on correct installation, the /compiler/ knows where all the
correct include files and libraries are.

I don't know about MSVC++, but for gcc, the compiler, the libraries, the
headers, and usually also tools such as the assembler, linker,
librarian, etc., are all installed in a single tree. It is really easy
to find the right include paths based on the gcc compiler binary. In
fact, I don't believe it could possibly get any easier.

Rick has been clutching at straws since this thread began.
RDC uses a notably different ABI than other platforms. It is based on
the model I designed for Exodus. It will include everything it supports
as built-in, first-class citizens. When referenced in code there will
not be a need to include any header files, or anything else. What you
get with the standard libraries will be available, and upon use in source
code those code requirements will be included and linked during launch.


There are obvious advantages to using stdint.h, though I would still
define my own typedefs atop those clunky names so as to make them into
a usable form. But it's not required. I was able to accomplish the
same thing without knowing anything about C99 or the existence of
int32_t and so on. I came up with a natural conclusion because the
lack of such a feature in C is so overtly obvious. Knowing about
stdint.h would've saved me time, but not much over the years because
the time it took me to investigate what size each was could be
measured in minutes.

There was a time, long ago, when C lacked a convenient way to get
implementation-independent fixed size integers. The powers that be, the
C standards committee, realised that this was something missing and that
people were making their own fixed size typedefs. So as this was a
useful feature, they standardised it for C99. They did so in a way that
was consistent with C historically, with existing C code, and with the
way programmers expect C to work. Apart from totally subjective
opinions about the exact choice of name, they did a perfectly good job.
I am sorry to hear that developers would not take the time to check
something like that, at least the first time through, and especially
when migrating to a new system, or a new compiler.

<stdint.h> is part of the implementation. When I use a new compiler
(and I guarantee you that I have used /many/ more than you have), I rely
on the <stdint.h> type sizes in the same way, and for the same reasons,
as I rely on the compiler generating correct code for "x = y + z;".

There are certainly things I /do/ check with compilers - even things
that are clearly specified in the standards, because compiler
implementers are mere humans and mistakes can happen. But I don't think
any C implementer will have trouble getting the sized integers right -
after all, even /you/ managed to do it in a few minutes.
It is a wide range. Assembly is the machine level. Java is a virtual
machine, originally a fully interpreted language. The scope from the
hardware to the entirely software has fixed components.

It is a /tiny/ range of programming languages. The fact that you even
imagine that Assembly, C, xbase, and a smattering of Java is a "wide
range" shows how ignorant you are.
 
D

David Brown

While that is good to do, I would argue that it really shouldn't be the
only documentation of the limits.

Most programs have documentation, and in that documentation should be a
requirements section, and these requirements should be listed there.
This warns of the issue before you get to actually compiling code.

Fair enough. I just know that in practice, code maintenance is often
done badly, and documentation maintenance is done far worse. So I put
the emphasis in having such assumptions "documented" in source code, and
causing compile-time failures if they are broken. I agree that the
separate documentation should include these assumptions too, but the
code is always the ultimate documentation.
 
B

Ben Bacarisse

Rick C. Hodgin said:
The #ifdef block after _TEST_ME defines those types. They are the types
for Microsoft's C/C++ compilers.

Sorry, I meant *not* defined. (It doesn't compile with _TEST_ME defined
either, but that is because of other problems not related to the types.)

I am sorry to hear that developers would not take the time to check
something like that, at least the first time through, and especially
when migrating to a new system, or a new compiler.

Thanks. Much clearer. Surely if your code needs a known-width type, it
will fail its very first test. I've never found any value in testing
the implementation -- I test my code instead.
It is a wide range. Assembly is the machine level. Java is a virtual
machine, originally a fully interpreted language. The scope from the
hardware to the entirely software has fixed components.

So, what are the languages in this wide range which do as you expect?
None of the ones you listed as knowing fit the bill.

<snip>
 
B

BartC

Keith Thompson said:
What exactly would you have wanted the C standard committee to do?

Would you have wanted the C standard to define {s,u}{8,16,32,64}
as keywords, while keeping the existing char, short, int, long,
and long long as predefined types?

Why not? That range is pretty much universal now, and everyone seems to use
their own private naming schemes anyway, which can itself lead to problems.
If it were all built-in, then they wouldn't arise.

(I've just been sorting a problem with my '#define byte unsigned char' which
somehow clashes with a 'byte' type in windows.h. If I change my definition
to use typedef, then it goes away, but only because C seems to allow the
same typedef twice. The point however is if all these little types were
built-in to the compiler, the problem wouldn't have arisen.)
Would similar
types such as u9 be permitted?

Probably not (but see below).
And since existing code, written before the introduction of
<stdint.h>, would often define its own typedefs for fixed-size
types, how would you deal with existing code that defined those
names itself?

Adding the fixed-size types in a new standard header was the best way
to avoid breaking existing code.

If you're defining your own new language, you have the luxury
of defining its type system as cleanly as you like. (Plenty of
such languages have been defined over the years; few have been as
successful as C.) If you're updating an existing language that's
been around since the 1970s, you need to work with what you've got.

I'm working on a language front-end for C, where the types have to be
compatible. In this front-end, an exact 32-bit signed integer type can be
represented as any of i32, int32, int:32 or int*4 (and written out in C as
i32, which is an alias for int32_t).

Any of these would have been fine choices to adopt in C itself (in fact
something like int:32 is already used for bitfields, and this form can allow
9-bit types to be defined too, although I prohibit it).

It's not hard even for a language that has been around a long time, nor is
it difficult to add some sort of language version control (there are already
options for C99 and so on), and I can't believe it is out of the question
for some refactoring program to pick out and change identifiers in source
code which are now new keywords.

Would u8 and unsigned char be
different names for the same type, or distinct types?

I have had u8 and c8 as distinct types, but there was very little advantage
(and that wouldn't apply in C). I would now make them the same (if char is
going to stay an 8-bit type).
Would those
specific types be mandatory for all implementations?

I don't see a problem with having the reserved words in place. Although not
all the types need be supported.
 
B

BartC

Robert Wessel said:
Java has never been an interpreted language. The original
implementations of the JVM all strictly interpreted the byte codes, as
opposed to more modern JVMs that just-in-time compile many of the byte
codes and execute them in that form (although mostly, if not all, JVMs
fall back on byte code interpretation in some cases). But the
language has always been compiled (to byte codes).

I think these days, 'interpreted' means, at the very least, working on
compiled byte-code, not pure source code. (I doubt Java was dynamic enough
anyway to interpret directly from source.)
 
