The illusion of "portability"


jacob navia

(e-mail address removed) wrote:
[snip]
Arrays whose length is known at compile time are far more efficient to work
with.


I doubt this statement.

On stack based machines, it's nothing more than a subtraction. Whether
the value is passed in or known at compile time makes no difference.

[snip]

I implemented this by making the array a pointer, that
gets its value automagically when the function starts by making
a subtraction from the stack pointer. Essentially

int fn(int n)
{
    int tab[n];
    ...
}

becomes

int fn(int n)
{
    int *tab = alloca(n*sizeof(int));
}

The access is done like any other int *...
 

Tom St Denis

jacob said:
In this group there is a bunch of people who call themselves 'regulars'
and insist on something called "portability".

Portability for them means the least common denominator.

No it means making wise design choices.
Write your code so that it will compile on all the old and broken
compilers, preferably in such a fashion that it can be moved with no
effort from the embedded system in the coffee machine to the 64-bit
processor on your desktop.

Why? I target C99 with my code. To me "portable" means C99. I
consider it a hack if I have to support something outside of it [e.g.
Visual C's lack of long long].
Sure, you can do that. But as you know, there is no free lunch.
You pay for that "portability" by missing all the progress done
since 1989 in C.

C99 programs are portable.

<snip nonsense>

Even with all the latest doodahs of C99 you still are not assured to
have a TTY or a console, file system or even a heap, etc...

Big deal?

No one expects a 3D video game to work on an 8051.

On the other hand, one would expect some non-interface-type routine to
work anywhere.

My own math lib [LibTomMath] has been built on things as low as an 8086
with TurboC all the way up through the various 64-bit servers and
consoles. Without sacrificing too much speed or ANY functionality.

Similarly my crypto code is used pretty much anywhere without hacking
for this compiler, that compiler, etc.

Tom
 

Keith Thompson

John Bode said:
jacob navia wrote: [...]
Not even the classic

int main(void) { printf("hello\n");}

Why?

For instance, if we take that program above and we want to
know if our printf did write something to stdout, we have to write


int main(void) {
    int r = printf("hello\n");
    if (r < 0) {
        // what do we do here ???
    }
}

The error code returned by printf is nowhere specified. There is no
portable way for this program to know what happened.

That's only sort of true; the return value is EOF if an error occurs,
otherwise the value is not EOF. So rewrite the above as

int main(void)
{
    int r = printf("hello\n");
    if (r == EOF)
    {
        /* handle error */
    }

    return 0;
}
[...]

That's incorrect. C99 7.19.6.3p3:

The printf function returns the number of characters transmitted,
or a negative value if an output or encoding error occurred.

The "if (r < 0)" test is correct.
 

J. J. Farrell

jacob said:
I am not telling you that portable programs do not exist or that
it is not worthwhile trying to attain some degree of independence
from the underlying system. I am telling you that (like everything)
portability has some associated COST!

Why are you telling us something that's blatantly obvious, and that we
all know?

Since we're stating the obvious, the question to be asked on each
occasion is whether the COST of portability exceeds the COST of
non-portability. In my experience, for the sort of work I do, it's
always been better to steer hard towards the portable end of the range.
 

Andrew Poelstra

Why not use a wrapper function that will always do the right
thing?

What would such a wrapper do? I've written a few that do things like
attempt to settle for less memory, return memory from a pre-allocated
buffer (if malloc() succeeded, I'd take a little extra while the getting
was good), or in one case I informed the user and gave him the option to
either kill other memory-intensive programs or simply die.

However, the second of those options requires an equivalent wrapper for
free(), because if I'm manually hacking memory around it's far too easy
to end up with UB. The others aren't acceptable in certain situations.

More importantly, no matter what you do, the function calling the
wrapper needs to do essentially the same stuff as a function calling
malloc() directly in the case of a critical memory error.
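One way the "settle for less memory" strategy Andrew mentions could be sketched (the function name and halving policy here are hypothetical illustrations, not from the original post):

```c
#include <stddef.h>
#include <stdlib.h>

/* Hypothetical fallback allocator: if the full request fails, halve
   it until either malloc succeeds or the request drops below a
   caller-supplied minimum (which must be at least 1). The size
   actually obtained is reported through *got; returns NULL if even
   the minimum could not be satisfied. The caller must still handle
   a NULL return -- the wrapper only softens failure, it cannot
   eliminate it. */
static void *malloc_fallback(size_t want, size_t min, size_t *got)
{
    if (min == 0)
        min = 1;                 /* avoid looping forever on malloc(0) */
    while (want >= min) {
        void *p = malloc(want);
        if (p != NULL) {
            *got = want;
            return p;
        }
        want /= 2;               /* settle for less and retry */
    }
    *got = 0;
    return NULL;
}
```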
 

Keith Thompson

Frederick Gotham said:
jacob navia posted:
1) VLAs allow you to precisely allocate the memory the program needs
instead of using malloc (with all its associated problems) or having
to decide a maximum size for your local array, allocating too much
for most cases.

int fn(int n)
{
    int tab[n];
}

allocates JUST what you need AT EACH CALL.


C99 added a few new features to C.
Yes.

C++ added a boat load of new features to C.

Yes and no. C++ has a boat load of features that aren't in C, but it
didn't add them to C; it added them to C++. Yes, I'm being
ridiculously picky about wording, but it is an important distinction.
C++ did not attempt to *replace* C. C99, in a very real sense, did.
Even C++ doesn't have VLA's, because it finds efficiency to be more
valuable.

<OT>
I don't believe that's the reason. The C++98 standard is based on the
C90 standard, which doesn't/didn't have VLAs, and C++ didn't add them.
I would be completely unsurprised if a future C++ standard adopted
VLAs from C99.

On the other hand, the C++ standard library provides other features
that can be used in place of VLAs.
</OT>
 

Keith Thompson

jacob navia said:
(e-mail address removed) wrote:
[snip]
Arrays whose length is known at compile time are far more efficient to work
with.
I doubt this statement.
On stack based machines, it's nothing more than a subtraction.
Whether
the value is passed in or known at compile time makes no difference.
[snip]

I implemented this by making the array a pointer, that
gets its value automagically when the function starts by making
a subtraction from the stack pointer. Essentially

int fn(int n)
{
    int tab[n];
    ...
}

becomes

int fn(int n)
{
    int *tab = alloca(n*sizeof(int));
}

The access is done like any other int *...

If that's literally true, then sizeof tab will yield an incorrect
result.
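Keith's point can be demonstrated directly: under a conforming C99 compiler, sizeof applied to a VLA is evaluated at run time and reports the whole array's size, while the pointer-based translation reports only the pointer size. (This demo function is hypothetical and not lcc-win32's actual code.)

```c
#include <stdlib.h>

/* Returns nonzero iff sizeof reports the full array size for a true
   VLA but only the pointer size for the malloc'd alternative. */
static int vla_sizeof_demo(int n)
{
    int tab[n];                              /* true C99 VLA */
    int *ptr = malloc(n * sizeof(int));      /* the alloca-style translation */
    int ok;

    /* sizeof on a VLA operand is evaluated at run time (C99
       6.5.3.4p2), so these must differ whenever n*sizeof(int)
       differs from sizeof(int *). */
    ok = (sizeof tab == n * sizeof(int)) && (sizeof ptr == sizeof(int *));

    free(ptr);                               /* free(NULL) is harmless */
    return ok;
}
```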
 

Flash Gordon

jacob said:
Keith Thompson wrote:

Well but this group is about STANDARD C or not?

If we do not agree about what standard C is, we can use the standard.

But if we do not even agree what standard C is there can't be
any kind of consensus in this group, you see?

The talk about "Standard C" then, is just hollow words!!!

As you know very well we discuss C99, C95, C90 and even pre-ANSI C when
appropriate. Why do you object so strongly to people not being told when
something is a C99 feature and so not portable to such common
implementations as MS VC++?

Personally I would *far* rather use C99, but I have to support MS VC++
and at least two older versions of gcc and glibc, plus the C library on
another OS that does not support C99 as yet. Oh, and I could be asked at
any time about supporting some other OS for which there might not be a
C99 compiler around. Such situations are very common.

So if you want to discuss something about C99, go ahead. If you want to
tell people that in C99 they can do something, go ahead. However, don't
claim that everyone can use C99 or make false claims about
implementations supporting C99 when they don't.
 

Keith Thompson

Frederick Gotham said:
Keith Thompson posted:

I do this myself in quite a few places.

For instance, if I were allocating upwards of a megabyte of memory, I would
probably take precautions:

int *const p = malloc(8388608 / CHAR_BIT
                      + !!(8388608 % CHAR_BIT));

if(!p) SOS();

But if I'm allocating less than a kilobyte... I probably won't bother most
of the time. (The system will already have ground to a halt by that stage if
it has memory allocation problems.)

In my opinion, that's an extremely unwise approach.

In many cases, the fact that your program is running out of memory
will have no effect on the system as a whole; many systems place
limits on how much memory a single program (<OT>process</OT>) may
allocate, and those limits are typically much smaller than the total
amount of memory available to the system. This is particularly true
for multi-user systems.

You should always check the result of every call to malloc(). If you
don't want to do the check explicitly, write a wrapper.
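Such a wrapper might be sketched as follows (the name xmalloc is a common convention, not a standard function, and aborting is only one possible policy; compare the fallback strategies Andrew Poelstra describes elsewhere in the thread):

```c
#include <stdio.h>
#include <stdlib.h>

/* Minimal malloc wrapper: checks every allocation and terminates
   with a diagnostic on failure, so callers never see NULL. */
static void *xmalloc(size_t size)
{
    void *p = malloc(size);
    if (p == NULL) {
        fprintf(stderr, "fatal: out of memory (%zu bytes requested)\n",
                size);                /* %zu is C99 */
        exit(EXIT_FAILURE);
    }
    return p;
}
```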
 

Keith Thompson

Ben Pfaff said:
One reasonable option may be to flush stdout before exiting the
program, then call ferror to check whether there was an error.
If there was, terminate the program with an error (after
attempting to report it to stderr).

Some of my programs do this, but only the ones that I care about
a lot.

Yes, that's probably better than silently ignoring the error.

One possible drawback is that it doesn't catch the error until the
program is just about to terminate.
 

Al Balmer

(e-mail address removed) wrote:

This is the most important thing I wanted with my previous message.

That we establish a consensus here about what standard C means.
There's no consensus needed about a matter of fact which nobody
contests. However, we can understand that C99 is the current C
standard while also recognizing that it is not universally implemented
and that maximum portability often means that we must forgo some or
all of the new features it introduced. (That's a rhetorical "we" and
does not necessarily include you.)

We (most of us) also understand that portability is a worthwhile goal,
often more important to us and our work than being able to use all the
latest features of the language.
And it can't mean anything else as the *current* C standard.

I have been working for years on a C99 implementation, and I wanted
that at least in this group, which is supposed to be centered around
standard C, we establish that C99 *is* the standard even if we do
not like this or that feature.

Why do you think this needs to be "established"? It's self-evident and
no one is contesting it.

In this thread, what has been contested is much of the ridiculous
rhetoric you indulged in in your initial post. You now seem to have
changed the subject.
 

Keith Thompson

jacob navia said:
(e-mail address removed) wrote:

This is the most important thing I wanted with my previous message.

That we establish a consensus here about what standard C means.

And it can't mean anything else as the *current* C standard.

First, I disagree with the use of the term "legally". ISO is not a
governmental body, and the C standard does not have the force of law.
Nobody is going to arrest a user or an implementer for failing to
conform to it.

Standards do not exist in a vacuum. The purpose of a language
standard is to provide a contract (I don't necessarily mean that in
any legal sense) between users and implementers. If implementers
provide implementations that conform to the standard, and if users
write code that conforms to the standard, then that code will work
correctly with those implementations.

It is a fact (and, IMHO, an unfortunate one) that the C99 standard has
not been as successful as the C90 standard. There are reasons for
this; I won't repeat them here. But the result is that, in the real
world, I can write code that conforms to the C99 standard, but not be
able to get it to work properly on some platforms that I care about.
If I instead write code that conforms to both the C90 and C99
standards, I have a much better chance of getting it to work portably.
I have been working for years on a C99 implementation, and I wanted
that at least in this group, which is supposed to be centered around
standard C, we establish that C99 *is* the standard even if we do
not like this or that feature.

This isn't about whether we like or dislike any particular features of
C99. The issue is that those features *are not as widely available*
as the features defined by the C90 standard.

Your approach seems to be just to ignore this fact, and encourage
users to write C99-dependent code without worrying about portability.
Most of the rest of us, on the other hand, prefer to let people know
what the tradeoffs are.

There's plenty of discussion of C99 here. I regularly post quotations
from the standard; when I do, they're usually from the C99 standard or
from the n1124 draft. If someone posts code that mixes declarations
and statements, we don't say that that's illegal in C; rather we say,
truthfully, that it's legal in C99 but illegal in C90, and *explain
the tradeoffs*.

Back in the early 1990s, I'm sure you would have found plenty of
advice in this newsgroup (or its ancestor, net.lang.c; I don't
remember when the transition was) about programming in K&R C, and
writing code that's legal in both K&R C and ANSI C. The 1990 ISO C
standard had been released, and it was the *official* definition of
the language, but real-world programmers still had to deal with the
fact that it wasn't yet universally supported.

We don't stop talking about an old standard when a new one comes out;
we stop talking about an old standard when it becomes irrelevant. The
C90 standard is still very relevant.

(comp.std.c has an even stronger emphasis on C99, since any future
standards or technical corrigenda will be based on C99, not C90.)
 

Keith Thompson

jacob navia said:
Keith Thompson wrote:

Well but this group is about STANDARD C or not?

If we do not agree about what standard C is, we can use the standard.

But if we do not even agree what standard C is there can't be
any kind of consensus in this group, you see?

The talk about "Standard C" then, is just hollow words!!!

We talk about Standard C (C99) all the time. We're doing so right now.

We also talk about the previous standard, and occasionally about the
de facto standard before that (K&R1, Appendix A).
 

Keith Thompson

Andrew Poelstra said:
What would such a wrapper do?

In the simplest case, it could abort the program with an error message
if malloc() fails. This isn't ideal, but it's certainly better than
ignoring an allocation failure.
 

Keith Thompson

Flash Gordon said:
As you know very well we discuss C99, C95, C90 and even pre-ANSI C
when appropriate. Why do you object so strongly to people not being
told when something is a C99 feature and so not portable to such
common implementations as MS VC++?

I think you meant to drop the first "not" in that last sentence;
"people not being told" should be "people being told".
 

dcorbit

Keith said:
First, I disagree with the use of the term "legally". ISO is not a
governmental body, and the C standard does not have the force of law.
Nobody is going to arrest a user or an implementer for failing to
conform to it.

ANSI is connected to the U.S. government and (along with NIST) is used
to establish standards in the United States. While standard adoption
is 'voluntary' there can still be legal ramifications. For instance,
if I stamp the head of my bolt with a certain number of diamond shapes
on it, that is a claim of adherence to a standard. If the claim is
false, the manufacturer could get sued, possibly go to jail, etc.
Sometimes, formal standards do get adopted with more or less legal
weight, which would vary from country to country and standard to
standard. I can easily imagine legal problems for a C compiler vendor
who claimed in their advertising that their compiler adhered to the
ANSI/ISO C standard but in fact, failed badly on many measures.

Here is an interesting quote that I found:
"Standards, unlike many other technical papers and reports, are
quasi-legal documents. Standards are used as evidence, either to
substantiate or refute points, in courts of law. Standards also become
legal documents if adopted by various governments or regulatory
agencies. When this happens, the content and decisions in a standard
carry more weight, and the process by which they are developed falls
under much more scrutiny, making ANSI accreditation especially
valuable."

I am certainly in wholehearted agreement with the main thrust of your
post, but wanted to point out a nuance of potential legal ramification,
though the documents themselves do not embody law.
[snip]
 

Dik T. Winter

>
> I doubt this statement.
>
> On stack based machines, it's nothing more than a subtraction. Whether
> the value is passed in or known at compile time makes no difference.

If you allocate runtime-length arrays on the stack, you need indirection
if you have to allocate more than one such array at a level (especially
if multi-dimensional arrays are involved). When Algol-60 introduced
runtime-length arrays there were reports written and long discussions
about how to implement them. It was not for nothing that Wirth removed them
in Pascal.
 

Dik T. Winter

> I implemented this by making the array a pointer, that
> gets its value automagically when the function starts by making
> a subtraction from the stack pointer. Essentially

Yup, standard since about 1960 when they were introduced. But:
> int fn(int n)
> {
> int *tab = alloca(n*sizeof(int));
> }
>
> The access is done like any other int *...

So the access is inherently less efficient (and that is what the
discussion was about).
 

Keith Thompson

Keith said:
First, I disagree with the use of the term "legally". ISO is not a
governmental body, and the C standard does not have the force of law.
Nobody is going to arrest a user or an implementer for failing to
conform to it.

ANSI is connected to the U.S. government and (along with NIST) is used
to establish standards in the United States. While standard adoption
is 'voluntary' there can still be legal ramifications. For instance,
if I stamp the head of my bolt with a certain number of diamond shapes
on it, that is a claim of adherence to a standard. If the claim is
false, the manufacturer could get sued, possibly go to jail, etc.
Sometimes, formal standards do get adopted with more or less legal
weight, which would vary from country to country and standard to
standard. I can easily imagine legal problems for a C compiler vendor
who claimed in their advertising that their compiler adhered to the
ANSI/ISO C standard but in fact, failed badly on many measures.

Here is an interesting quote that I found:
"Standards, unlike many other technical papers and reports, are
quasi-legal documents. Standards are used as evidence, either to
substantiate or refute points, in courts of law. Standards also become
legal documents if adopted by various governments or regulatory
agencies. When this happens, the content and decisions in a standard
carry more weight, and the process by which they are developed falls
under much more scrutiny, making ANSI accreditation especially
valuable."

I am certainly in wholehearted agreement with the main thrust of your
post, but wanted to point out a nuance of potential legal ramification,
though the documents themselves do not embody law.
[snip]

Ok, that's a good point.

It seems to me (I'm hardly an expert) that this is partly a specific
case of the more general principle that if you falsely claim
conformance to something, you've committed fraud. For example, if I
sell widgets while claiming that each widget conforms to Ralph's
Pretty Good Widget Standard, I can be sued by my customers if in fact
I've deliberately violated clause 4.2.

On the other hand, that's not all there is to it; an ANSI or ISO
standard undoubtedly has some quasi-legal standing beyond that enjoyed
by any of Ralph's Pretty Good Standards.

Whether the "quasi" or the "legal" part is more significant depends on
what you're talking about.
 

Dik T. Winter

> John Bode wrote:
>
> lcc-win32 has customers under linux, windows and many embedded systems
> without any OS (or, to be more precise, with an OS that is part
> of the compiled program)

Where can I find lcc-win32 for linux?
 
