The illusion of "portability"


jacob navia

In this group there is a bunch of people who call themselves 'regulars'
and insist on something called "portability".

Portability for them means the least common denominator.

Write your code so that it will compile in all old and broken
compilers, preferably in such a fashion that it can be moved with no
effort from the embedded system in the coffee machine to the 64-bit
processor on your desktop.

Sure, you can do that. But as you know, there is no free lunch.
You pay for that "portability" by missing all the progress made
in C since 1989.

Note that, objectively speaking, there is not a single useful
program in C that can be ported to all machines that run the
language.

Not even the classic

int main(void) { printf("hello\n");}

Why?

For instance, if we take that program above and we want to
know if our printf did write something to stdout, we have to write


int main(void) {
    int r = printf("hello\n");
    if (r < 0) {
        // what do we do here ???
    }
}

The error code returned by printf is nowhere specified. There is no
portable way for this program to know what happened.

Since printf returns a negative value for an i/o error OR for a
format error in the format string, there is no portable way to
discriminate between those two possibilities either.

Obviously, network i/o, GUIs, threads, and much other stuff essential
for modern programming are completely beyond the scope of "standard C",
and any usage instantly makes your program non-portable.

This means that effectively 100% of real C software is not portable to
all machines and that "portability" is at best a goal to keep in
mind by trying to build abstraction layers, but no more.

This is taken to ridiculous heights with the polemic against C99, by
some of those same 'regulars'.

They first start yelling about "Standard C", and then... they do not
mean standard C but some other obsolete standard. All that, in the name
of "portability".

Who cares about portability if the cost is higher than "usability"
and ease of programming?


jacob
 

Richard Heathfield

jacob navia said:
In this group there is a bunch of people who call themselves 'regulars'
and insist on something called "portability".

The term "regulars" is a common one to describe those who frequent a forum,
whether it be an IRC channel, a newsgroup, or whatever. You yourself are a
"regular" in comp.lang.c, whether you realise it or not.
Portability for them means the least common denominator.

Wouldn't it be better to ask "them" (whoever "they" are) what they mean by
portability, than to assume it? It is quite likely that the word is used in
slightly different ways by various people; it's not a very portable word.

Sure, you can do that. But as you know, there is no free lunch.
You pay for that "portability" by missing all the progress made
in C since 1989.

There isn't all that much progress in C. What did C99 give us? Mixed
declarations? Sugar. // comments? More sugar. VLAs? More sugar. Compound
literals - sure, they might come in handy one day. A colossal math library?
Hardly anyone needs it, and those who do are probably using something like
Matlab anyway.

The real progress has been in the development of third-party libraries, many
of which are at least a bit cross-platform.
Note that there is objectively speaking not a single useful
program in C that can be ported to all machines that run the
language.

So what should I do with all mine? Delete them? Dream on.
Not even the classic

int main(void) { printf("hello\n");}

Why?

Because the behaviour is undefined. Sheesh.
Obviously, network i/o, GUIs, threads, and much other stuff essential
for modern programming are completely beyond the scope of "standard C",
and any usage instantly makes your program non-portable.

Well, it certainly makes your program /less/ portable. I use wrappers around
sockets so that I at least have portability across the Win32/Linux divide.
This means that effectively 100% of real C software is not portable to
all machines and that "portability" is at best a goal to keep in
mind by trying to build abstraction layers, but no more.

Portability is itself an abstraction, and a rather fuzzy one at that. There
is a lot of grey in between "portable" and "non-portable".
This is taken to ridiculous heights with the polemic against C99, by
some of those same 'regulars'.

There isn't any polemic against C99. You have misunderstood. The position of
at least some of the clc regs is that C99 will be just fine - when it
arrives. But it hasn't yet arrived in sufficient volumes to make the switch
from C90 worthwhile.
They first start yelling about "Standard C", and then... they do not
mean standard C but some other obsolete standard. All that, in the name
of "portability".

Well, I'd rather conform to an obsolete standard that is supported by just
about all current C compilers, rather than to a standard that is not.
Who cares about portability if the cost is higher than "usability"
and ease of programming?

It's a trade-off, obviously. Some people will value portability more than
others. Those who don't value it very highly will wander off to newsgroups
dealing with their compiler, OS, or whatever. Those who do value it highly
tend to stick around here. Those who value it highly but who must also use
implementation-specific tricks on occasion can get the best of both worlds
- high quality platform-independent advice here, and platform-specific
advice in a platform-specific group. Sounds sensible to me.
 

Frederick Gotham

jacob navia posted:
In this group there is a bunch of people who call themselves 'regulars'
and insist on something called "portability".

Agreed.


Portability for them means the least common denominator.


"Portability" means "code which must compile successfully with every
compiler, and behave appropriately on all platforms".

Write your code so that it will compile in all old and broken
compilers, preferably in such a fashion that it can be moved with no
effort from the embedded system in the coffee machine to the 64 bit
processor in your desktop.


I'd aim for such.

Sure, you can do that. But as you know, there is no free lunch.
You pay for that "portability" by missing all the progress made
in C since 1989.


Present(verb) arguments in support of this statement -- I would like to
debate this with you.

Note that there is objectively speaking not a single useful
program in C that can be ported to all machines that run the
language.


If you're talking about GUI programs, then perhaps yes.

But the actual core algorithmic code can be kept fully portable. I've
written programs where all the core code is fully portable.

Not even the classic

int main(void) { printf("hello\n");}

Why?

For instance, if we take that program above and we want to
know if our printf did write something to stdout, we have to write


int main(void) {
int r=printf("hello\n");
if (r < 0) {
// what do we do here ???
}
}

The error code returned by printf is nowhere specified. There is no
portable way for this program to know what happened.


Nor would I care what happened. If a system can't reliably print a few
miserable characters to the screen, then I wouldn't waste electricity by
plugging it in.

Since printf returns a negative value for an i/o error OR for a
format error in the format string there is no portable way to
discriminate between those two possibilities either.


It's the programmer's responsibility to ensure that the format string isn't
corrupt. (Nonetheless, my own compiler warns of such an error.)

Obviously, network i/o, GUIs, threads, and much other stuff essential
for modern programming are completely beyond the scope of "standard C",
and any usage instantly makes your program non-portable.


Only the part of the program which deals with GUI, threads, etc.

The underlying algorithms can be (and should be where possible) fully
portable.

This means that effectively 100% of real C software is not portable to
all machines and that "portability" is at best a goal to keep in
mind by trying to build abstraction layers, but no more.


Again, you're only talking about system calls.

Many times, I have written a program in fully portable code (albeit
in C++), and then progressed to write a platform-specific interface.

The core of my code remains fully portable.

Lately, I've begun to use GUI packages which one can use to compile
programs for systems such as Windows, Linux, and Mac OS.
 

jacob navia

Richard Heathfield a écrit :
There isn't all that much progress in C. What did C99 give us?

True, C99 didn't really advance the language that much, but it has some
good points. Anyway, if we are going to stick to standard C, let's agree
that standard C is Standard C as defined by the standards committee.
Mixed declarations? Sugar.
// comments? More sugar. VLAs? More sugar.

And what do you have against sugar?
Do you drink your coffee without it?

I mean, C is just syntactic sugar for assembly language. Why
not program in assembly then?

Mixed declarations are progress in the sense that they put the
declaration nearer the usage of the variable, which makes reading
the code much easier, especially in big functions.

True, big functions are surely not a bright idea, but they happen :)

I accept that this is not a revolution, or really big progress in C,
but it is a small step, which is not bad.

VLAs are a more substantial step, since they allow you to allocate
precisely the memory the program needs without having to over-allocate
or risk under-allocating arrays.

Under C89 you have to either:
1) allocate memory with malloc, or
2) decide a maximum size and declare a local array of that size.

Neither solution is especially good. The first one implies using malloc
with all the associated hassle, and the second risks allocating not
enough memory. C99 allows you to allocate precisely what you need and
no more.
Compound literals - sure, they might come in handy one day.

They do come in handy, but again, they are not such a huge step forward.
A colossal math library?
Hardly anyone needs it, and those who do are probably using something like
Matlab anyway.

Maybe, maybe not; I have no data concerning this. In any case it
promotes portability (yes, I am not that stupid! :) since it
defines a common interface for many math functions. Besides, the
control you get over the abstracted FPU is very fine-grained.

You can portably set the rounding mode, for instance, and many other
things. In this sense the math library is quite a big step in C99.
The real progress has been in the development of third-party libraries, many
of which are at least a bit cross-platform.

That is progress too, but (I do not know why) we never discuss them
in this group.

Maybe what frustrates me about all this talk of "Stay in C89, C99
is not portable" is that it has taken me years of effort to implement
C99 (and not even all of it), and that not even in this group, where we
should promote standard C, is C99 accepted for what it is: the current
standard.

I mean, each one of us has a picture of what C "should be". But if we
are going to reach *some* kind of consensus, it must be the published
standard of the language, whether we like it or not.

For instance, the fact that main() returns zero even if the programmer
doesn't specify it: I find that an abomination. But I implemented it
because it is the standard, even if I do not like it at all.


 

Chris F.A. Johnson

In this group there is a bunch of people who call themselves 'regulars'
and insist on something called "portability".

Portability for them means the least common denominator.

Portability means the highest possible return for your effort.
Write your code so that it will compile in all old and broken
compilers, preferably in such a fashion that it can be moved with no
effort from the embedded system in the coffee machine to the 64 bit
processor in your desktop.

Sure, you can do that. But as you know, there is no free lunch.
You pay for that "portability" by missing all the progress made
in C since 1989.

There's surprisingly little that makes programming C99 better than C89.
Note that there is objectively speaking not a single useful
program in C that can be ported to all machines that run the
language.

That's strange. I have programs that I first wrote 20 years ago on
the Amiga, that I have compiled and run successfully, without any
changes, on MS-DOS, SunOS 4, FreeBSD, NetBSD, BSDi, and GNU/Linux.

I expect they would compile and execute successfully on any
standard C implementation.
 

jacob navia

Frederick Gotham a écrit :
Present(verb) arguments in support of this statement -- I would like to
debate this with you.

1) VLAs allow you to precisely allocate the memory the program needs
instead of using malloc (with all its associated problems) or having
to decide a maximum size for your local array, allocating too much
for most cases.

int fn(int n)
{
    int tab[n];
}

allocates JUST what you need AT EACH CALL.

2) The math library is improved BIG time.
2A) You can portably set the rounding mode, for instance, which you
could never do in C89 without using some compiler-specific
stuff.
2B) Many new math functions let you reduce the amount of
compiler-dependent stuff in your code.

2C) The generic math feature allows you to change the precision
used by your program easily.
3) Mixing declarations and code allows you to declare variables
near their usage, making code more readable.

These are some points. There are others.
 

jacob navia

Chris F.A. Johnson a écrit :
That's strange. I have programs that I first wrote 20 years ago on
the Amiga, that I have compiled and run successfully, without any
changes, on MS-DOS, SunOS 4, FreeBSD, NetBSD, BSDi, and GNU/Linux.

I expect they would compile and execute successfully on any
standard C implementation.

The Amiga system is not an embedded system, and it is in many ways very
similar to other command-line environments.

I am not telling you that portable programs do not exist or that
it is not worthwhile trying to attain some degree of independence
from the underlying system. I am telling you that (like everything)
portability has an associated COST!
 

John Bode

jacob said:
In this group there is a bunch of people who call themselves 'regulars'
and insist on something called "portability".

Portability for them means the least common denominator.

It primarily means that conforming compilers have been implemented on a
wide variety of hardware and OS combinations, so that conforming code
on one platform can be expected to behave the same on any other
platform.

Secondarily, it means structuring your code so that it supports
multiple platforms concurrently with minimal effort, which I've had to
do on numerous occasions (the most painful being classic MacOS,
Solaris, and Windows 3.1).

And as far as supporting the "least common denominator", it's not my
fault that Microsoft went out of its way to make it nigh impossible to
code for Windows and *anything else* by "extending" C to such a
ridiculous degree. Nor is it my fault that the bulk of the lossage was
on the Windows side.
Write your code so that it will compile in all old and broken
compilers, preferably in such a fashion that it can be moved with no
effort from the embedded system in the coffee machine to the 64 bit
processor in your desktop.

What "old and broken" compilers are you referring to, jacob?
Sure, you can do that. But as you know, there is no free lunch.
You pay for that "portability" by missing all the progress made
in C since 1989.

Really? How so?
Note that there is objectively speaking not a single useful
program in C that can be ported to all machines that run the
language.

I beg to differ; I've written them. It *is* possible to write useful,
conforming apps. Not everything needs to run through a GUI.
Not even the classic

int main(void) { printf("hello\n");}

Why?

For instance, if we take that program above and we want to
know if our printf did write something to stdout, we have to write


int main(void) {
    int r = printf("hello\n");
    if (r < 0) {
        // what do we do here ???
    }
}

The error code returned by printf is nowhere specified. There is no
portable way for this program to know what happened.

That's only sort of true; the return value is EOF if an error occurs,
otherwise the value is not EOF. So rewrite the above as

#include <stdio.h>

int main(void)
{
    int r = printf("hello\n");
    if (r == EOF)
    {
        /* handle error */
    }

    return 0;
}
Since printf returns a negative value for an i/o error OR for a
format error in the format string there is no portable way to
discriminate between those two possibilities either.

Obviously, network i/o, GUIs, threads, and much other stuff essential
for modern programming are completely beyond the scope of "standard C",
and any usage instantly makes your program non-portable.

Which is why you wrap those sections.

Abstraction is a Good Thing, anyway.
This means that effectively 100% of real C software is not portable to
all machines and that "portability" is at best a goal to keep in
mind by trying to build abstraction layers, but no more.

This is taken to ridiculous heights with the polemic against C99, by
some of those same 'regulars'.

They first start yelling about "Standard C", and then... they do not
mean standard C but some other obsolete standard. All that, in the name
of "portability".

Who cares about portability if the cost is higher than "usability"
and ease of programming?

Talk to me when you've had to support Linux, MacOS, Windows, and MPE
concurrently.
 

Keith Thompson

jacob navia said:
Frederick Gotham a écrit :

Or you pay for a few C99-specific features by losing a significant
degree of portability.

As many of us have been saying for a very long time, it's a tradeoff,
and different users will make different decisions about that tradeoff.
Pretending that it isn't, or that the choice is obvious, is
disingenuous.
Present(verb) arguments in support of this statement -- I would like
to debate this with you.

1) VLAs allow you to precisely allocate the memory the program needs
instead of using malloc (with all its associated problems) or having
to decide a maximum size for your local array, allocating too much
for most cases.

int fn(int n)
{
    int tab[n];
}

allocates JUST what you need AT EACH CALL.

Yes. One cost is that there is no mechanism for handling allocation
failures. If I use malloc(), I can check whether it returned a null
pointer, and perhaps do something to handle the error (I can at least
shut down the program cleanly). With a VLA, if the allocation fails,
I get undefined behavior. In most environments, I'd expect this to
abort the program with an error message (not giving me a chance to do
any final cleanup) -- but the C99 standard allows arbitrarily bad
behavior.
2) The math library is improved BIG time.
2A) You can portably set the rounding mode for instance, what you
could never do in C89 without using some compiler specific
stuff.
2B) Many new math functions allow you to reduce the number of
compiler dependent stuff in your code.

2C) The generic math feature allows you to change the precision
used by your program easily.

I don't do much math programming, so I don't know how useful that is.

As a practical matter, this depends on a C99-conforming runtime
library, which is often a separate issue from the conformance of the
compiler.
3) Mixing declarations and code allows you to declare variables
near the usage of it, making code more readable.

I agree that it's a nice feature, but it's easy to work around it when
it's missing. (Some people prefer not to mix declarations and
statements, thinking that keeping them separate improves program
structure; I don't necessarily agree, but I can see the point.)

Incidentally, the word "code" is ambiguous; it often refers to
everything in a C source files, not just statements. "Mixing
declarations and statements" is clearer.
 

Keith Thompson

jacob navia said:
Chris F.A. Johnson a écrit :

The Amiga system is not an embedded system, and it is in many ways very
similar to other command-line environments.

I am not telling you that portable programs do not exist or that
it is not worthwhile trying to attain some degree of independence
from the underlying system. I am telling you that (like everything)
portability has an associated COST!

And if you had bothered to mention that to begin with, we probably
wouldn't be having this argument.

There is a tradeoff. Ignoring either side of that tradeoff is
foolish.

Yes, portability has a cost, in that you can't use C99 features.

Conversely, using C99 features has a cost, in that you lose a
significant degree of portability.
 

Frederick Gotham

jacob navia posted:
1) VLAs allow you to precisely allocate the memory the program needs
instead of using malloc (with all its associated problems) or having
to decide a maximum size for your local array, allocating too much
for most cases.

int fn(int n)
{
    int tab[n];
}

allocates JUST what you need AT EACH CALL.


C99 added a few new features to C.

C++ added a boat load of new features to C.

Even C++ doesn't have VLAs, because it finds efficiency to be more
valuable.

You can do the following in C++:

unsigned const len = 5;
int array[len];

But you *can't* do the following:

unsigned len = GetValueAtCompileTime();
int array[len];

The length of an array must be a compile-time constant.

Arrays whose length is known at compile time are far more efficient to work
with. Therefore, in C++, they decreed that one should be explicit about
dynamic memory allocation:

unsigned len = GetValueAtRuntime();

int *p = new int[len];

delete[] p;

Or the C equivalent:

int *p = malloc(len * sizeof *p);
free(p);

I just don't like VLAs, and will never use them.

2) The math library is improved BIG time.
2A) You can portably set the rounding mode for instance, what you
could never do in C89 without using some compiler specific
stuff.
2B) Many new math functions allow you to reduce the number of
compiler dependent stuff in your code.
2C) The generic math feature allows you to change the precision
used by your program easily.


I haven't written maths-intensive programs, so I'm not qualified to comment
on this.

3) Mixing declarations and code allows you to declare variables
near the usage of it, making code more readable.


Yes, I like to define variables right where I need them.

NB: I wasn't arguing about the advantages of using C99 over using C89, but
rather any perceived disadvantages (efficiency wise) in writing strictly
portable code.
 

Keith Thompson

Frederick Gotham said:
jacob navia posted: [...]
Not even the classic

int main(void) { printf("hello\n");}

Why?

For instance, if we take that program above and we want to
know if our printf did write something to stdout, we have to write


int main(void) {
    int r = printf("hello\n");
    if (r < 0) {
        // what do we do here ???
    }
}

The error code returned by printf is nowhere specified. There is no
portable way for this program to know what happened.


Nor would I care what happened. If a system can't reliably print a few
miserable characters to the screen, then I wouldn't waste electricity by
plugging it in.
[...]

Several points.

What screen?

Both programs above invoke undefined behavior, because they both call
printf() with no prototype in scope. The fix is to add
"#include <stdio.h>" to the top of each.

printf() *can* fail. For example, what if the program's stdout is
redirected (in some system-specific manner) to a disk file, and the
file system has run out of space? If you check the result of printf()
for errors, there are several things you can do. You can try to print
an error message to stderr, which may succeed even if printing to
stdout has failed. Or you can just abort the program with
"exit(EXIT_FAILURE);".

Or, as most programs do, you can ignore the error and blindly continue
running. (I admit this is what I usually do myself.)
 

Ben Pfaff

Keith Thompson said:
printf() *can* fail. For example, what if the program's stdout is
redirected (in some system-specific manner) to a disk file, and the
file system has run out of space? If you check the result of printf()
for errors, there are several things you can do. You can try to print
an error message to stderr, which may succeed even if printing to
stdout has failed. Or you can just abort the program with
"exit(EXIT_FAILURE);".

One reasonable option may be to flush stdout before exiting the
program, then call ferror to check whether there was an error.
If there was, terminate the program with an error (after
attempting to report it to stderr).

Some of my programs do this, but only the ones that I care about
a lot.
 

Frederick Gotham

Keith Thompson posted:
Or, as most programs do, you can ignore the error and blindly continue
running. (I admit this is what I usually do myself.)


I do this myself in quite a few places.

For instance, if I were allocating upwards of a megabyte of memory, I would
probably take precautions:

int *const p = malloc(8388608 / CHAR_BIT
                      + !!(8388608 % CHAR_BIT));

if (!p) SOS();

But if I'm allocating less than a kilobyte... I probably won't bother most
of the time. (The system will already have ground to a halt by that stage if
it has memory allocation problems.)
 

jacob navia

John Bode a écrit :
Talk to me when you've had to support Linux, MacOS, Windows, and MPE
concurrently.

lcc-win32 has customers under Linux, Windows, and many embedded systems
without any OS (or, to be more precise, with an OS that is part
of the compiled program).
 

dcorbit

jacob said:
In this group there is a bunch of people who call themselves 'regulars'
and insist on something called "portability".

Portability for them means the least common denominator.

Portability means never having to say you're sorry. Portability means
conforming to a formally accepted written standard which defines how
things ought to behave.
Write your code so that it will compile in all old and broken
compilers, preferably in such a fashion that it can be moved with no
effort from the embedded system in the coffee machine to the 64 bit
processor in your desktop.

Do you really overlook the value of being able to do that?
Sure, you can do that. But as you know, there is no free lunch.
You pay for that "portability" by missing all the progress made
in C since 1989.

Portable can be portable to C89, to C99 or to an implementation
document. The degree of portability achieved will depend upon how well
accepted and debugged the standard in use was. If I write to the C99
standard, I am writing portable code. It is portable to C99.
Note that there is objectively speaking not a single useful
program in C that can be ported to all machines that run the
language.

How about a useful program in C that runs on 100 million machines of
various architectures and is maintained over a 20 year period. Does
that sound useful to you?
Not even the classic

int main(void) { printf("hello\n");}

Why?

Other than the lack of a prototype for printf(), I don't see anything
terribly wrong with it. The big problem with failure of printf() is
that we are at a bit of a loss as to how to report the problem, n'est-ce
pas?
For instance, if we take that program above and we want to
know if our printf did write something to stdout, we have to write


int main(void) {
    int r = printf("hello\n");
    if (r < 0) {
        // what do we do here ???
    }
}

The error code returned by printf is nowhere specified. There is no
portable way for this program to know what happened.

You know that the printf() failed. You may not know why and perror()
may or may not be useful. How would you go about repairing this
defect?
Since printf returns a negative value for an i/o error OR for a
format error in the format string there is no portable way to
discriminate between those two possibilities either.

For a format error, it is possible to know because you can check your
format.
Obviously, network i/o, GUIs, threads, and much other stuff essential
for modern programming are completely beyond the scope of "standard C",
and any usage instantly makes your program non-portable.

It makes the part of the program that uses network I/O, a GUI, threads
or other essential tasks non-portable. Here, we will generally resort
to another standard. We can use TCP/IP for network programming. We
can use POSIX threads for threading. For the GUI, we may have to use a
cross-platform toolkit like wxWidgets or an operating-system-specific API
like the Windows API. In each of these cases we are still doing
standards based computing, but we are not using ANSI/ISO standards for
the parts not covered by the more fundamental level.
This means that effectively 100% of real C software is not portable to
all machines and that "portability" is at best a goal to keep in
mind by trying to build abstraction layers, but no more.

Did you know that 99.997% of all statistics are made up?
I work on a software system with hundreds of thousands of lines of
code.
It runs on Solaris, AIX, Windows, Linux, MVS, OpenVMS (and many others)
against dozens of database systems. Do you imagine that such a thing
would be remotely feasible without paying detailed attention to
standards?
This is taken to ridiculous heights with the polemic against C99, by
some of those same 'regulars'.

I agree that C99 is a favorite whipping boy for no reason that I can
glean. There are not a lot of adopters, but many of the features are
very desirable. VLAs (in particular) are worth their weight in gold.
They first start yelling about "Standard C", and then... they do not
mean standard C but some other obsolete standard. All that, in the name
of "portability".

C99 is standard C. Other standards must be prefaced by the name of the
standard (IMO-YMMV). That's because C99 legally replaces the previous
C standard.
Who cares about portability if the cost is higher than "usability"

Nobody does.
and ease of programming?

Adhering to standards makes programming much, much easier. I
programmed in C before there was any formal standard approved. It was
really awful, and every implementation was so different that you had to
completely rewrite things constantly to go from one compiler vendor to
the next, even on the same physical architecture.

I don't understand why anyone would complain against standards.
Programming is practically impossible without them.
 
J

jacob navia

Keith Thompson a écrit :
Or you pay for a few C99-specific features by losing a significant
degree of portability.

Well, but this group is about STANDARD C, or not?

If we do not agree about what standard C is, we can use the standard
itself to settle the question.

But if we do not even agree on what standard C is, there can't be
any kind of consensus in this group, you see?

All the talk about "Standard C", then, is just hollow words!!!
 

Ben Pfaff

Frederick Gotham said:
For instance, if I were allocating upwards of a megabyte of memory, I would
probably take precautions:

int *const p = malloc(8388608 / CHAR_BIT
                      + !!(8388608 % CHAR_BIT));

if (!p) SOS();

But if I'm allocating less than a kilobyte... I probably won't bother most
of the time. (The system will already have ground to a halt by that stage if
it has memory allocation problems.)

Why not use a wrapper function that will always do the right
thing?
 

jacob navia

(e-mail address removed) a écrit :
C99 is standard C. Other standards must be prefaced by the name of the
standard (IMO-YMMV). That's because C99 legally replaces the previous
C standard.

This is the most important thing I wanted from my previous message:

that we establish a consensus here about what standard C means.

And it can't mean anything other than the *current* C standard.

I have been working for years on a C99 implementation, and I wanted
us to establish that at least in this group, which is supposed to be
centered around standard C, C99 *is* the standard, even if we do
not like this or that feature.
 

dcorbit

[snip]
Arrays whose length is known at compile time are far more efficient to work
with.

I doubt this statement.

On stack based machines, it's nothing more than a subtraction. Whether
the value is passed in or known at compile time makes no difference.

[snip]
 
