sizeof

Keith Thompson

lovecreatesbeauty said:
Richard said:
There is simply no way that the compiler can know at compile time how many
elements the user will ask for at run time.

Is the following inconsistent between the standard and implementations?

#include <stdio.h>

int main(void)
{
    int len = 10;
    char a[len];

    switch (sizeof a)
    {
    case 10:
        printf("%s", "10 elements");
        break;
    default:
        printf("<unknown>");
        break;
    }

    fflush(stdout);

    return 0;
}

$ gcc test.c
$ ./a.out
10 elements$
$

lovecreatesbeauty

/*quoting begins*/
6.6 Constant expressions
2 A constant expression can be evaluated during translation rather than
runtime, <snip>

6.8.4.2 The switch statement
3 The expression of each case label shall be an integer constant
expression <snip>
/*quoting ends*/

No, there's no inconsistency. The only thing that 6.8.4.2 applies to
in the program is the literal 10 in "case 10:"; that's clearly an
integer constant expression. The expression in parentheses following
the keyword "switch" doesn't need to be constant (the switch statement
wouldn't be very useful if it did).

switch(rand()) {
...
}
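
For what it's worth, a minimal compilable sketch of that distinction; the rand()-based
switch and the commented-out non-constant label are just illustrations, not anything
from the original program:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    srand((unsigned)time(NULL));

    int r = rand() % 3;       /* run-time value: perfectly fine as the controlling expression */

    switch (r)
    {
    case 0:                   /* integer constant expression, as 6.8.4.2p3 requires */
        puts("zero");
        break;
    case 1:
        puts("one");
        break;
    default:
        puts("something else");
        break;
    }

#if 0
    int n = 1;
    switch (r)
    {
    case n:                   /* constraint violation: n is not a constant expression */
        break;
    }
#endif

    return 0;
}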
 
Barry Schwarz


In the code fragment
int x[5];
int y;
y = sizeof x;
The right hand side is a compile-time constant that the compiler could
evaluate at compile time and replace with 20 (on a system with a
four-byte int).

In the C99 fragment
int y;
scanf("%d", &y);
int x[y];
y = sizeof x;
The right hand side cannot be evaluated until after the user enters a
value for y.
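
Putting both fragments into one complete C99 program (the printout is my own
addition) makes the difference observable: the first sizeof can be folded to a
constant, the second has to be computed after scanf returns.

#include <stdio.h>

int main(void)
{
    int x5[5];
    printf("sizeof x5 = %zu\n", sizeof x5);   /* constant; may be folded at compile time */

    int y;
    if (scanf("%d", &y) != 1 || y <= 0)
        return 1;

    int x[y];                                 /* C99 VLA: length known only at run time */
    printf("sizeof x  = %zu (%d elements of %zu bytes each)\n",
           sizeof x, y, sizeof x[0]);

    return 0;
}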


 
Richard Heathfield

lovecreatesbeauty said:
The C language has no templates, namespaces, ::, or function overloading.
You can't even find the words "template", "namespace" or "overload" in the
standard document.

Dear Granny

To begin, you'll need to find an egg. A large chicken's egg would be good,
or any other egg of about the same size. You should be able to buy a box of
half a dozen eggs from that supermarket you've been using for the last
fifty years or so. Just ask the assistant.

You need to work out which is the smaller end, and which the larger end. The
easiest way to do this is to place the egg carefully on a table-top, and
spin it. The smaller end, being further from the centre, will move round a
lot quicker. Mark the smaller end with a little pencil mark. That's the end
you'll be sucking from.

Now you'll need to make two holes in the egg. The first hole is at the
sucking end, and that's for you to suck through. The second hole is at the
big end, and that's to let air flow into the shell, replacing everything
you suck (otherwise, even if you were able to suck anything out, which is
unlikely, the shell would then be crushed by air pressure).

Use a sewing needle to make a tiny hole in each end of the egg. Then enlarge
the hole at the sucking end until it's about a quarter of an inch wide. The
hole at the other end need only be about a twelfth of an inch wide. Make
sure the holes are at the very ends of the egg, and be careful not to break
the egg while you're doing this.

Lift the egg to your lips, making sure you have the proper end (the small
end) facing you. Make sure your lips are preventing the egg's shell from
going into your mouth. Tip the egg slightly upwards at the far end, so that
gravity pulls down the eggy stuff to your end.

Pretend the egg is a straw. Suck through it. Swallow the eggy stuff as it
pours into your mouth, trying hard not to worry about what it tastes like.
Keep sucking it until there's no eggy stuff left in the shell.

If you do this right, you'll get an empty, but intact, egg-shell. As a
reward, you may have a glass of water to help take away the foul taste.

Your loving grandson
 
Skarmander

Xicheng said:
Sorry, I should have said that the operand of sizeof() is interpolated
at compile-time.

"You keep using that word. I do not think it means what you think it means."

You're probably looking for "evaluated".

S.
 
Malcolm

Richard Heathfield said:

Here's one reason:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    puts("How many array elements would you like?");
    char buf[32];
    if(fgets(buf, sizeof buf, stdin) != NULL)
    {
        unsigned long n = strtoul(buf, NULL, 10);
        if(n > 0)
        {
            int arr[n];
            /* ... */

There is simply no way that the compiler can know at compile time how many
elements the user will ask for at run time.
You mean to say that your compiler won't figure out that
9999999999999999999999999999999 is the maximum the user can enter and
allocate an array accordingly?
 
Andrew Poelstra

Richard Heathfield said:
Not only that, but with C99
VLAs, sizeof /cannot/ be computed at compile time in some cases.
Why?

Here's one reason:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    puts("How many array elements would you like?");
    char buf[32];
    if(fgets(buf, sizeof buf, stdin) != NULL)
    {
        unsigned long n = strtoul(buf, NULL, 10);
        if(n > 0)
        {
            int arr[n];
            /* ... */

There is simply no way that the compiler can know at compile time how many
elements the user will ask for at run time.
You mean to say that your compiler won't figure out that
9,999,999,999,999,999,999,999,999,999,999 is the maximum the user can enter and
allocate an array accordingly?

I've never seen a system with that much memory. Even if it was allocating
an array of char, that's still almost 10 million million million terabytes.

I'll have to assume that you were joking. ;-)
 
Joe Wright

lovecreatesbeauty said:
Mr. Heathfield, is it relevant?

lovecreatesbeauty
So much humour and irony seems to slip by us.

One of the older and time honored bits of advice to youngsters is "Don't
bother telling Granny how to suck eggs." She knows how. She's been doing
it since before she gave birth to your mother.

That Mr. Heathfield knows how (he does!) is interesting. :=)
 
Richard Heathfield

Malcolm said:
You mean to say that your compiler won't figure out that
9999999999999999999999999999999 is the maximum the user can enter and
allocate an array accordingly?

If you could store the value of a single bit on a hydrogen atom, and if you
managed to jam the atoms so close together that there were no inter-atomic
gaps (and somehow doing this without impairing your ability to read and
write the bits), 1e32 bytes of storage would require a memory unit 700
metres long, 700 metres wide, and 700 metres tall. For comparison, the
world's tallest building, the Taipei 101 Tower, is a mere 509 metres high.

The volume of this memory unit would be over 340 million cubic metres. For
comparison, the world's largest building (by usable volume), Boeing's
Everett Plant, near Seattle, is a mere 5.64 million cubic metres.

The memory unit would have a mass of 1.33 million tonnes, which is
considerably in excess of the mass of the Palace of the Parliament in
Romania, the world's most massive building.

That's what I call Big Iron.
 
Ian Collins

Richard said:
Malcolm said:



If you could store the value of a single bit on a hydrogen atom, and if you
managed to jam the atoms so close together that there were no inter-atomic
gaps (and somehow doing this without impairing your ability to read and
write the bits), 1e32 bytes of storage would require a memory unit 700
metres long, 700 metres wide, and 700 metres tall. For comparison, the
world's tallest building, the Taipei 101 Tower, is a mere 509 metres high.

The volume of this memory unit would be over 340 million cubic metres. For
comparison, the world's largest building (by usable volume), Boeing's
Everett Plant, near Seattle, is a mere 5.64 million cubic metres.

The memory unit would have a mass of 1.33 million tonnes, which is
considerably in excess of the mass of the Palace of the Parliament in
Romania, the world's most massive building.

That's what I call Big Iron.
The fun part would be putting a (very long) match to it.....
 
Keith Thompson

Richard Heathfield said:
Malcolm said:

If you could store the value of a single bit on a hydrogen atom, and if you
managed to jam the atoms so close together that there were no inter-atomic
gaps (and somehow doing this without impairing your ability to read and
write the bits), 1e32 bytes of storage would require a memory unit 700
metres long, 700 metres wide, and 700 metres tall. For comparison, the
world's tallest building, the Taipei 101 Tower, is a mere 509 metres high.

The volume of this memory unit would be over 340 million cubic metres. For
comparison, the world's largest building (by usable volume), Boeing's
Everett Plant, near Seattle, is a mere 5.64 million cubic metres.

The memory unit would have a mass of 1.33 million tonnes, which is
considerably in excess of the mass of the Palace of the Parliament in
Romania, the world's most massive building.

That's what I call Big Iron.

And it still wouldn't help, since in the program (lost in the depths
of followup snippage), the sizeof operator when applied to the VLA
must yield the actual number of bytes requested by the user at run
time.
 
Chris Dollin

Richard said:
If you could store the value of a single bit on a hydrogen atom, and if you
managed to jam the atoms so close together that there were no inter-atomic
gaps (and somehow doing this without impairing your ability to read and
write the bits), 1e32 bytes of storage would require a memory unit 700
metres long, 700 metres wide, and 700 metres tall. For comparison, the
world's tallest building, the Taipei 101 Tower, is a mere 509 metres high.

The volume of this memory unit would be over 340 million cubic metres. For
comparison, the world's largest building (by usable volume), Boeing's
Everett Plant, near Seattle, is a mere 5.64 million cubic metres.

The memory unit would have a mass of 1.33 million tonnes, which is
considerably in excess of the mass of the Palace of the Parliament in
Romania, the world's most massive building.

That's what I call Big Iron.

Your chemistry isn't so good, then ...

Recycling used memory might start a flame war.
 
Kenneth Brody

Andrew said:
    puts("How many array elements would you like?");
    char buf[32];
    if(fgets(buf, sizeof buf, stdin) != NULL)
    {
        unsigned long n = strtoul(buf, NULL, 10);
        if(n > 0)
        {
            int arr[n];
            /* ... */

There is simply no way that the compiler can know at compile time
how many elements the user will ask for at run time.
You mean to say that your compiler won't figure out that
9,999,999,999,999,999,999,999,999,999,999 is the maximum the user can
enter and allocate an array accordingly?

I've never seen a system with that much memory. Even if it was allocating
an array of char, that's still almost 10 million million million terabytes.

(BTW, does the Standard say where variable-length arrays are stored?
Can the compiler do the equivalent of malloc/free to handle it?)

I've seen systems which can "allocate" memory without actually using
any memory until something is written to it. As long as you have at
least 103 bits of address space, one could "allocate" that much memory
on such a system, as long as you didn't try to write to all of it.
(Just as I was able to create a billion-byte file on a 1MB floppy,
many years ago.)

 
Keith Thompson

Kenneth Brody said:
(BTW, does the Standard say where variable-length arrays are stored?

No more than it says where anything is stored. VLAs are like any
other auto objects; they're created at the point of declaration, and
cease to exist at the end of their scope.
Can the compiler do the equivalent of malloc/free to handle it?)
Sure.
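
A purely illustrative sketch of that "equivalent of malloc/free" idea, i.e. how a
compiler *could* lower a block containing "int arr[n];" if it chose not to use the
stack. This is my own invention, not a description of any real compiler:

#include <stdlib.h>

/* Source being lowered:
       void f(size_t n) { int arr[n];  ...  }
   One hypothetical lowering: take the storage from the heap, remember the
   length for sizeof, and release the storage when the block is left. */
void f(size_t n)
{
    size_t arr_len = n;                       /* size expression evaluated once, here */
    int *arr = malloc(arr_len * sizeof *arr);
    if (arr == NULL)
        return;                               /* the standard doesn't say what happens
                                                 if a VLA can't be allocated */

    /* ... uses of arr[i] go here; a source-level "sizeof arr" would be
       translated to arr_len * sizeof(int), computed at run time ... */

    free(arr);                                /* arr's lifetime ends with the block */
}

int main(void)
{
    f(10);
    return 0;
}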

I've seen systems which can "allocate" memory without actually using
any memory until something is written to it. As long as you have at
least 103 bits of address space, one could "allocate" that much memory
on such a system, as long as you didn't try to write to all of it.

There's some debate about whether such systems are conforming.
(Just as I was able to create a billion-byte file on a 1MB floppy,
many years ago.)

<OT>Some filesystems optimize long sequences of 0 bytes (which can
have surprising results when you copy a file using a mechanism that
doesn't recognize the optimization).</OT>
 
Kenneth Brody

Keith said:
There's some debate about whether such systems are conforming.

What would be non-conforming about it? (My guess is that, given
this scenario, it's possible that malloc succeeds [ie: returns a
non-NULL value], but the program fails later when it accesses this
memory and the O/S discovers "oops, there's really not enough
virtual memory available to allocate".)
<OT>Some filesystems optimize long sequences of 0 bytes (which can
have surprising results when you copy a file using a mechanism that
doesn't recognize the optimization).

In this case, it's "sparse allocation". That is, it allocates space
on the disk for the sector containing the billionth byte of the file
(along with necessary overhead), without allocating physical disk
space for the intervening sectors. Writing intervening bytes, even
if they're all zeros, would have allocated physical space, AFAIK.

I (briefly) considered this for a method of copy protection, by
writing critical data throughout this "billion byte file", which
you couldn't copy via standard copy methods.
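
The billion-byte-file trick can be reproduced with nothing more than fseek; whether
the result occupies one sector or a billion bytes is entirely up to the filesystem,
and the file name here is of course just an example:

#include <stdio.h>

/* Seek far past the end of a new file and write a single byte.  On a
   filesystem that supports sparse files, the intervening bytes are never
   given physical disk space; elsewhere they may be written out as zeros. */
int main(void)
{
    FILE *fp = fopen("sparse.dat", "wb");
    if (fp == NULL)
        return 1;

    if (fseek(fp, 999999999L, SEEK_SET) != 0)  /* byte 1,000,000,000 will be the last */
    {
        fclose(fp);
        return 1;
    }
    fputc('\0', fp);

    fclose(fp);
    return 0;
}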


 
Keith Thompson

Kenneth Brody said:
Keith said:
There's some debate about whether such systems are conforming.

What would be non-conforming about it? (My guess is that, given
this scenario, it's possible that malloc succeeds [ie: returns a
non-NULL value], but the program fails later when it accesses this
memory and the O/S discovers "oops, there's really not enough
virtual memory available to allocate".)

malloc() is supposed to "allocate space for an object" or report (by
returning a null pointer) that it's unable to do so. If it returns a
non-null pointer to memory that the program won't be able to access,
it arguably didn't really allocate it.

I'll let someone else make the argument that this behavior is
conforming.

And as a practical matter, if malloc() had visibly failed, the program
could have dealt with the error; if malloc() seems to succeed, but
some later attempt to access memory blows up, there's no way to
recover.

<OT>
Note that the program that did the allocation isn't necessarily the
one that's going to be killed. If the system decides that it's
running short on resources, it might kill processes, perhaps at
random, until the remaining processes have enough resources.
</OT>

There's something to be said for allowing this kind of
over-allocation. It can improve system efficiency if programs
commonly allocate more memory than they actually use (similar to
over-booking of airline seats). I wouldn't mind having two different
allocation functions, one that says "I want this much memory; either
guarantee that I can access it, or fail", and another that says, "I'd
like this much memory; tell me you've allocated it, but it's ok to
kill me if I try to use it". I suggest calling the former function
"malloc". You can call the latter function anything you like; I won't
be using it.
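
Just to illustrate what that first function might look like when faked up in user
code on an overcommitting system; the page size and the touch-every-page loop are
assumptions made for the sake of the sketch, not anything the standard promises:

#include <stdio.h>
#include <stdlib.h>

#define ASSUMED_PAGE_SIZE 4096   /* illustration only; real code would ask the system */

/* A sketch of the "guarantee that I can access it" flavour of allocation:
   ask for the memory, then touch every page so that an overcommitting
   system has to find real backing store now rather than later.  On some
   systems failure would still arrive as a signal (or the OOM killer)
   rather than as a null return, so this is not a complete solution. */
void *malloc_strict(size_t size)
{
    unsigned char *p = malloc(size);
    if (p == NULL)
        return NULL;

    for (size_t i = 0; i < size; i += ASSUMED_PAGE_SIZE)
        p[i] = 0;                 /* force each page to be committed */
    if (size > 0)
        p[size - 1] = 0;          /* and the last byte, in case size isn't a page multiple */

    return p;
}

int main(void)
{
    void *p = malloc_strict(1024 * 1024);
    printf(p ? "got 1 MB\n" : "allocation failed\n");
    free(p);
    return 0;
}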
 
Keith Thompson

tedu said:
this would make it impossible to use such constructs in a signal
handler (or other reentrant context).

Not if the implementation gets it right.

In a typical stack-based implementation, automatic objects are
allocated by advancing the stack pointer, and deallocated by setting
it back. The implementation has to do this correctly in the presence
of signal handlers.

Likewise, if VLAs are allocated by malloc, the implementation has to
free them at the right time. This might be more difficult, but I know
of no reason why it wouldn't be possible.
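
A small sketch of that point, with ordinary recursion standing in for a signal
handler purely for illustration: whether the storage comes from the stack or from
malloc, each activation has to get its own array and release it on exit.

#include <stdio.h>

/* Each activation gets its own VLA; a re-entrant call (or a signal handler
   interrupting this function) sees a completely separate object. */
void work(int n)
{
    int scratch[n];                  /* one VLA per activation */

    for (int i = 0; i < n; i++)
        scratch[i] = i;

    if (n > 1)
        work(n - 1);                 /* re-entry: a new, independent scratch[] */

    printf("n = %d, sizeof scratch = %zu\n", n, sizeof scratch);
}

int main(void)
{
    work(3);
    return 0;
}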
 
Richard Bos

Kenneth Brody said:
Keith said:
There's some debate about whether such systems are conforming.

What would be non-conforming about it? (My guess is that, given
this scenario, it's possible that malloc succeeds [ie: returns a
non-NULL value], but the program fails later when it accesses this
memory and the O/S discovers "oops, there's really not enough
virtual memory available to allocate".)

Yes. And this is anathema to solid programming and safe computer
systems.

Richard
 
Ben C

Kenneth Brody said:
Keith said:
I've seen systems which can "allocate" memory without actually using
any memory until something is written to it. As long as you have at
least 103 bits of address space, one could "allocate" that much memory
on such a system, as long as you didn't try to write to all of it.

There's some debate about whether such systems are conforming.

What would be non-conforming about it? (My guess is that, given
this scenario, it's possible that malloc succeeds [ie: returns a
non-NULL value], but the program fails later when it accesses this
memory and the O/S discovers "oops, there's really not enough
virtual memory available to allocate".)

Yes. And this is anathema to solid programming and safe computer
systems.

Nevertheless I believe this is what everyday GNU/Linux actually does--
apparently gives you the block then hits you with a SIGSEGV later when
you try to use it.

A bit like when you go to the bank to withdraw your money and they say,
sorry you can't have it because we've already lent it to other people,
four times over (real banks do not use the "Banker's Algorithm").
 
