array size (?)

M

mike

Hi,

I am running a very big code on my PC using Red Hat Linux.
When I try to increase my array size, compile, and run, I get a
segmentation fault. I go into the debugger, run it, and it crashes
right away. I can't even trace where it happens.
I have no idea what the problem is. Any ideas?
 
B

Ben Pfaff

mike said:
I am running a very big code on my PC using Red Hat Linux.
When I try to increase my array size, compile, and run, I get a
segmentation fault. I go into the debugger, run it, and it crashes
right away. I can't even trace where it happens.
I have no idea what the problem is. Any ideas?

You probably screwed up memory allocation or access. Try a
memory debugger like valgrind.
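
As an illustration (a made-up fragment, not mike's code), this is the sort of out-of-bounds write valgrind flags as an invalid write:

#include <stdlib.h>

int main(void) {
    int *p = malloc(10 * sizeof *p);  /* room for 10 ints */
    if (p == NULL)
        return EXIT_FAILURE;
    for (int i = 0; i <= 10; ++i)     /* off-by-one: i == 10 writes past the block */
        p[i] = i;
    free(p);
    return 0;
}

Compile with -g and run the program under valgrind to get a report that points at the offending line; a plain run may or may not crash.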
 
J

j

mike said:
Hi,

I am running a very big code on my PC using Red Hat Linux.
When I try to increase my array size, compile, and run, I get a
segmentation fault. I go into the debugger, run it, and it crashes
right away. I can't even trace where it happens.
I have no idea what the problem is. Any ideas?

If you are unable to trace, you have probably corrupted your stack
by using an array with automatic storage duration that is too large.

What is the size of your array? Under c89 the size of the largest object
that can be created is 32767 bytes and 65535 bytes under c99.
 
E

Emmanuel Delahaye

mike said:
Hi,

I am running a very big code on my PC using Red Hat Linux.
When I try to increase my array size, compile, and run, I get a
segmentation fault. I go into the debugger, run it, and it crashes
right away. I can't even trace where it happens.
I have no idea what the problem is. Any ideas?

I guess it's a local array. This happens. The C language isn't clear about
limits on the size of local variables; it sounds like an implementation issue.

The naive way: add 'static' before the definition of the array.
The better way: use dynamic allocation (malloc() / free()).
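
A minimal sketch of both suggestions (the 16 MB size here is just an example, not the original poster's array):

#include <stdio.h>
#include <stdlib.h>

#define N (16*1024*1024)

int main(void) {
    /* Naive way: 'static' gives the array static storage duration,
       so it is not carved out of the (limited) stack. */
    static char a[N];

    /* Better way: dynamic allocation, whose failure can be detected. */
    char *b = malloc(N);
    if (b == NULL) {
        fprintf(stderr, "out of memory\n");
        return EXIT_FAILURE;
    }

    a[0] = '\0';
    b[0] = '\0';

    free(b);
    return 0;
}

The static object exists for the whole run of the program; the malloc()'d block can be sized at run time and released with free() when it is no longer needed.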
 
K

Keith Thompson

j said:
If you are unable to trace, you have probably corrupted your stack
by using an array with automatic storage duration that is too large.

What is the size of your array? Under c89 the size of the largest object
that can be created is 32767 bytes and 65535 bytes under c99.

Those are minimum maxima. A C89 implementation is allowed to limit
object sizes to 32767 bytes, but it can support larger objects (and
most implementations do); likewise for C99 and 65535.

Some operating systems may provide ways to adjust memory allocation
limits. There may also be different limits for declared objects
vs. heap-allocated objects.

(Since this is comp.lang.c, not comp.unix.programmer, it would be
inappropriate to mention the "limit" or "ulimit" command, so I won't.)
 
E

E. Robert Tisdale

mike said:
I am running a very big code on my PC using Red Hat Linux.
When I try to increase my array size, compile, and run, I get a
segmentation fault. I go into the debugger, run it, and it crashes
right away. I can't even trace where it happens.
I have no idea what the problem is. Any ideas?
> cat main.c
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char* argv[]) {
    char a[16*1024*1024];
    const size_t n = 16*1024*1024;
    for (size_t j = 0; j < n; ++j)
        a[j] = '\0';
    fprintf(stdout, "Hello world!\n");
    return 0;
}
> gcc -Wall -std=c99 -pedantic -o main main.c
> ./main
Segmentation fault (core dumped)
> rm core.30147
> limit stacksize
stacksize 10240 kbytes
> limit stacksize 20240
> limit stacksize
stacksize 20240 kbytes
Hello world!
> gcc --version
gcc (GCC) 3.3.3 20040412 (Red Hat Linux 3.3.3-7)
 
E

E. Robert Tisdale

E. Robert Tisdale said:
mike said:
I am running a very big code on my PC using Red Hat Linux.
When I try to increase my array size, compile, and run, I get a
segmentation fault. I go into the debugger, run it, and it crashes
right away. I can't even trace where it happens.
I have no idea what the problem is. Any ideas?

> cat main.c
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char* argv[]) {
    char a[16*1024*1024];
    const size_t n = 16*1024*1024;
    for (size_t j = 0; j < n; ++j)
        a[j] = '\0';
    fprintf(stdout, "Hello world!\n");
    return 0;
}
> gcc -Wall -std=c99 -pedantic -o main main.c
> ./main
Segmentation fault (core dumped)
> rm core.30147
> limit stacksize
stacksize 10240 kbytes
> limit stacksize 20240
> limit stacksize
stacksize 20240 kbytes
> ./main
Hello world!
> gcc --version
gcc (GCC) 3.3.3 20040412 (Red Hat Linux 3.3.3-7)
 
J

j

Keith Thompson said:
Those are minimum maxima. A C89 implementation is allowed to limit
object sizes to 32767 bytes, but it can support larger objects (and
most implementations do); likewise for C99 and 65535.

Sorry. I meant to include the word ``portably'' after ``created''.
 
M

mike

j said:
If you are unable to trace, you have probably corrupted your stack
by using an array with automatic storage duration that is too large.

What is the size of your array? Under c89 the size of the largest object
that can be created is 32767 bytes and 65535 bytes under c99.

I use C to do simulations (solving equations), so my knowledge of
the language is limited to what my needs are. I needed to mention this so
that you guys don't talk past me.

OK... I declare all my variables with static array sizes. (A while back
I did try dynamic allocation as Emmanuel suggested, but since there
is a huge number of variables with O(1e5-1e6) degrees of freedom and
complicated relationships, I decided to abandon this.) One of my variables:

double q[2650000][10][5] -> this is fine, but when I try:
double q[4530000][10][5] -> here is the error I got from the ddd debugger:

Program received signal SIGSEGV, Segmentation fault.
0x40000be0 in q ()

You guys are right, it looks like the memory is messed up.
So there is a limit on the array size?
 
K

Keith Thompson

mike said:
OK... I declare all my variables with static array sizes. (A while back
I did try dynamic allocation as Emmanuel suggested, but since there
is a huge number of variables with O(1e5-1e6) degrees of freedom and
complicated relationships, I decided to abandon this.)
One of my variables:

double q[2650000][10][5] -> this is fine, but when I try:
double q[4530000][10][5] -> here is the error I got from the ddd debugger:

Program received signal SIGSEGV, Segmentation fault.
0x40000be0 in q ()

You guys are right, it looks like the memory is messed up.
So there is a limit on the array size?

Assuming sizeof(double) is 8, the first declaration gives you an array
whose size is nearly one gigabyte (2650000 * 10 * 5 * 8 = 1,060,000,000
bytes); the second is nearly 1.7 gigabytes (4530000 * 10 * 5 * 8 =
1,812,000,000 bytes).

If that's just *one* of your variables, you're dealing with a whole
lot of data, and I'm not surprised that your OS doesn't let you
allocate it all at once.

You may be able to increase the limit using the "limit" or "ulimit"
command (please don't ask for further details here; if you can't find
documentation, try a newsgroup appropriate to your operating system).
But you should consider whether it's all going to fit into physical
memory anyway. You might have better luck using files.
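
If the data really must all be in memory at once, one sketch (using the dimensions quoted above) is to allocate the big array on the heap through a pointer to its row type, so that failure is reported instead of crashing:

#include <stdio.h>
#include <stdlib.h>

#define NROWS 4530000

int main(void) {
    /* q is indexed like double q[NROWS][10][5], but the storage comes
       from malloc(): NROWS * 10 * 5 * sizeof(double) bytes, about
       1.7 GB when double is 8 bytes. */
    double (*q)[10][5] = malloc(NROWS * sizeof *q);
    if (q == NULL) {
        fprintf(stderr, "could not allocate q\n");
        return EXIT_FAILURE;
    }

    q[0][0][0] = 1.0;
    q[NROWS - 1][9][4] = 2.0;
    printf("%f %f\n", q[0][0][0], q[NROWS - 1][9][4]);

    free(q);
    return 0;
}

A failed allocation shows up as a NULL return that the program can handle, rather than a segmentation fault at the first access.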
 
I

Igmar Palsenberg

mike said:
I am running a very big code on my PC using Red Hat Linux.
When I try to increase my array size, compile, and run, I get a
segmentation fault. I go into the debugger, run it, and it crashes
right away. I can't even trace where it happens.
I have no idea what the problem is. Any ideas?

The large stack-based allocation screws up your stack. Shoving large
arrays on the stack is a bad idea if you ask me. Use malloc() instead of a
statically allocated array.



Igmar
 
