glennklockwood
I'm in the process of porting some FORTRAN code to C and am running
into a problem where the program crashes if I make the arrays too
large. In the code below, my program runs fine if I set MAXN to
50000 or less, but at 60000 or more the program crashes immediately,
and GDB indicates the error is at the printf line.
What's strange is that I've tried splitting my two-dimensional
arrays into a bunch of one-dimensional arrays (e.g., xc[60000][6] into
xc0[60000], xc1[60000], xc2[60000], etc.), and the same sort of crash
went away once I commented out a few of those one-dimensional
declarations. Because of this, I know the problem isn't any one
giant array but the total size of all the arrays I'm declaring. I
assume this has to do with some sort of memory limit; a lot of spotty
advice posted around the internet for similar cases has suggested it
has something to do with a 64K stack size, but that doesn't make much
sense to me.
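For what it's worth, I gather something like the following ought to
report what the stack limit actually is (assuming
getrlimit(RLIMIT_STACK, ...) is the right thing to query):

#include <stdio.h>
#include <sys/resource.h>

int main(void)
{
    struct rlimit rl;

    /* Query the soft limit on the process stack size. */
    if (getrlimit(RLIMIT_STACK, &rl) == 0) {
        if (rl.rlim_cur == RLIM_INFINITY)
            printf("stack soft limit: unlimited\n");
        else
            printf("stack soft limit: %lu bytes\n",
                   (unsigned long) rl.rlim_cur);
    }
    return 0;
}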
Can anyone shed some light on what exactly is going wrong and how I
can avoid it? By my calculation the size of these declared variables
is a little over 9 MB, which isn't a huge amount, and the FORTRAN
version of this code works for MAXN in excess of 100000.
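(To show my arithmetic, assuming 8-byte doubles and 4-byte ints on
amd64:

    xc, yc, zc:            3 * 60000 * 6 * 8 = 8,640,000 bytes
    ltype, ibx, iby, ibz:  4 * 60000 * 4     =   960,000 bytes
    ener, ihite:           2 * 100 * 10 * 8  =    16,000 bytes
    everything else:       well under 10,000 bytes

which comes to roughly 9.6 MB.)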
If it matters, I am compiling on linux amd64, and this problem has
happened with both GCC 4.1 and Sun's C compiler (SS12).
Thanks much.
glenn
-- Begin code --
#include <stdio.h>

#define MAXN  60000
#define MAXLT 10

int main()
{
    double ener[100][MAXLT], ihite[100][MAXLT];
    double xc[MAXN][6], yc[MAXN][6], zc[MAXN][6];
    double aij[MAXLT][MAXLT], bij[MAXLT][MAXLT], amass[MAXLT],
           twopi[MAXLT][MAXLT], eta[MAXLT][MAXLT], ze[MAXLT][MAXLT],
           rhop[MAXLT][MAXLT], za[MAXLT];
    int ltype[MAXN], ibx[MAXN], iby[MAXN], ibz[MAXN];
    char k9[16];
    double etot, xl, yl, zl, zreft, deltim, temp;
    int nmol, istill, nsur, jread, icalc, nbin;
    int i, j;
    int dblesize, intsize;
    FORTRAN_OFFSET_TYPE offset;    /* defined in a header not shown here */
    FORTRAN_OFFSET_TYPE recsize;
    FILE *fp;
    int inum[MAXLT], iff;

    recsize = 0;
    dblesize = sizeof(xl);
    intsize = sizeof(i);
    printf( " What file?\n" );
    .... (it crashes before the printf)
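If the answer turns out to be "don't put arrays this big on the
stack," I assume the fix would look something like the sketch below,
with the largest arrays malloc'd instead of declared as automatics,
but I'd like to understand what is actually going on before
restructuring the code that way.

#include <stdio.h>
#include <stdlib.h>

#define MAXN 60000

int main(void)
{
    /* Heap-allocated replacements for the biggest automatic arrays;
       indexing stays the same, e.g. xc[i][j]. */
    double (*xc)[6] = malloc(MAXN * sizeof *xc);
    double (*yc)[6] = malloc(MAXN * sizeof *yc);
    double (*zc)[6] = malloc(MAXN * sizeof *zc);
    int *ltype = malloc(MAXN * sizeof *ltype);

    if (xc == NULL || yc == NULL || zc == NULL || ltype == NULL) {
        fprintf(stderr, "allocation failed\n");
        return 1;
    }

    /* ... rest of the program would go here ... */

    free(xc);
    free(yc);
    free(zc);
    free(ltype);
    return 0;
}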