lennyw
Hi
I've been trying an experimental app under Eclipse that parses/transforms a
large (96 MB) XML file, first with the Java XSLT library's DOM parser and
then with the SAX parser. The program runs fine with smaller XML input
files but exits with an OutOfMemoryError on the large input file.
That seemed odd to me, because the machine has quite a bit of physical
memory.
My setup is:
WinXP Pro
768 MB RAM
Eclipse launched with -vmargs -Xmx1628M (the biggest value I can set and
still have it load)
JVM 1.4.2_11
WinXP virtual memory: min 1149 MB, max 4095 MB (I also tried "Let system
manage virtual memory")
It's a lightly loaded desktop computer, running little more than the OS,
an antivirus program, and Eclipse.
To get a better idea of what's going on, I wrote a simple heap test
program (below), which allocates blocks of 1,000,000 chars until it
crashes. It consistently crashes at array index i = 32. I find the value
32 a little suspicious, but looking through the test program's project
properties in Eclipse, I don't find any limitation that would explain it.
Can you suggest some specific things to try, or a general strategy I can
use, to isolate the cause of this problem?
Thanks in advance for any help.
Lenny Wintfeld
PS - here's the test program:
public class TestHeapOverflow {
    public static void main(String[] args) {
        // Hold references to each block so nothing can be garbage-collected.
        char[][] arrayArray = new char[2000][];
        int i = 0;
        try {
            for (i = 0; i < 2000; i++) {
                // Each block is 1,000,000 chars (roughly 2 MB).
                arrayArray[i] = new char[1000000];
            }
        } catch (OutOfMemoryError e) {
            // Note: OutOfMemoryError is an Error, not a RuntimeException,
            // so it must be caught explicitly here.
            e.printStackTrace();
        } finally {
            System.out.println("i=" + i);
        }
    }
}
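Incidentally, here is a minimal companion sketch (the class name is mine) that prints the heap ceiling the running JVM actually granted, which should make it easy to see whether an -Xmx setting is taking effect:

```java
// Minimal sketch: report the maximum heap the current JVM will attempt to use.
// Runtime.maxMemory() has been available since Java 1.4, so it should work
// on this setup; run it with and without -Xmx to compare the results.
public class MaxHeapCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("max heap = " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```

If this reports something close to 64 MB rather than the value requested with -Xmx, the VM argument is not reaching the launched program's JVM.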