Simon Andrews
I've got a class which sends multipart MIME data as a POST request to a
web server. It works fine with small files, but larger files (still
<50MB) cause it to throw an OutOfMemoryError:
Exception in thread "Thread-0" java.lang.OutOfMemoryError: Java heap space
I've looked through the relevant code and can't see anywhere that I'm
caching the data being sent, and I really can't see why the memory
consumption would be related to file size.
I've found that if I don't write the file data to the OutputStream
(comment out outStream.write(bytes,0,a)) then it completes OK, so it
looks like the OutputStream is caching something - but I can't see any
option to control this. I've tried flushing after each write, but with
no effect.
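For what it's worth, I've seen suggestions that HttpURLConnection buffers the
entire request body by default so it can work out the Content-Length header,
and that the streaming modes added in Java 5 (setFixedLengthStreamingMode /
setChunkedStreamingMode) are meant to avoid this. A rough sketch of what I
think that would look like is below, though I haven't confirmed this is
actually the cause of my problem:

// Sketch only - assumes Java 5+, and that the buffering is Content-Length related.
// The streaming mode must be set before connect() / getOutputStream().
static void postWithoutBuffering() throws IOException {
    File f = new File("C:/big.xml");
    HttpURLConnection h = (HttpURLConnection) new URL("http://localhost/cgi-bin/test.cgi").openConnection();
    h.setRequestMethod("POST");
    h.setDoOutput(true);
    // Tell the connection the body length up front so it can stream the data
    // instead of holding the whole body in memory to compute Content-Length.
    h.setFixedLengthStreamingMode((int) f.length());
    // Or, if the length isn't known in advance:
    // h.setChunkedStreamingMode(4096);
    h.connect();
    OutputStream out = h.getOutputStream();
    // ...copy the file to 'out' in a loop as in the code below...
}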
A cut-down version of the relevant bit of code is below. Any clues as
to how to debug this would be most welcome.
Cheers
Simon.
import java.io.*;
import java.net.*;

public class TestFileUpload {

    public static void main(String[] args) {

        OutputStream outStream = null;
        try {
            HttpURLConnection h = (HttpURLConnection) new URL("http://localhost/cgi-bin/test.cgi").openConnection();
            h.setAllowUserInteraction(false);
            h.setRequestMethod("POST");
            h.setDoOutput(true);
            h.setUseCaches(false);
            h.connect();
            outStream = h.getOutputStream();
        }
        catch (Exception e) {
            e.printStackTrace();
        }

        int byteCount = 0;
        FileInputStream fi = null;
        try {
            fi = new FileInputStream(new File("C:/big.xml"));
        }
        catch (FileNotFoundException e) {
            e.printStackTrace();
        }

        DataInputStream di = new DataInputStream(fi);
        byte[] bytes = new byte[1024];
        try {
            int a;
            while ((a = di.read(bytes)) > 0) {
                outStream.write(bytes, 0, a);
                byteCount += a;
            }
            outStream.flush();
            outStream.close();
        }
        catch (IOException ioe) {
            ioe.printStackTrace();
        }
    }
}