Oleg Slyusarchuk
Hi,
This is a topic that has been discussed a thousand times, but here it is again.
The classic approach is to use the HttpPostedFile object, check that it is not null, and save it.
Something like:
// Check whether a file was uploaded
if (filMyFile.PostedFile != null)
{
    // Get a reference to the PostedFile object
    HttpPostedFile myFile = filMyFile.PostedFile;
    // Get the size of the uploaded file
    int nFileLen = myFile.ContentLength;
    // Make sure the size of the file is > 0
    if (nFileLen > 0)
    {
        // Allocate a buffer the size of the whole file
        byte[] myData = new byte[nFileLen];
        // Read the uploaded file from the stream into the buffer
        myFile.InputStream.Read(myData, 0, nFileLen);
        // Build a name for the stored file (strip the client-side path)
        string strFilename = Path.GetFileName(myFile.FileName);
        // Write the data into a file (WriteToFile is a helper defined elsewhere)
        WriteToFile(Server.MapPath(strFilename), ref myData);
    }
}
or even easier:
// Check for a valid posted file
if (File1.PostedFile != null)
{
    // PostedFile.FileName gives the entire client-side path;
    // use Substring to strip everything up to the last backslash
    string StrFileName =
        File1.PostedFile.FileName.Substring(File1.PostedFile.FileName.LastIndexOf("\\") + 1);
    string StrFileType = File1.PostedFile.ContentType;

    // Check the file length
    int IntFileSize = File1.PostedFile.ContentLength;
    if (IntFileSize <= 0)
    {
        Response.Write("Uploading of file " + StrFileName + " failed");
    }
    else
    {
        File1.PostedFile.SaveAs(Server.MapPath(".\\" + StrFileName));
        Response.Write("Your file " + StrFileName + " of type " + StrFileType +
            " and size " + IntFileSize.ToString() + " was uploaded successfully");
    }
}
Everything works fine for relatively small files (say, a couple of megabytes).
However, when files become big (say, 100 MB) and a couple of files are being uploaded at the same time, the memory required grows and eventually the system may crash.
QUESTION.
Is there any way to read the file in parts (similar to DB-based ReadChunk methods)?
What would be an appropriate approach in this situation?
Can I get a reference to the request stream before the whole file has been uploaded?
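To make it concrete, here is roughly what I have in mind: an untested sketch that copies the posted file to disk in fixed-size chunks instead of allocating one buffer for the whole file (filMyFile, the 8 KB chunk size, and the target path are just placeholders):

// Untested sketch: copy the upload in fixed-size chunks rather than
// one ContentLength-sized buffer (assumes using System.IO;)
HttpPostedFile myFile = filMyFile.PostedFile;
if (myFile != null && myFile.ContentLength > 0)
{
    string targetPath = Server.MapPath(Path.GetFileName(myFile.FileName));
    byte[] buffer = new byte[8192]; // 8 KB chunk, arbitrary size
    using (Stream input = myFile.InputStream)
    using (FileStream output = File.Create(targetPath))
    {
        int bytesRead;
        // Read may return fewer bytes than requested, so loop on its return value
        while ((bytesRead = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            output.Write(buffer, 0, bytesRead);
        }
    }
}

But as far as I understand, the runtime may still buffer the entire request in memory before the page even runs, which is really what I am asking about.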
Thanks,
Oleg