Tim Greenfield
Hi, I have a problem I'm hoping is not too unusual. I'm trying to push a
large file (50 MB) out to the client from an ASP.NET page. The reason I'm
using a page at all is so I can start reading at a particular offset. This
page needs to be scalable, as it has to support thousands of requests per
hour. I've found two solutions so far, but both have their problems:
1) Using Response.WriteFile(filename, offset, length). This seems like the
perfect solution except that it puts the entire contents of the file in
memory per request... ouch! Anyone know of a way to put the file in some
global cache so it doesn't fill up my RAM per request? I could afford
filling up RAM per file, as there are only a handful of files that will be
downloaded.
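For reference, the call I'm using looks roughly like this (Length is just illustrative here, i.e. whatever remains after the offset):

```vbnet
' Approach 1 (sketch): WriteFile with an offset -- simple, but the whole
' requested range ends up buffered in memory for each request.
Response.ContentType = "application/octet-stream"
Response.WriteFile(Filename, Offset, Length)
```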
2) Writing the file 1K at a time to the output buffer with buffering
disabled. This works great; each request barely takes up any memory or CPU.
However, it ties up one precious ASP.NET thread per request, and it turns
out those threads are very precious... you only get 25 per processor by
default. Within a minute I'm out of threads and users get a "Server Too
Busy" error. I can work around this by adding:
<httpRuntime appRequestQueueLimit="50000" /> to web.config
but all that does is make the 26th user wait for someone to finish their
download before being served, instead of getting an error. Anyone know how
to increase the number of threads, and if so, am I treading on thin ice by
increasing it to some huge number like 1000?
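In case it helps, the 1K-at-a-time version is essentially this (a sketch; the file/offset variables are the same as above):

```vbnet
' Approach 2 (sketch): stream the file 1K at a time with buffering disabled,
' so memory stays flat -- but the worker thread is tied up for the whole
' download.
Response.Buffer = False
Response.ContentType = "application/octet-stream"
Dim fs As New IO.FileStream(Filename, IO.FileMode.Open, _
                            IO.FileAccess.Read, IO.FileShare.Read)
Try
    fs.Seek(Offset, IO.SeekOrigin.Begin)
    Dim buf(1023) As Byte
    Dim n As Integer = fs.Read(buf, 0, buf.Length)
    While n > 0
        Response.OutputStream.Write(buf, 0, n)
        n = fs.Read(buf, 0, buf.Length)
    End While
Finally
    fs.Close()
End Try
```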
Another solution I've been pursuing is to take the best of both worlds. If
I could create a module that uses the 1K-at-a-time technique to avoid
memory hogging, but have it return immediately by using multi-threading, I
think I could solve my problem. Unfortunately, I haven't been able to get
it to work yet.
' from my .aspx file:
Dim sender As New SendFile()
sender.Response = Response
sender.Filename = Filename
sender.Offset = Offset
Dim t As New Threading.Thread(AddressOf sender.Go)
t.Start()
Can anyone offer any wisdom? I can't imagine I'm the first one to need to do
this. Thanks!
-- Tim