Redirecting a big file with Response.BinaryWrite

twiggy182

Hi,

I really need your help because I'm not very familiar with ASP and I
could not find any solution to my problem.

To give you some context, I have a CGI to which I send a file name, and
that script returns the file to me. But for security reasons, I don't want
to publish the address of this CGI, so I encapsulated it in an ASP
file. This way, only the ASP file knows where the CGI is.

Here is my code:
<%
Server.ScriptTimeout = 1000 * 1000
url = "http://address.to.cgi?search_file=" & request("search_file")

set xmlhttp = CreateObject("MSXML2.ServerXMLHTTP")
ResolveTimeout = 5 * 1000
ConnectTimeout = 5 * 1000
SendTimeout = 60 * 1000
ReceiveTimeout = 60 * 1000
xmlhttp.setTimeouts ResolveTimeout, ConnectTimeout, SendTimeout, ReceiveTimeout

xmlhttp.open "GET", url, false
xmlhttp.send ""
respContentType = xmlhttp.getResponseHeader("content-type")
Response.ContentType = respContentType
Response.AddHeader "Content-Disposition", xmlhttp.getResponseHeader("content-disposition")

Response.binaryWrite xmlhttp.responseBody

set xmlhttp = nothing
%>

The problem is that each time I try to copy a file bigger than 20 MB, I
get a "The page cannot be displayed" error. My script is failing at the
"Response.BinaryWrite" line, because if I replace it with ' Response.Write
"Hello World" ', it works.

I tried to break down the xmlhttp.responseBody and send it in many chunks,
but I don't know how to do that :(

Any suggestion?

Thanks in advance
 
Anthony Jones

You may be hitting the response buffering limit on the server. How big a
resource are you expecting to send? Is it possible to increase the buffer
limit? Try Response.Write UBound(xmlhttp.responseBody); this will confirm
that the CGI GET has been successful and show the actual size of the
resource being fetched.
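
Something along these lines (untested, and assuming the xmlhttp object from
your script) would show you both at once:

' Quick diagnostic: did the CGI GET succeed, and how big is the payload?
If xmlhttp.status = 200 Then
    Response.Write "Fetched " & (UBound(xmlhttp.responseBody) + 1) & " bytes from the CGI"
Else
    Response.Write "CGI request failed with status " & xmlhttp.status
End If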

If it's not possible to increase the buffer limit, then one alternative might
be to use an ADODB.Stream object:

'Warning: air code.

Const clChunkSize = 1048576 ' 1MB
Response.Buffer = False

' Set response headers here

Dim oStream : Set oStream = Server.CreateObject("ADODB.Stream")

oStream.Type = 1 ' adTypeBinary
oStream.Open
oStream.Write xmlhttp.responseBody
Set xmlhttp = Nothing ' Hopefully this will release some significant memory
oStream.Position = 0

' Send the stream back to the client in chunks so the whole response never
' has to sit in the ASP buffer
Do Until oStream.EOS
    Response.BinaryWrite oStream.Read(clChunkSize)
Loop
oStream.Close
 
twiggy182

Hi Anthony,

Thank you so much! Your air code is working like a charm!

Actually, I tried almost the same code yesterday, but I was getting a
"Type Mismatch" error when calling "Response.BinaryWrite". It could be
because I was calling "Response.Flush" in the loop...

Anyway, thanks again. And for your information (and maybe for other
people facing this problem), I was trying to copy files of 40 MB. I
first had to increase the "AspBufferingLimit" in IIS to something around
50 MB. That fixed the buffering error I was getting, but at that point I
was still only able to download files smaller than 20 MB (go figure why
20). This new code seems to work around that limit.
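
If it helps anyone, I believe the limit can also be changed from the command
line on IIS 6 with adsutil.vbs, something along these lines (path and exact
syntax from memory, so double-check it):

cscript %SystemDrive%\Inetpub\AdminScripts\adsutil.vbs set w3svc/AspBufferingLimit 52428800

(52428800 bytes = 50 MB; the new value may only take effect after restarting IIS.)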

Another interesting point: I was facing this problem on IIS 6 running
on a Win2000 server, but on my desktop with IIS 5 it was working
fine, even when an external PC connected to my desktop.

OK, Merry Christmas ;)

Bye

 
twiggy182

Hi,

I'm now checking whether it would be possible to improve my code, because
right now it takes twice as long to download a file (copy from the CGI
to ASP, then copy from ASP to the user).

So is it possible to get a stream from the xmlhttp object? Basically,
I want a loop that extracts 1 MB from "xmlhttp" and then pushes it to the
"Response".

Is it possible?
Thanks :)
 
Anthony Jones

No. Data isn't available from XMLHTTP until the entire response from the
remote server has been received. I've even tried using the underlying
WinHTTP component, which in async mode does deliver events as data is
received; however, on trying to forward that data to the client I found that
chunks of data were being lost. I suspect that this may be due to WinHTTP
not being re-entrant, IOW using it asynchronously in this way is fine so long
as you don't make other WinHTTP calls during the data-received events.

The only other option that might be available is if the remote resource would
accept byte-range headers. You could then use xmlhttp in a loop, pulling
chunks of the resource one at a time and writing each one out to the Response
(see the sketch below). It's pretty hairy stuff and is highly unlikely to
work on a resource supplied by a CGI.
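
Roughly, and assuming the CGI honoured Range requests (more air code, and the
url variable is the one from your original script):

Const clChunkSize = 1048576 ' 1MB
Dim lStart : lStart = 0
Dim oHttp

Do
    Set oHttp = Server.CreateObject("MSXML2.ServerXMLHTTP")
    oHttp.open "GET", url, False
    ' Ask for just the next 1MB slice of the resource
    oHttp.setRequestHeader "Range", "bytes=" & lStart & "-" & (lStart + clChunkSize - 1)
    oHttp.send ""

    ' Anything other than 206 Partial Content means ranges aren't being honoured
    If oHttp.status <> 206 Then Exit Do

    Response.BinaryWrite oHttp.responseBody
    Response.Flush

    ' A short chunk means we've reached the end of the resource
    If UBound(oHttp.responseBody) + 1 < clChunkSize Then Exit Do
    lStart = lStart + clChunkSize
Loop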

Isn't it possible to mitigate the problem by caching the response on your
server?
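
If the same files do get requested more than once, something like this might
do it (air code yet again, using ADODB.Stream's SaveToFile/LoadFromFile, a
hypothetical "cache" folder under the web root, and the xmlhttp fetch from
your original script):

Dim sCacheFile : sCacheFile = Server.MapPath("cache") & "\" & Request("search_file")
' NB: Request("search_file") would need sanitising before being used in a path

Dim oFSO : Set oFSO = Server.CreateObject("Scripting.FileSystemObject")
Dim oStream : Set oStream = Server.CreateObject("ADODB.Stream")
oStream.Type = 1 ' adTypeBinary
oStream.Open

If oFSO.FileExists(sCacheFile) Then
    ' Serve the locally cached copy instead of calling the CGI again
    oStream.LoadFromFile sCacheFile
Else
    ' First request for this file: fetch from the CGI and keep a copy on disk
    oStream.Write xmlhttp.responseBody
    oStream.SaveToFile sCacheFile, 2 ' adSaveCreateOverWrite
    oStream.Position = 0
End If

' ...then send oStream to the client in chunks, exactly as before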
 
twiggy182

Hi,

Thanks for the information. We will have to live with average
performance...

As for caching stuff on the server, it would not work because there are too
many different requests, and the performance gain would only apply to rare
repeated requests.

Thanks again!
 
