ServerXMLHTTP uses 100% CPU for a long time

Ed McNierney

I'm trying to use ServerXMLHTTP on an ASP (not ASP.NET) page to retrieve
large binary data from a remote server. When the request is large (more than
a few megabytes), the ServerXMLHTTP page jumps to nearly 100% CPU utilization
for an unusually long time. The remote server needs a few seconds to prepare
the request, during which time the CPU seems OK. It seems that as soon as
the data is ready to retrieve, the CPU usage jumps and remains that way until
the data has all been copied to the requesting server. That takes way too
long - about 35 seconds when requesting a 12 MB file over a gigabit Ethernet.

I use ServerXMLHTTP hundreds of thousands of times daily on this same system
on the same network, with absolutely no problem - but for smaller requests.
There's something about the size of the request that makes it blow up.

I saw some reports of older systems with this problem (Windows 2000), but
I'm running IIS 6 on Windows Server 2003, SP1. Thanks!
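For context, a minimal sketch of the kind of ASP page being described here; the ProgID, URL, and content type are illustrative assumptions, not the actual code:

<%
' Minimal sketch of the pattern described above: fetch a large binary file
' from a remote HTTP server and hand it to the client in one call.
Dim xhr
Set xhr = Server.CreateObject("MSXML2.ServerXMLHTTP.6.0")
xhr.open "GET", "http://imageserver.example.com/bigimage.jpg", False
xhr.send

If xhr.status = 200 Then
    Response.ContentType = "image/jpeg"
    Response.BinaryWrite xhr.responseBody   ' the entire ~12 MB body in one call
End If
Set xhr = Nothing
%>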
 
Bob Barrows [MVP]

Ed said:
I'm trying to use ServerXMLHTTP on an ASP (not ASP.NET) page to
retrieve large binary data from a remote server. When the request is
large (more than a few megabytes), the ServerXMLHTTP page jumps to
nearly 100% CPU utilization for an unusually long time. The remote
server needs a few seconds to prepare the request, during which time
the CPU seems OK. It seems that as soon as the data is ready to
retrieve, the CPU usage jumps and remains that way until the data has
all been copied to the requesting server. That takes way too long -
about 35 seconds when requesting a 12 MB file over a gigabit
Ethernet.

I use ServerXMLHTTP hundreds of thousands of times daily on this same
system on the same network, with absolutely no problem - but for
smaller requests. There's something about the size of the request
that makes it blow up.

I saw some reports of older systems with this problem (Windows 2000),
but I'm running IIS 6 on Windows Server 2003, SP1. Thanks!

Reminds me of the oldie, but goodie:

Patient: Doctor, it hurts when I raise my arm
Doctor: So stop raising your arm!
;-)

Sounds to me as if a different technology is needed for this - perhaps FTP?
Bob Barrows
 
Ed McNierney

Bob -

Thanks for the quick reply!

First, I'd like to understand the problem, not ignore it - ignoring it won't
get it fixed.

Second, I don't have the option of a different technology. The service that
produces these files (they're images, generated on the fly from an HTTP
request) serves them via an HTTP interface only, not FTP or anything else.

I did a lot of searching and cannot find any other example of this problem
(other than old ones). The "alternative technology" available to me is to
move this portion of the site to a Linux server, where my older PHP code
works flawlessly. The intent was to move the entire site to Windows, but if
Windows can't cut it, I'll need to stick to Linux.
 
Bob Barrows [MVP]

From
http://support.microsoft.com/default.aspx?scid=/servicedesks/webcasts/wc052802/WCT052802.asp:

... Another limitation, which we touched on earlier, is that WinInet doesn't
handle some of the higher-level content-related services with regard to HTTP
data. Some of those things are handled by URLMON. Particularly, URLMON
implements MIME type detection and implements HTTP compression.
HTTP compression is a unique technology on your server that says, "Please
gzip this data, compress it before it gets sent to the client." The client
sees it, sees the header indicating that it's gzipped content, and
decompresses it before displaying. If you have a large amount of content
you're sending, then the cost of performing this compression and
decompression can be much less than the cost of transmitting the
uncompressed content down from the server to the client. However, this is
implemented at the URLMON level. Because ServerXMLHTTP doesn't use URLMON,
it goes through WinHTTP, it uses a more bare-bones interface, it can't
handle HTTP compression and, again, there is no MIME type detection at all.
Use that at your own risk and your own best judgment.
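To make that concrete (a sketch, not part of the webcast): ServerXMLHTTP will not negotiate or undo compression on its own, so if you request gzip manually the response body arrives still compressed and your own code would have to decode it:

<%
' Sketch only: ServerXMLHTTP (WinHTTP-based) does not handle HTTP compression.
Dim xhr
Set xhr = Server.CreateObject("MSXML2.ServerXMLHTTP.6.0")
xhr.open "GET", "http://example.com/largefile", False
xhr.setRequestHeader "Accept-Encoding", "gzip"   ' opt in manually
xhr.send
' If the server honored the header, getResponseHeader("Content-Encoding")
' is "gzip" and responseBody still holds the raw compressed bytes.
%>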

However, according to this:
http://groups.google.com/group/micr...MLHTTP+100%+CPU&rnum=4&hl=en#6c4482f75218b1b1

There is a known performance issue that was fixed in SP3 for MSXML 3.0.

What version of MSXML are you using?
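For reference, ServerXMLHTTP binds to whichever MSXML version the ProgID names, so a fix shipped in MSXML 3.0 SP3 only helps if the page actually creates the 3.0 object. The standard ProgIDs look like this:

<%
' Each ProgID loads a specific MSXML DLL; service-pack fixes apply only
' to the version actually instantiated.
Set xhr3 = Server.CreateObject("Msxml2.ServerXMLHTTP")      ' MSXML 3.0 (msxml3.dll)
Set xhr4 = Server.CreateObject("Msxml2.ServerXMLHTTP.4.0")  ' MSXML 4.0 (msxml4.dll)
Set xhr6 = Server.CreateObject("Msxml2.ServerXMLHTTP.6.0")  ' MSXML 6.0 (msxml6.dll)
%>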
 
Ed McNierney

Bob -

Thanks again for the quick replies. There is no HTTP compression involved,
and I was running on MSXML 4.0 and then upgraded to MSXML 6.0 to see if this
bug was fixed. There was no apparent difference in behavior between 4.0 and
6.0.

I did read the item you mention about the MSXML 3.0 bug because the symptom
sounds virtually identical. But I have found no mention of a similar bug in
4.0 or 6.0, which I would have expected if there was regression from 3.0
(e.g. if the SP3 bug fix never made it to 4.0).

- Ed
 
Bob Barrows [MVP]

I think what he was saying is that with URLMon, HTTP compression is used
automatically, reducing the download time. With WinHTTP, which ServerXMLHTTP
uses, it can't be.

Otherwise, I am out of my depth there. You may want to try the
.inetserver.iis group (or even one of the xml groups) if nobody else steps
up to the plate here.

If you do get a solution, I would appreciate hearing about it.

Bob
 
Egbert Nierop (MVP for IIS)

Ed McNierney said:
Bob -

Thanks again for the quick replies. There is no HTTP compression involved,
and I was running on MSXML 4.0 and then upgraded to MSXML 6.0 to see if this
bug was fixed. There was no apparent difference in behavior between 4.0 and
6.0.

I did read the item you mention about the MSXML 3.0 bug because the symptom
sounds virtually identical. But I have found no mention of a similar bug in
4.0 or 6.0, which I would have expected if there was regression from 3.0
(e.g. if the SP3 bug fix never made it to 4.0).

Hi Ed,

You can use some alternatives: ADODB.Record and ADODB.Stream can do HTTP
uploads and log on to remote pages.

Additionally, you should always send large data chunked, in loops, in blocks
of, say, 4096 KB. Within the loop, test for connectivity issues.
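A sketch of the chunk-in-a-loop idea applied to the download side of the page; the object names, URL, chunk size, and content type below are assumptions, not Egbert's code:

<%
' Sketch: buffer the downloaded bytes in an ADODB.Stream, then relay them
' to the browser in fixed-size blocks, testing the client connection
' inside the loop as suggested.
Const adTypeBinary = 1
Const CHUNK_BYTES = 65536        ' block size is an arbitrary choice here

Dim xhr, stm, chunk
Set xhr = Server.CreateObject("MSXML2.ServerXMLHTTP.6.0")
xhr.open "GET", "http://imageserver.example.com/render?id=12345", False
xhr.send

If xhr.status = 200 Then
    Set stm = Server.CreateObject("ADODB.Stream")
    stm.Type = adTypeBinary
    stm.Open
    stm.Write xhr.responseBody   ' copy the downloaded bytes once
    stm.Position = 0

    Response.Buffer = False
    Response.ContentType = "image/png"

    Do While Not stm.EOS
        If Not Response.IsClientConnected Then Exit Do   ' connectivity test
        chunk = stm.Read(CHUNK_BYTES)
        Response.BinaryWrite chunk
    Loop

    stm.Close
    Set stm = Nothing
End If
Set xhr = Nothing
%>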
 
Bob Barrows [MVP]

Egbert said:
Hi Ed,

You can use some alternatives: ADODB.Record and ADODB.Stream can do HTTP
uploads and log on to remote pages.

Additionally, you should always send large data chunked, in loops, in blocks
of, say, 4096 KB. Within the loop, test for connectivity issues.

Both good suggestions. I wish I had thought of making them.

Bob
 
