Simon Wadsworth
My application uses VB6 WebClasses to handle the UI, so all requests come in
via a stub ASP page.
I would like to know the time taken for each request to be processed. I am
trying to use the time-taken value in the IIS log files, but I am unclear as
to the precise meaning of the value recorded.
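For reference, this is the shape of entry I am looking at (a made-up example,
with a reduced field set; on IIS 5 the time-taken field is logged in
milliseconds):

    #Fields: date time c-ip cs-method cs-uri-stem sc-status sc-bytes time-taken
    2004-02-10 14:32:07 192.168.1.50 GET /myapp/stub.asp 200 18432 1250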
Using a test WebClass on a development PC (Win2K Pro/SP4), the value recorded
in the IIS log files seems to vary with buffering (the stub page sketch below
shows where this flag is set):
1. If Response.Buffer=True, then the time recorded is the amount of time
spent in the WebClass call.
2. If Response.Buffer=False, then the time recorded is the amount of time
spent in the WebClass call PLUS the amount of time required to send the
response data to the client browser.
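For context, the stub page is essentially the standard shim that VB6
generates; a minimal sketch (the project/class name is a placeholder) looks
like this, with the buffering flag set right at the top:

    <%
    Response.Buffer = True   ' the flag in question
    Response.Expires = 0

    ' Create the shared WebClass manager on first use
    If (VarType(Application("~WC~WebClassManager")) = 0) Then
        Application.Lock
        If (VarType(Application("~WC~WebClassManager")) = 0) Then
            Set Application("~WC~WebClassManager") = _
                Server.CreateObject("WebClassRuntime.WebClassManager")
        End If
        Application.UnLock
    End If

    ' Hand the request off to the WebClass
    Application("~WC~WebClassManager").ProcessNoStateWebClass _
        "MyProject.MyWebClass", Server, Application, Session, Request, Response
    %>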
However, on our production servers Response.Buffer=True is always used, but
the times recorded seem to indicate that they include the data transfer time
as well: low-bandwidth client accesses show a larger time-taken than local
LAN users for an equivalent volume of data.
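One cross-check I am considering (only a sketch, and it assumes the WebClass
does not call Response.End) is to time the WebClass call in the stub itself
and append the elapsed time to the log entry with Response.AppendToLog:

    <%
    Dim tStart, tElapsed
    Response.Buffer = True
    tStart = Timer   ' seconds since midnight

    ' (WebClassManager creation omitted - same as the standard stub)
    Application("~WC~WebClassManager").ProcessNoStateWebClass _
        "MyProject.MyWebClass", Server, Application, Session, Request, Response

    ' Elapsed time in ms, appended to the IIS log entry for this request
    tElapsed = (Timer - tStart) * 1000
    Response.AppendToLog "wcms=" & CLng(tElapsed)
    %>

If this logged value stays roughly constant while time-taken grows for slow
clients, then time-taken is evidently including the transfer time.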
Additionally, it was my understanding that when buffering was used with an
ASP page the sc-bytes value was not recorded. Yet on the production server
this value IS being recorded and seems accurate.
The production servers are Win2K Advanced Server/SP4, using HTTPS, Basic
Authentication and Certificates.
Does anyone have any definitive information on this?
Many Thanks