Guest
We have a web application about to go live with a customer. The app is
ASP.NET 1.1 Framework with an Oracle DB backend. We ran performance
tests on a W2K (IIS 5.0) test web server at our site (the database runs on
a separate server). We then installed at the customer site for testing on their
servers (a far bigger web server running Windows 2K3 and IIS 6.0, with a far
bigger DB server). To our amazement, the overall performance of the system
running on their servers was way below what we had seen on our far smaller
servers.
We eventually traced the problem to the performance of our ASP.NET
application (rather than to network/hardware or database performance). We
found that the time between "Begin SaveViewState" and "End
SaveViewState" for the same page differed between IIS 5 and IIS 6. In
IIS 5 this part of the trace accounted for approx 5% of the total trace time,
but in IIS 6 it accounted for 60% (the test page was exactly the same,
with the same data, in each test).
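For reference, the SaveViewState timings above come from the standard ASP.NET page trace. A minimal web.config fragment of the kind that enables it (the attribute values here are illustrative, not our exact settings) would be:

```xml
<!-- Enable ASP.NET 1.1 request tracing; results are viewable at /trace.axd.
     pageOutput="false" keeps the trace table off the rendered page. -->
<configuration>
  <system.web>
    <trace enabled="true" requestLimit="50" pageOutput="false" localOnly="true" />
  </system.web>
</configuration>
```

The "Begin SaveViewState" / "End SaveViewState" rows appear in the Trace Information section for each request.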
In a desperate move we switched IIS 6.0 to run in IIS 5.0 isolation mode.
Suddenly the performance of all our pages on the new server improved from 70
requests/sec to 330 requests/sec.
Our app is pure ASP.NET and does not use any COM components. IIS 6 on the
W2K3 server was set up by our customer's IT staff with default settings. It is
an intranet app, and the size of the ViewState in each IIS test was the
same and not terribly large.
Does anyone have any ideas why this should be? Our app is now faster, but we
would like to understand why.