Tod Birdsall
Hi All,
The organization I am working for has created a new corporate website
that uses Microsoft's Pet Shop reference application as its coding
model. It serves content dynamically, but caches each page by content
ID. The site appears to be working well. It is hosted on a Windows
Server 2003 machine with 2 GB of RAM, was built using Visual Studio
2005, and is running under the .NET Framework 2.0 Beta.
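For context, the per-page caching is essentially ASP.NET output
caching varied by a content ID parameter; something along these lines
(the parameter name "contentId" is illustrative, not our actual name):

    <%@ OutputCache Duration="3600" VaryByParam="contentId" %>

Each distinct content ID gets its own cached copy of the rendered page
for the cache duration.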
The Issue:
When we surf the website, we see a w3wp.exe process start up, and the
memory allocated to this worker process continues to climb as more and
more people access the site. That would appear to be normal, but the
memory is not noticeably released afterward.
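To check whether the growth is even on the managed heap, one thing we
can do is sample the CLR heap counter next to the process's private
bytes. A minimal sketch (the instance name "w3wp" is an assumption; it
becomes "w3wp#1" and so on when multiple worker processes run):

    using System;
    using System.Diagnostics;
    using System.Threading;

    class MemoryWatch
    {
        static void Main()
        {
            // Size of the managed (GC) heaps in the worker process.
            PerformanceCounter managed = new PerformanceCounter(
                ".NET CLR Memory", "# Bytes in all Heaps", "w3wp");
            // Total private memory the OS has committed to the process.
            PerformanceCounter privateBytes = new PerformanceCounter(
                "Process", "Private Bytes", "w3wp");

            while (true)
            {
                Console.WriteLine("Managed: {0:N0}  Private: {1:N0}",
                    managed.NextValue(), privateBytes.NextValue());
                Thread.Sleep(5000);
            }
        }
    }

If the managed figure stays flat while Private Bytes climbs, the
growth is outside the GC heap, or the GC simply has no pressure to
return pages to the OS.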
To give you an example, we ran a test using WAPT. We had 100 users
access the site 5 seconds apart, with a delay of 7 seconds on each
page they accessed. We ran this for about 20 minutes, and w3wp.exe
allocated about 1.1 GB of RAM. That seems like a bad thing, but
perhaps not out of the norm.
For comparison, we ran a similar test against a very vanilla ASP.NET
site: its w3wp.exe memory allocation grew more slowly, but it was not
released either.
I ran CLR Profiler against the new corporate website and found what
appeared to be a small memory leak, which we eliminated. System.String
accounts for around 25%-30% of allocations according to the Objects By
Address portion of the CLR Profiler report, but this seems to be the
norm, since the same report for the vanilla website mentioned above
looks much the same. Also per the Objects By Address report, very
little is surviving to gen 2, so I think we can rule out a managed
memory leak. We do not appear to be doing any string concatenation
without using StringBuilder.
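For example, our page-assembly loops follow this pattern (ContentBlock
is a made-up placeholder for whatever renders a fragment of the page,
not our actual type):

    using System.Collections.Generic;
    using System.Text;

    // Placeholder for anything that renders a fragment of the page.
    interface ContentBlock { string Render(); }

    static class PageRenderer
    {
        static string RenderPage(IList<ContentBlock> blocks)
        {
            // One builder and one final allocation, instead of
            // "html += block.Render()", which copies the whole
            // string on every pass through the loop.
            StringBuilder html = new StringBuilder();
            foreach (ContentBlock block in blocks)
            {
                html.Append(block.Render());
            }
            return html.ToString();
        }
    }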
My Question:
Is this the norm? Have you experienced similar results? What was your
solution? Are Application Pools the answer? If so, what should the
settings be? Is there a document that would help us adjust the
settings?
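For reference, I believe IIS 6 can recycle the worker process once it
crosses a memory threshold, set through the metabase; something like
this (the pool name "MyAppPool" and the 800000 KB threshold are
placeholders, not recommendations):

    cscript %SystemDrive%\Inetpub\AdminScripts\adsutil.vbs SET ^
        W3SVC/AppPools/MyAppPool/PeriodicRestartPrivateMemory 800000

That would mask the growth rather than explain it, though, which is
why I am asking what the right settings are.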
Thank you in advance for taking the time to read through this post.
Tod Birdsall
http://tod1d.blogspot.com