Improving ASP.NET Performance Part 8: Caching

In this article we'll discuss caching. When used correctly, caching avoids unnecessary database lookups and other expensive operations, and it also reduces latency. The ASP.NET cache is a simple, scalable, in-memory caching service provided to ASP.NET applications. It provides a time-based expiration facility, and it tracks dependencies on external files, directories, or other cache keys. It also provides a mechanism to invoke a callback function when an item expires in the cache. The cache automatically removes items based on a least recently used (LRU) algorithm, a configured memory limit, and the CacheItemPriority enumerated value of the items in the cache. Cached data is also lost when your application or worker process recycles.

ASP.NET provides three caching techniques that we’ll summarize in the sections below.

Cache API

Use the cache API to programmatically cache application-wide data that is shared and accessed by multiple users. The cache API is also a good place for data that you need to manipulate in some way before presenting it to the user. This includes data such as strings, arrays, collections, and data sets.

A good example of where to use the cache API is a product catalog. Product catalogs are good candidates because the data typically needs to be updated at specific intervals, shared across the application, and manipulated before the content is sent to the client.

You should avoid using the cache API in the following circumstances:

·         The data you are caching is user-specific. Consider using session state instead.

·         The data is updated in real time.

·         Your application is already in production, and you do not want to update the code base. In this case, consider using output caching.

The cache API permits you to insert items into the cache that have a dependency on external conditions. Cached items are automatically removed from the cache when those external conditions change. You use this feature by passing an instance of the CacheDependency class as the third parameter of the Cache.Insert method. The CacheDependency class has eight constructors that support various dependency scenarios, including dependencies on files and directories, on other cache keys, on a start time, and on existing CacheDependency instances (aggregate dependencies).
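
For example, the following sketch caches a product catalog DataSet with a dependency on an XML file, so the entry is evicted automatically whenever the file changes. The cache key, file name, and method name are illustrative, not part of any standard API.

// Minimal sketch, inside a Page class. Requires: using System.Web.Caching;
private void CacheProductCatalog(DataSet productCatalog)
{
    string catalogPath = Server.MapPath("catalog.xml");   // illustrative file name

    Cache.Insert(
        "ProductCatalog",                 // cache key (illustrative)
        productCatalog,                   // item to cache
        new CacheDependency(catalogPath), // evict the entry when the file changes
        DateTime.Now.AddMinutes(30),      // absolute expiration
        Cache.NoSlidingExpiration,        // no sliding expiration
        CacheItemPriority.Normal,
        null);                            // no removal callback
}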

You can also run code before serving data from the cache. For example, you might want to serve cached data for certain customers, but for others you might want to serve data that is updated in real time. You can perform this type of logic by using the HttpCachePolicy.AddValidationCallback method.
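
The following is a minimal sketch of such a callback; the "Premium" cookie check is hypothetical. The callback tells ASP.NET to bypass the cached copy for those users and regenerate the page for them.

// Minimal sketch, inside a Page class. Requires: using System; using System.Web;
private void Page_Load(object sender, EventArgs e)
{
    Response.Cache.AddValidationCallback(
        new HttpCacheValidateHandler(ValidateCachedOutput), null);
}

private static void ValidateCachedOutput(HttpContext context, object data,
    ref HttpValidationStatus status)
{
    // Premium customers get freshly generated content; everyone else gets the cached copy.
    status = (context.Request.Cookies["Premium"] != null)
        ? HttpValidationStatus.IgnoreThisRequest
        : HttpValidationStatus.Valid;
}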

Output Caching

The output cache enables you to cache the contents of entire pages for a specific duration. It enables you to cache multiple variations of a page based on query strings, headers, and user agent strings. The output cache also enables you to determine where the content is cached, for example on a proxy server, on the Web server, or on the client. Like the cache API, output caching saves the time spent retrieving data; it also saves the time spent rendering content. You should enable output caching on dynamically generated pages that do not contain user-specific data, in scenarios where you do not need to update the view on every request.

An example of when you would want to use the output cache is reporting. Reports that have a low number of variations are good candidates because you save the cost of retrieving and processing the data each time the page is accessed.
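
For example, a report page that looks the same for every user can be cached with a single directive; the 60-second duration is illustrative.

<%@ OutputCache Duration="60" VaryByParam="none" %>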

Avoid using output caching in the following circumstances:

·         You need programmatic access to the data on your page. Consider using the cache API instead.

·         The number of page variants becomes too large.

·         The page contains a mixture of static, dynamic, and user-specific data. Consider using fragment caching instead.

·         The page contains content that must be refreshed with every view.

Partial Page or Fragment Caching

Partial page or fragment caching is a subset of output caching. It adds the VaryByControl attribute, which allows you to cache variations of a user control (.ascx file) based on the values of its properties.
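
A minimal sketch of a cached user control directive; the ProductList control and its Category property are hypothetical. One cached copy is kept per distinct value of the property.

<%-- ProductList.ascx (hypothetical) --%>
<%@ Control %>
<%@ OutputCache Duration="120" VaryByControl="Category" %>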

Fragment caching is implemented by using user controls in conjunction with the @OutputCache directive. Use fragment caching when caching the entire content of a page is not practical. If you have a mixture of static, dynamic, and user-specific content in your page, partition your page into separate logical regions by creating user controls. These user controls can then be cached, independent of the main page, to reduce processing time and to increase performance.

Navigation menus are a good example of a scenario that suits fragment caching. Menus that are not user-specific are good candidates because they are rendered with each request and are often static.

You should avoid using fragment caching under the following conditions:

·         The number of page variants becomes too large.

·         The cached user controls contain content that must be refreshed with every view.

If your application uses the same user control on multiple pages, make those pages share the same cached instance by setting the Shared attribute of the user control's @OutputCache directive to true. This can save a significant amount of memory.
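
For example, a navigation menu user control could declare the following directive so that every page reusing the control shares a single cached copy; the 60-second duration is illustrative.

<%@ OutputCache Duration="60" VaryByParam="none" Shared="true" %>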

Caching Guidelines

In the sections below we'll discuss best-practice guidelines for designing a caching strategy.

Separate Dynamic Data from Static Data in Your Pages

Partial page caching enables you to cache parts of a page by using user controls. Use user controls to partition your page. For example, consider the following simple page which contains static, dynamic, and user-specific information.





<table>
  <tr><td colspan=3>Application Header – Welcome John Smith</td></tr>
  <tr><td>Menu</td><td>Dynamic Content</td><td>Advertisements</td></tr>
  <tr><td colspan=3>Application Footer</td></tr>
</table>



You can partition and cache this page by using the following code:


<%@ Register TagPrefix="app" TagName="header" src="header.ascx" %>
<%@ Register TagPrefix="app" TagName="menu" src="menu.ascx" %>
<%@ Register TagPrefix="app" TagName="advertisements" src="advertisements.ascx" %>
<%@ Register TagPrefix="app" TagName="footer" src="footer.ascx" %>

<table>
  <tr><td colspan=3><app:header runat="server" /></td></tr>
  <tr><td><app:menu runat="server" /></td><td>Dynamic Content</td><td><app:advertisements runat="server" /></td></tr>
  <tr><td colspan=3><app:footer runat="server" /></td></tr>
</table>

The user controls are defined as follows, in the same order they are registered above. The header is not output cached because it contains user-specific content; the other controls declare their own cache durations.

header.ascx:

<%@ Control %>
Application Header – Welcome <%= GetName() %>

menu.ascx:

<%@ Control %>
<%@ OutputCache Duration="30" VaryByParam="none" %>

advertisements.ascx:

<%@ Control %>
<%@ OutputCache Duration="30" VaryByParam="none" %>

footer.ascx:

<%@ Control %>
<%@ OutputCache Duration="60" VaryByParam="none" %>


By partitioning the content, as shown in the sample, you can cache selected portions of the page to reduce processing and rendering time.

Configure the Memory Limit

Configuring and tuning the memory limit is critical for the cache to perform optimally. The ASP.NET cache starts trimming items, based on an LRU algorithm and the CacheItemPriority value assigned to each item, once memory consumption comes within 20 percent of the configured memory limit. If the memory limit is set too high, the process may be recycled unexpectedly, and your application might also experience out-of-memory exceptions. If the memory limit is set too low, the time spent performing garbage collections can increase, which decreases overall performance.

Empirical testing shows that the likelihood of receiving out-of-memory exceptions increases when private bytes exceed 800 megabytes (MB). Note that the 800 MB figure applies only to .NET Framework 1.0; on .NET Framework 1.1 with the /3GB switch enabled, you can go up to 1,800 MB.

When using the ASP.NET process model, you configure the memory limit in the Machine.config file as follows.

<processModel memoryLimit="50">

This value controls the percentage of physical memory that the worker process is allowed to consume; the process is recycled if that limit is exceeded. In the previous sample, if the server has 2 gigabytes (GB) of RAM, the process recycles once the memory used by the worker process exceeds 50 percent of the physical RAM, in this case 1 GB. You can monitor the worker process memory by using the Process performance counter object and the Private Bytes counter.
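
As a rough sketch, you can also read the counter programmatically; the worker process instance name depends on your process model, and "aspnet_wp" is assumed here.

// Minimal sketch: read the worker process's Private Bytes counter.
// The instance name "aspnet_wp" is an assumption; adjust it for your setup.
// Requires: using System; using System.Diagnostics;
using (PerformanceCounter privateBytes =
           new PerformanceCounter("Process", "Private Bytes", "aspnet_wp"))
{
    Console.WriteLine("Worker process private bytes: {0:N0}", privateBytes.NextValue());
}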

Cache the Right Data

It is important to cache the right data. If you cache the wrong data, you may adversely affect performance. Cache application-wide data and data that is used by multiple users. Cache static data and dynamic data that is expensive to create or retrieve. Data that is expensive to retrieve and that is modified on a periodic basis can still provide performance and scalability improvements when managed properly. Caching data even for a few seconds can make a big difference to high volume sites. Datasets or custom classes that use optimized serialization for data binding are also good candidates for caching. If the data is used more often than it is updated, it is also a candidate for caching.

Do not cache expensive shared resources, such as database connections, because this creates contention. Avoid storing DataReader objects in the cache because these objects keep the underlying connections open. It is better to pool these resources. Do not cache per-user data that spans requests; use session state for that. If you need to store and pass request-specific data for the life of a single request, instead of repeatedly accessing the database within that request, consider storing the data in the HttpContext.Current.Items collection.
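
A minimal sketch of stashing per-request data in HttpContext.Current.Items; the "Customer" key, the Customer type, and the GetCustomerFromDatabase helper are hypothetical.

// Minimal sketch: fetch once per request, then reuse for the rest of that request.
// Requires: using System.Web;
private Customer GetCurrentCustomer()
{
    Customer customer = HttpContext.Current.Items["Customer"] as Customer;
    if (customer == null)
    {
        customer = GetCustomerFromDatabase();              // expensive lookup, done once per request
        HttpContext.Current.Items["Customer"] = customer;  // visible only to later code in this request
    }
    return customer;
}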

Refresh Your Cache Appropriately

Just because your data updates every ten minutes does not mean that your cache needs to be refreshed every ten minutes. Determine how fresh the data has to be to meet your service level agreements. Avoid repopulating caches for data that changes frequently; such data may not be a good candidate for caching at all.
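
For example, even if the underlying data changes every ten minutes, you might refresh the cached copy only every thirty minutes if that still meets your agreements. A sketch using an absolute expiration; the "ReportData" key and the LoadReportData helper are hypothetical.

// Minimal sketch: refresh the cached report data at most every 30 minutes.
Cache.Insert(
    "ReportData",                  // hypothetical cache key
    LoadReportData(),              // hypothetical expensive load
    null,                          // no dependency
    DateTime.Now.AddMinutes(30),   // absolute expiration controls the refresh interval
    Cache.NoSlidingExpiration);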

Cache the Appropriate Form of the Data

If you want to cache rendered output, you should consider using output caching or fragment caching. If the rendered output is used elsewhere in the application, use the cache API to store the rendered output. If you need to manipulate the data, then cache the data by using the cache API. For example, if you need the data to be bound to a combo box, convert the retrieved data to an ArrayList object before you cache it.
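
A sketch of caching the form of the data you actually bind to; the "CountryList" key, the query, connectionString, and the countryDropDown control are hypothetical.

// Minimal sketch: cache the ArrayList used for data binding instead of the raw reader.
// Requires: using System.Collections; using System.Data.SqlClient;
ArrayList countries = (ArrayList)Cache["CountryList"];
if (countries == null)
{
    countries = new ArrayList();
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand("SELECT Name FROM Countries", conn))
    {
        conn.Open();
        SqlDataReader reader = cmd.ExecuteReader();
        while (reader.Read())
        {
            countries.Add(reader.GetString(0));
        }
        reader.Close();
    }
    Cache.Insert("CountryList", countries, null,
        DateTime.Now.AddHours(1), Cache.NoSlidingExpiration);
}
countryDropDown.DataSource = countries;
countryDropDown.DataBind();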

Use Output Caching to Cache Relatively Static Pages

If your page is relatively static across multiple user requests, consider using page output caching to cache the entire page for a specified duration. You specify the duration based on the nature of the data on the page. A dynamic page does not always have to be rebuilt for every request just because it is a dynamic page. For example, you might be able to cache Web-based reports that are expensive to generate for a defined period. Caching dynamic pages for even a minute or two can increase performance drastically on high volume pages.

If you need to remove an item from the cache instead of waiting until the item expires, you can use the HttpResponse.RemoveOutputCacheItem method. This method accepts an absolute path to the page that you want to remove as shown in the following code fragment.
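
A minimal sketch; the page path shown is illustrative.

// Evict the cached output for a specific page.
HttpResponse.RemoveOutputCacheItem("/Reports/SalesSummary.aspx");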


The caveat here is that this method is specific to a single server, because the output cache is not shared across a Web farm. Also, it cannot be used from a user control.

Choose the Right Cache Location

The @OutputCache directive allows you to determine the cache location of the page by using the Location attribute. The Location attribute provides the following values:

·         Any. This is the default value. The output cache can be located on the browser client where the request originated, on a proxy server or any other server participating in the request, or on the server where the request was processed.

·         Client. The output cache is located on the browser client where the request originated.

·         Downstream. The output cache can be stored in any HTTP 1.1 cache-capable device other than the origin server. This includes proxy servers and the client that made the request.

·         None. The output cache is disabled for the requested page.

·         Server. The output cache is located on the Web server where the request was processed.

·         ServerAndClient. The output cache can be stored only at the origin server or at the requesting client. Proxy servers cannot cache the response.

Unless you know for certain that your clients or your proxy servers will cache responses, it is best to keep the Location attribute set to Any, Server, or ServerAndClient so that the output is at least cached on the Web server. Otherwise, if no downstream cache is available, the attribute effectively negates the benefits of output caching.
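
For example, to keep the cached copy only on the Web server (the duration shown is illustrative):

<%@ OutputCache Duration="60" VaryByParam="none" Location="Server" %>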

Use VaryBy Attributes for Selective Caching

The VaryBy attributes allow you to cache different versions of the same page. ASP.NET provides four VaryBy attributes:

·         VaryByParam. Different versions of the page are stored based on the query string values.

·         VaryByHeader. Different versions of the page are stored based on the specified header values.

·         VaryByCustom. Different versions of the page are stored based on browser type and major version. Additionally, you can extend output caching by defining custom strings (a sketch follows this list).

·         VaryByControl. Different versions of the page are stored based on the property value of a user control. This only applies to user controls.
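
A minimal sketch of a custom vary string. The "role" string and the "Role" cookie are hypothetical; the string is resolved by overriding GetVaryByCustomString in Global.asax.

<%-- In the page; the "role" custom string is hypothetical --%>
<%@ OutputCache Duration="60" VaryByParam="none" VaryByCustom="role" %>

// In Global.asax.cs: resolve the custom string; one cached copy is kept per returned value.
public override string GetVaryByCustomString(HttpContext context, string custom)
{
    if (custom == "role")
    {
        // Hypothetical: vary the cached page by the value of a "Role" cookie.
        HttpCookie roleCookie = context.Request.Cookies["Role"];
        return (roleCookie != null) ? roleCookie.Value : "anonymous";
    }
    return base.GetVaryByCustomString(context, custom);
}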

The VaryBy attributes determine which variations of the page are cached. The following sample shows how to use the VaryByParam attribute.

<%@ OutputCache Duration="30" VaryByParam="a" %>

The setting shown in the previous sample would make the following pages have the same cached version:

·         http://localhost/cache.aspx?a=1

·         http://localhost/cache.aspx?a=1&b=1

·         http://localhost/cache.aspx?a=1&b=2

If you add b to the VaryByParam attribute, you would have three separate versions of the page rather than one. It is important to be aware of how many variations of the cached page could exist. If you have two variables, a and b, where a has 5 possible values and b has 10 possible values, you can calculate the total number of cached page variations by using the following formula:

(MAX a × MAX b) + (MAX a + MAX b) = (5 × 10) + (5 + 10) = 65 total variations

When you make the decision to use a VaryBy attribute, make sure that there are a finite number of variations because each variation increases the memory consumption on the Web server.