In this article of our ASP.NET performance series, we’ll address the best practices for Data Access. Almost all ASP.NET applications use some form of data access. Data access is typically a focal point for improving performance because the majority of application requests require data that comes from a database.
Best Practices and Guidelines
We’ll point out the best practices and guidelines in the sections below.
Use Paging for Large Result Sets
Paging large query result sets can significantly improve the performance of an application. If you have large result sets, implement a paging solution that achieves the following:
· Reduces back-end work on the database.
· Reduces the size of the data sent to the client.
· Limits the amount of work the client must do.
Several paging solutions are available, and each one addresses the problems of a specific scenario. We summarize the solutions in the next paragraphs.
A relatively quick and easy solution is the automatic paging provided by the DataGrid control. However, automatic paging retrieves the entire result set for every page request, so it is not suitable for large tables. The custom paging approach works for tables that have a unique incrementing column: you set the AllowPaging and AllowCustomPaging properties to true, and then set the PageSize and VirtualItemCount properties. From these you calculate the StartIndex (the last browsed row) and NextIndex (StartIndex + PageSize) values, and use them as the range for the identity column when retrieving the requested page. This solution does not cache data; it pulls only the relevant records across the network.
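The custom paging approach described above can be sketched as follows. This is a minimal illustration, not production code: the Orders table, OrderID identity column, connectionString field, and GetTotalRowCount helper are all assumptions made for the example.

```csharp
// Sketch of DataGrid custom paging against a table with an
// incrementing identity column. Table, column, and helper names
// (Orders, OrderID, connectionString, GetTotalRowCount) are illustrative.
using System;
using System.Data;
using System.Data.SqlClient;

public partial class OrdersPage : System.Web.UI.Page
{
    const int PageSize = 20;

    void BindGrid(int startIndex)
    {
        OrdersGrid.AllowPaging = true;
        OrdersGrid.AllowCustomPaging = true;
        OrdersGrid.PageSize = PageSize;
        OrdersGrid.VirtualItemCount = GetTotalRowCount(); // assumed: SELECT COUNT(*) helper

        // NextIndex = StartIndex + PageSize; the pair bounds the identity column.
        int nextIndex = startIndex + PageSize;
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT OrderID, CustomerID, OrderDate FROM Orders " +
            "WHERE OrderID > @start AND OrderID <= @next", conn))
        {
            cmd.Parameters.AddWithValue("@start", startIndex);
            cmd.Parameters.AddWithValue("@next", nextIndex);
            conn.Open();
            // Only the requested page of rows crosses the network.
            OrdersGrid.DataSource = cmd.ExecuteReader(CommandBehavior.CloseConnection);
            OrdersGrid.DataBind();
        }
    }
}
```

Because the WHERE clause filters on the identity column, the database does the skipping; the page itself never materializes rows outside the requested range.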
There are several solutions for tables that do not have a unique incrementing column. For tables that have a clustered index and that do not require special server-side coding, use the subquery solution: a subquery tracks the number of rows to skip from the start, and the TOP keyword, combined with the page size, retrieves the next page of rows from the remaining records. Only the relevant page of records is retrieved over the network. Other solutions use either the table data type or a global temporary table with an additional IDENTITY column to store the queried results; this column is then used to limit the range of rows fetched and displayed. These solutions require server-side coding.
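The subquery approach can be sketched in T-SQL as follows; the table and column names are illustrative, and on older SQL Server versions that do not accept variables with TOP, the values would need to be substituted into dynamic SQL instead.

```sql
-- Subquery paging sketch for a table with a clustered index on OrderID
-- (table and column names are illustrative assumptions).
-- The inner query finds the first @StartRow keys to skip; the outer
-- query takes the next @PageSize rows after them.
SELECT TOP (@PageSize) OrderID, CustomerID, OrderDate
FROM Orders
WHERE OrderID NOT IN
      (SELECT TOP (@StartRow) OrderID
       FROM Orders
       ORDER BY OrderID)
ORDER BY OrderID;
```

The clustered index on the ordering column is what keeps the skip cheap; without it, the subquery forces a scan on every page request.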
Use a DataReader for Fast and Efficient Data Binding
Use a DataReader object if you do not need to cache data, if you are displaying read-only data, and if you need to load data into a control as quickly as possible. The DataReader is the optimal choice for retrieving read-only data in a forward-only manner. Loading the data into a DataSet object and then binding the DataSet to the control moves the data twice, and it also incurs the relatively significant expense of constructing a DataSet.
In addition, when you use the DataReader, you can use the specialized type-specific methods to retrieve the data for better performance.
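A minimal sketch of both points follows; the Products table, its columns, and the connectionString field are assumptions made for the example.

```csharp
// Sketch: forward-only, read-only retrieval with SqlDataReader,
// using type-specific getters (GetInt32/GetString) instead of the
// untyped indexer, which avoids boxing and runtime conversions.
// Table and column names are illustrative.
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(
    "SELECT ProductID, ProductName FROM Products", conn))
{
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        // Look up column ordinals once, outside the loop.
        int idOrdinal = reader.GetOrdinal("ProductID");
        int nameOrdinal = reader.GetOrdinal("ProductName");
        while (reader.Read())
        {
            int id = reader.GetInt32(idOrdinal);
            string name = reader.GetString(nameOrdinal);
            // ... render or process the row ...
        }
    }
}
```

For data binding, the reader can also be assigned directly as a control's DataSource before calling DataBind, which skips the intermediate DataSet entirely.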
Prevent Users from Requesting Too Much Data
Allowing users to request and retrieve more data than they can consume puts an unnecessary strain on your application resources. This unnecessary strain causes increased CPU utilization, increased memory consumption, and decreased response times. This is especially true for clients that have a slow connection speed. From a usability standpoint, most users do not want to see thousands of rows presented as a single unit.
Limit the amount of data that users can retrieve by using one of the following techniques:
· Implement a paging mechanism. For more information, see “How To: Page Records in .NET Applications” in the “How To” section of this guide.
· Design a master/detail form. Instead of giving users all of the information for each piece of data, only display enough information to allow the users to recognize the piece of data they are interested in. Permit the user to select that piece of data and obtain more details.
· Enable users to filter the data.
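As one sketch of the filtering technique, push the filter into a parameterized WHERE clause so that only the rows the user asked for cross the network; the Orders table, CustomerID column, and selectedCustomerId variable are illustrative assumptions.

```csharp
// Sketch: server-side filtering with a parameterized query.
// The parameter limits the result set at the database and also
// protects the query from SQL injection. Names are illustrative.
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(
    "SELECT OrderID, OrderDate FROM Orders " +
    "WHERE CustomerID = @customer", conn))
{
    cmd.Parameters.AddWithValue("@customer", selectedCustomerId);
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        // Only the filtered rows are returned to the page.
        OrdersGrid.DataSource = reader;
        OrdersGrid.DataBind();
    }
}
```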
Consider Caching Data
If you have application-wide data that is fairly static and expensive to retrieve, consider caching the data in the ASP.NET cache.
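A minimal caching sketch follows. The "CountryList" key, the 12-hour absolute expiration, and the LoadCountriesFromDatabase helper are illustrative choices, not prescribed values.

```csharp
// Sketch: cache a fairly static, expensive-to-retrieve lookup table
// in the ASP.NET cache. Key name, expiration, and the data-access
// helper are assumptions made for this example.
DataSet countries = (DataSet)Cache["CountryList"];
if (countries == null)
{
    countries = LoadCountriesFromDatabase(); // assumed data-access helper
    Cache.Insert("CountryList", countries,
                 null,                              // no cache dependency
                 DateTime.UtcNow.AddHours(12),      // absolute expiration
                 System.Web.Caching.Cache.NoSlidingExpiration);
}
// Bind or read from 'countries'; subsequent requests skip the database.
```

Check-then-insert is sufficient here because ASP.NET cache reads and inserts are thread-safe; at worst, two concurrent requests each load the data once and the second insert wins.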