About the Author

Andreas Grabner has been helping companies improve their application performance for 15+ years. He is a regular contributor within Web Performance and DevOps communities and a prolific speaker at user groups and conferences around the world. Reach him at @grabnerandi

How to avoid the Top 5 SharePoint Performance Mistakes

Update Nov 27, 2014: Just posted this YouTube video that shows how to easily identify top SharePoint Performance Problems: SharePoint Performance Analysis in 15 Minutes
SharePoint is without question a fast-growing platform, and Microsoft is making lots of money with it. It has been around for almost a decade and grew from a small list and document management application into an application development platform on top of ASP.NET, using its own API to manage content in the SharePoint Content Database.

Over the years many things have changed – but some haven't. For example: SharePoint still uses a single database table to store ALL items in any SharePoint List. And this brings me straight to the #1 problem I have seen when working with companies that implemented their own solution based on SharePoint.

The following blog post shows my findings, gathered mainly using dynaTrace, which you can also download and use for free in your environment.

#1: Iterating through SPList Items

As a developer I get access to an SPList object – either using it from my current SPContext or creating an SPList object to access a list identified by its name. SPList provides an Items property that returns an SPListItemCollection object. The following code snippet shows one way to display the Title column of the first 100 items in the current SPList object:

SPList activeList = SPContext.Current.List;
for (int i = 0; i < 100 && i < activeList.Items.Count; i++)
{
  SPListItem listItem = activeList.Items[i];
  // e.g., render listItem["Title"]
}

Looks good – right? Although the above code works fine and performs great in a local environment, it is the number one performance problem I've seen in custom SharePoint implementations. The problem is the way the Items property is accessed. The Items property queries ALL items from the Content Database for the current SPList and "unfortunately" does so every time we access it. The retrieved items ARE NOT CACHED. In the loop example we access the Items property twice per iteration – once to retrieve the Count, and once to access the actual item identified by its index. Analyzing the actual ADO.NET database activity of that loop shows us the following interesting result:

200 SQL Statements get executed when iterating through SPList.Items


Problem: The same SQL Statement is executed over and over again, retrieving ALL items from the content database for this list. In my example above I had 200 SQL calls totaling more than 1s in SQL Execution Time.

Solution: The solution for that problem is rather easy but unfortunately still rarely used. Simply store the SPListItemCollection object returned by the Items property in a variable and use it in your loop:

SPListItemCollection items = SPContext.Current.List.Items;
for (int i = 0; i < 100 && i < items.Count; i++)
{
  SPListItem listItem = items[i];
  // e.g., render listItem["Title"]
}

This queries the database only once and we work on an in-memory collection of all retrieved items.

Further readings: The wrong way to iterate through SharePoint SPList Items and Performance Considerations when using the SharePoint Object Model

#2: Requesting too much data from the content database

It is convenient to access data from the Content Database using the SPList object. But – every time we do so we end up requesting ALL items of the list. Look closer at the SQL Statement that is shown in the example above. It starts with SELECT TOP 2147483648 and returns all defined columns in the current SPList.

Most developers I worked with were not aware that there is an easy option to query only the data you really need: the SPQuery object. SPQuery allows you to:

a) limit the number of returned items
b) limit the number of returned columns
c) query specific items using CAML (Collaborative Application Markup Language)

Limit the number of returned items

If I only want to access the first 100 items in a list – or, e.g., page through items in steps of 100 elements (in case I implement data paging in my WebParts) – I can do that using the SPQuery RowLimit and ListItemCollectionPosition properties. Check out Page through SharePoint Lists for a full example:

SPQuery query = new SPQuery();
query.RowLimit = 100; // we want to retrieve 100 items

// prevItems is the SPListItemCollection we retrieved for the previous page
query.ListItemCollectionPosition = prevItems.ListItemCollectionPosition; // start after the previous position
SPListItemCollection items = SPContext.Current.List.GetItems(query);
// now iterate through the items collection

The following screenshot shows us that SharePoint actually takes the RowLimit count and uses it in the SELECT TOP clause to limit the number of rows returned. It also uses the ListItemCollectionPosition in the WHERE clause to only retrieve elements with an ID > previous position.

SPQuery.RowLimit limits the number of records retrieved from the SharePoint Content Database


Limit the number of returned columns

If you only need certain columns from the List, SPQuery.ViewFields can be used to specify which columns to retrieve. By default all columns are queried, which causes extra stress on the database to retrieve the data, requires more network bandwidth to transfer the data from SQL Server to SharePoint, and consumes more memory in your ASP.NET Worker Process. Here is an example of how to use the ViewFields property to retrieve only the ID, Text Field and XYZ columns:

SPQuery query = new SPQuery();
query.ViewFields = "<FieldRef Name='ID'/><FieldRef Name='Text Field'/><FieldRef Name='XYZ'/>";

Looking at the generated SQL makes the difference from the default query mode obvious:

SELECT clause only selects those columns defined in SPView or ViewFields


Query specific elements using CAML

CAML allows you to be very specific about which elements you want to retrieve. The syntax is a bit "bloated" (that is my personal opinion) as it uses XML to define a SQL-WHERE-like clause. Here is an example of such a query:

SPQuery query = new SPQuery();
query.Query = "<Where><Eq><FieldRef Name=\"ID\"/><Value Type=\"Number\">15</Value></Eq></Where>";

As I said it is a bit "bloated" – but hey, it works 🙂

Problem: The main problem I've seen is that developers usually go straight to SPList to retrieve list items, resulting in too much data being retrieved from the Content Database.

Solution: Use the SPQuery object and its features to limit the number of items and columns.
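Putting options a), b) and c) together, a single SPQuery can combine all three. A minimal sketch – the column names "Title" and "Status" are hypothetical placeholders for your own list columns:

```
SPQuery query = new SPQuery();
query.RowLimit = 100;                                                // a) limit the number of returned items
query.ViewFields = "<FieldRef Name='ID'/><FieldRef Name='Title'/>";  // b) limit the returned columns
// c) CAML filter - only items whose (hypothetical) Status column equals "Active"
query.Query = "<Where><Eq><FieldRef Name='Status'/><Value Type='Text'>Active</Value></Eq></Where>";

SPListItemCollection items = SPContext.Current.List.GetItems(query);
foreach (SPListItem item in items)
{
    // only ID and Title were retrieved from the Content Database
}
```

The payoff shows up directly in the generated SQL: the SELECT TOP clause, the column list and the WHERE clause all shrink to exactly what you asked for.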

Further readings: Only request the data you really need and Page through SharePoint lists

#3: Memory Leaks with SPSite and SPWeb

In the very beginning I said "many things have changed – but some haven't". SharePoint still uses COM components for some of its core features – a relic of "the ancient times". While there is nothing wrong with COM per se, there is something wrong with careless memory management of COM objects. SPSite and SPWeb objects are used by developers to gain access to the Content Database. What is not obvious is that these objects have to be explicitly disposed in order for the underlying COM objects to be released from memory once they are no longer needed.
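The rule of thumb from the Best Practices: wrap every SPSite/SPWeb you create yourself in a using statement (or dispose it in a finally block), and never dispose objects SharePoint hands to you. A minimal sketch – the site URL and list name are placeholders:

```
// Objects WE create must be disposed - "using" guarantees Dispose() even on exceptions
using (SPSite site = new SPSite("http://server/sites/mysite"))  // placeholder URL
using (SPWeb web = site.OpenWeb())
{
    SPList list = web.Lists["My List"];  // placeholder list name
    // ... work with the list ...
}   // web and site are disposed here, releasing the underlying COM objects

// Objects SharePoint owns must NOT be disposed:
// e.g., SPContext.Current.Site and SPContext.Current.Web are managed by SharePoint itself
```

Disposing objects you did not create (like SPContext.Current.Web) causes the opposite problem – subsequent requests fail because SharePoint still expects those objects to be alive.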

Problem: SharePoint installations that do not dispose SPSite and SPWeb objects end up with an ASP.NET Worker Process that leaks memory (native and managed) and will eventually be recycled by IIS once we run out of memory. Recycling means losing all current user sessions and paying a performance penalty for those users that hit the worker process again after recycling finishes (the first requests are slow during startup).

Solution: Monitor your memory usage to identify whether you have a memory leak or not. Use a memory profiler to identify which objects are leaking and what is creating them. In case of SPSite and SPWeb you should follow the Best Practices as described on MSDN. Microsoft also provides a tool to identify leaking SPSite and SPWeb objects called SPDisposeCheck.

The following screenshot shows the process of monitoring memory counters, using memory dumps and analyzing memory allocations using dynaTrace:

Identifying leaking SPSite and SPWeb Objects


Further reading: Identifying memory problems introduced by custom code

#4: Index Columns do not necessarily improve performance

During my first SharePoint engagements I discovered several "interesting" implementation details. Having only a single database table to store all List Items makes it a bit tricky to propagate index column definitions down to SQL Server. Why is that? If we look at the AllUserData table in your SharePoint Content Database we see that this table contains columns for all possible columns you can ever have in any SharePoint list. We find, for instance, 64 nvarchars, 16 ints, 12 floats, …

If you define an index on the first text column in your "My SharePoint List 1", another index column on the 2nd number column in your "My SharePoint List 2", and so on, you would end up having database indices defined on pretty much every column in your Content Database. Check out the further reading link to a previous blog of mine – it gives you a good overview of what the AllUserData table looks like.

Problem: Index Columns can speed up access to SharePoint Lists – but – due to the nature of the implementation of Indices in SharePoint we have the following limitations:
a) for every defined index SharePoint stores the index value for every list item in a separate table. Having a list with let’s say 10000 items means that we have 10000 rows in AllUserData and 10000 additional rows in the NameValuePair table (used for indexing)
b) queries only make use of the first index column on a table. Additional index columns are not used to speed up database access

Solution: Really think about your index columns. They definitely help in cases where you do lookups on text columns. Keep in mind the additional overhead of an index and that multiple indices don’t give you additional performance gain.

Further readings: How list column indices really work under the hood and More on column index and their performance impact

#5: SharePoint is not a relational database for high volume transactional processing

This problem should actually be #1 on my list, and here is why: in the last 2 years I've run into several companies that made one big mistake: they thought SharePoint is the most flexible database on earth because they could define Lists on the fly – modifying them as needed without worrying about the underlying database schema or about updating the database access logic after every schema change. Besides this belief, these companies had something else in common: they had to rewrite their SharePoint application, replacing the Content Database with a "regular" relational database in most parts of their applications.

Problem: If you read through the previous 4 problem points it should be obvious why SharePoint is not a relational database and why it should not be used for high-volume transactional processing. Every data element is stored in a single table. Database indices are implemented using a second table that is then joined to the main table. Concurrent access to different lists is a problem because the data comes from the same table.

Solution: Before starting a SharePoint project really think about what data you have – how frequently you need it and how many different users modify it. If you have many data items that change frequently you should really consider using your own relational database. The great thing about SharePoint is that you are not bound to the Content Database. You can practically do whatever you want including access to any external database. Be smart and don’t end up rewriting your application after you’ve invested too much time already.

Further reading: Monitoring individual List usage and performance and How list column indices really work under the hood

Final words

These findings are a summary of the work I did on SharePoint in the last 2 years. I’ve spoken at several conferences and worked with different companies helping them to speed up their SharePoint installations. Follow the links under Further Readings in the individual paragraphs or simply check out all SharePoint-related blog articles.

As SharePoint is built on the .NET Platform you might also be interested in my latest White Papers about Continuous Application Performance for Enterprise .NET Systems



  1. You should warn them about the first mistake… Choosing to use sharepoint at all.

  2. @Paul:well-SharePoint has its strengths-but you have to make sure you use them and avoid the weaknesses 🙂

  3. Blog made by this platform is quite interesting and easy to crawl by search engines.

  4. Excellent article

  5. Guys, how do you want to do that:

    query.ListItemCollectionPosition = 100;

    ListItemCollectionPosition has a different type than int. Paging with SharePoint SPQuery isn't simple to implement 😉

  6. @VItal: Good catch – thanks. I will update the sample. I took it from my previous post that I also linked in the blog – – and modified it to make it easier without double checking it.
    You are correct – you need the ListItemCollectionPosition property from the SPListItemCollection object

  7. On WCMS sites you should work out good caching strategy and then use it. Also I suggest you to implement your own eager fetching strategy for web parts or controls that ask related data from more than one list (parent grid => subgrid => subsubsgrid). If you are able to minimize the number of CAML queries your SQL Server will be very happy 🙂

  8. @Gunnar: Thanks -> we all love HAPPY SERVERS 🙂

  9. Thanks for your share. Sharepoint enables employees to publish, share, search, analyse and manage information all through a browser.

  10. one option to enhance SharePoint’s Performance is by using caching techniques. below are some good reads about caching techniques for SharePoint performance

  11. Great article about SharePoint. I also use it but don't like it very much because the bandwidth on some hosts is really stressed by some queries… and many problems on shared servers.

  12. Great Article

  13. Another option is to move out entities that need to store large amounts of data to SQL and use BCS

    Telerik has some nice controls (silverlight) and other web parts for SP 2010

    – Josh

  14. The code indeed works and performs great in a local environment, I have tried and can confirm this. Thanks also for sharing SPDisposeCheck… great tool
    Nicholas from

  15. Dean Davis says:

    What tools are you using to see the performance of the queries and to produce that graph of SharePoint memory?

  16. Anil kumar says:

    We have a webpart crawling data from different lists and displaying the most recent activities. It's working fine in the local environment, but in production it loads very slowly even though it crawls the same number of lists.
    What is the approach to resolve these kinds of issues?

  17. It's not surprising, really, how so many developers would fall into trap #1. In other circumstances (i.e. outside SharePoint) this code pattern would be quite acceptable. Great work Andreas, these points are all still very relevant.

  18. Hi Andreas,

    I know this is an old post but,

    Regarding point 2, I completely agree about the columns, but even though I use 100 or 10 or 1 as the RowLimit, my normal users (members and visitors) still get a threshold error.
    My list has 10,000 items and I can't do anything about that 🙁

    Any idea why?

    • Hi. What do you mean with “still get a threshold error”? Is that an error you get from SharePoint or from the DB? Or is this an error you get in dynaTrace as part of an Incident?

  19. This is an excellent post. #5 is something I have a difficult time convincing clients and management of.

  20. I wish the vendor that was developing our SharePoint application had listened to me. I had asked them about SharePoint performance for a transactional application, and whether we could design an RDBMS to house the data for the SharePoint application.
    I did not know anything about SharePoint, and the vendor's response was that the content database could handle this.
    It turned out not to be true, and we had to decommission a developed application.

    • Hi. Sorry to hear that. I think SharePoint is a good platform – BUT – it is not a “one size fits all” platform. Unfortunately too many go down the wrong path and believe SharePoint (or other platforms) allows easy and rapid development without knowing the strengths and limitations of these platforms.

  21. Eddie Sandi Mora says:

    Thx for the article. I'd like to add that using the SPList.Items API can still be very expensive. Think about a document library with 50 000 documents – getting all items at once from it could be terrible. Instead, you could use the SPWeb.GetSiteData() API and iterate through the items; this avoids loading all items into memory while still getting all the items, if that is what your code needs to do.


  22. Nice Article.

  23. Hello Andreas! Great article. Look, I ran a query directly on the SharePoint server and, from the info I have been reading, it locked something on the SQL Server, because now I can't edit anything in my DB!! When trying, the system sends the message "An event receiver has cancelled the request". What can I do???

  24. Guillermo says:

    Hello Andreas,
    Good article, thank you.
    Regarding #5, when should we consider it a "high" volume of data processing? Around 5 000 SPListItems? 10 000? xxxxx???

  25. Thanks for the useful information, keep updating with new topics

  26. Thanks for sharing the useful information… keep posting for regular updates.

  27. Sharron Walt says:

    This is an amazing post. #4 is something I have a troublesome time persuading customers and administration of.

  28. Pikku Sharma says:

    Thanks for sharing this information….continue overhauling with new points

  29. Great Article, I’m starting with sharepoint development and need to export a huge list of items with attachments, Thanks Andreas,
