Friday, February 10, 2012

Improving SharePoint’s Performance

The database

The database is a common area where performance hits are most prevalent. SharePoint is database intensive; about 95% of SharePoint's content is stored in a SQL Server database: files, images, videos, pages, content, user profiles, etc. It's important to have a happy and healthy SQL database running.

Things to do or check

  • Make sure databases are running on RAID5 or RAID10 partitions. These two RAID configurations provide the fastest throughput for constant read/write activity. If you're on RAID1, consider moving your databases.
  • Check your drive's free space. Make sure you have several gigabytes available. If free space gets too low, SQL Server will start acting funny, and slower. If it gets to a gigabyte or under, it'll stop working altogether.
  • Check your content database size and growth properties in SQL Server Management Studio. If these are set too low, SQL Server has to resize the files frequently, and that can slow it down.

To check your database properties

  • Connect to the database server using Microsoft SQL Server Management Studio.
  • Right click your database and select Properties.
  • Go to Files on the left.
  • You should now see 2 rows for your database.
  • Select the one with a File Type of Rows Data.
    • This represents your database file, where all the stuff is stored.
    • Note the initial size. If the site is new and doesn't have much on it yet, set the initial size to an estimated file size for the future. If it's an existing content database with a load of data, don't worry about the initial size. By setting this above and beyond, it'll resize the database once now, instead of later.
    • Note the Autogrowth setting. Modify this to be 25%-50%, higher if the site will be loaded up with a lot of data soon. Again, this will resize the database once and cover a lot of data. The default property values require SQL to resize frequently, and therefore impede performance of the server while it's resizing.
  • Now select the File Type Log.
    • This represents your transaction log. As items are written to and from the database file, the transaction log keeps a record of each change; in the event of a database failure, you can technically restore back to the last transaction. This log can grow exceptionally large depending on the traffic on your sites.
    • Following the same rules as above, the initial size should be around 25% of the data file's initial size.
    • Set autogrowth to 25%-50%, again depending on the amount of data that will be loaded.
  • If possible, these two files should be on different physical disks, not just different logical disk partitions.
    • This might be difficult, as most servers have a single RAID container. In larger implementations, if you have additional drives available, separate the data and log files. By moving them onto their own physical disks or disk arrays, the drives can spin independently of each other, improving performance.
  • Next, go to Options on the left.
  • The Collation option should be Latin1_General_CI_AS_KS_WS (SharePoint’s preferred setting).
    • This setting handles how the database treats some finer settings like case sensitivity. If the database was created by SharePoint, you’re fine. If a DBA or someone else made the database first, this might not be set correctly.
  • Click OK.
  • Expand System Databases on the left, and do the same as the above to the tempdb.
    • The tempdb is a temporary location where data is handled while it waits for other processes to finish. The tempdb prefers RAID10 over all.
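The sizing and autogrowth steps above can also be scripted instead of clicked through. A minimal T-SQL sketch, assuming a content database named WSS_Content with the default logical file names (the names and sizes here are placeholders; check yours first with sp_helpfile):

```sql
-- Inspect current file sizes and growth settings
-- (WSS_Content and the logical file names below are assumptions;
-- substitute your own)
USE WSS_Content;
EXEC sp_helpfile;

-- Pre-size the data file and switch to percentage autogrowth
ALTER DATABASE WSS_Content
MODIFY FILE (NAME = WSS_Content, SIZE = 10GB, FILEGROWTH = 25%);

-- Log file: roughly 25% of the data file's initial size
ALTER DATABASE WSS_Content
MODIFY FILE (NAME = WSS_Content_log, SIZE = 2560MB, FILEGROWTH = 25%);
```

Scripting it is handy when you have several content databases to bring in line at once.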

Another consideration for improving database performance is Remote BLOB Storage (RBS); I go into greater detail on RBS in this post.

SharePoint Logs

SharePoint's logs are usually on the drive it was installed on, and in most cases that's the C drive of the server. It's recommended to move these logs to another partition, and preferably to different drives altogether.

  • Go to Central Administration > Monitoring > Configure Diagnostic Logging.
    • Note the top part, Event Throttling. This specifies what SharePoint will log. If any of these have been changed from the default, they'll be in bold. The ideal settings for all of them are:
    • Least critical event to report to the event log: Information
    • Least critical event to report to the trace log: Medium
    • In the next section, Event Log Flood Protection, make sure that’s checked.
    • In the Trace Log section, feel free to move the logs to a different drive, either to free up space on the current drive or to give the logs a dedicated drive. You may limit the number of days or the amount of space the logs use per your preference.
    • Click OK.

Large lists are slow

If you have lists with several thousand records in them, those lists may run slow as you load views. Other lists that use your large lists in a lookup field may also slow down, since they have to process all of the records.

You can enable indexing on your SharePoint fields within your lists to improve lookup speeds and page loads. Go to your list settings; under the column list is a link, Indexed Columns. Select the columns that you will be searching against, creating filtered views with, or using in lookups.
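The same index can be set in code via the server object model. A minimal sketch, assuming a SharePoint server environment; the site URL, list name, and column name are placeholders:

```csharp
// Hypothetical names throughout; requires Microsoft.SharePoint.dll
// and must run on a SharePoint server.
using (SPSite site = new SPSite("http://sharepoint/sites/hr"))
using (SPWeb web = site.OpenWeb())
{
    SPList list = web.Lists["Employees"];
    SPField field = list.Fields["Department"]; // column used in filters/lookups
    field.Indexed = true;  // enable the index on this column
    field.Update();        // commit the change
}
```

This is useful when you need to index the same column across many lists and don't want to click through each list's settings page.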

Keeping it alive

This issue is less of a problem and more just how the technology works. If SharePoint (rather, the services running SharePoint: IIS) runs idle for a set amount of time, usually 20 minutes, the services go to sleep and the cache is cleared. As a result, when a user next accesses SharePoint, the services have to start up and rebuild the cache. This can make the initial load take several seconds, which to an end user can feel like forever. There's a neat little application available which tickles your sites to keep them awake.
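If you'd rather roll your own, the "tickle" can be as simple as requesting each site on a schedule. A minimal sketch (the URLs are placeholders); run it from Task Scheduler more often than the app pool idle timeout:

```csharp
// Minimal keep-alive sketch: request each site so IIS keeps the
// application pools warm and caches built.
using System;
using System.Net;

class KeepAlive
{
    static void Main()
    {
        // Placeholder URLs; list one per web application to warm.
        string[] urls = { "http://sharepoint/", "http://sharepoint/sites/hr" };

        foreach (string url in urls)
        {
            try
            {
                HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
                request.UseDefaultCredentials = true; // authenticate as the running account
                using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
                {
                    Console.WriteLine("{0} -> {1}", url, response.StatusCode);
                }
            }
            catch (WebException ex)
            {
                Console.WriteLine("{0} failed: {1}", url, ex.Message);
            }
        }
    }
}
```

Scheduling it every 10-15 minutes keeps ahead of the default 20-minute idle timeout.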

Funky Code

How many custom features are you using? Does the site seem to slow down when they're in use? This might be hard to determine, as a highly used site might be hitting custom code left and right. If you're a developer, or want to tell your developer something, check these out:

  • Make sure all SPSite and SPWeb objects are disposed of cleanly. There is a free tool which will check your compiled solution for you; download SPDisposeCheck here.
  • If you're adding new items to a list that has thousands of records, don't simply perform SPList.Items.Add(). That call actually loads all your items into memory, then creates a new record. Instead, do something like SPQuery qryEmpty = new SPQuery() { Query = "0" }; SPList list = web.Lists["Name"]; SPListItem newItem = list.GetItems(qryEmpty).Add(); This loads an empty result set first, then gives you a new item. Also, check out the note above under Large lists are slow.
  • Check the status of workflows and ensure they aren't looping. This is a common issue when developing custom workflows. If the workflow updates the item the workflow is running on, it'll trigger again because the item was updated. And around and around you go. Make sure to include some validation in your workflow to confirm you actually want to perform the update, rather than updating unconditionally.
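Putting the dispose and list-add advice above together, a minimal sketch (the site URL and list name are placeholders; requires a SharePoint server environment):

```csharp
// Dispose SPSite/SPWeb deterministically with using blocks, and add an
// item without loading the whole list into memory first.
using (SPSite site = new SPSite("http://sharepoint/")) // placeholder URL
using (SPWeb web = site.OpenWeb())
{
    SPList list = web.Lists["Orders"]; // placeholder list name

    // "0" is not valid CAML, so the query matches nothing and returns
    // an empty item collection instead of every record in the list.
    SPQuery qryEmpty = new SPQuery() { Query = "0" };
    SPListItem newItem = list.GetItems(qryEmpty).Add();
    newItem["Title"] = "New order";
    newItem.Update();
}
```

The using blocks guarantee the SPSite and SPWeb are disposed even if an exception is thrown, which is exactly what SPDisposeCheck looks for.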


