Bill’s 5-min. SharePoint Performance Recommendations

Performance recommendations and guidance are something I receive comments and questions about quite frequently, typically in hallway conversations or in passing at conferences – so in the spirit of those exchanges I’ve decided to compile a quick list of SharePoint performance recommendations that can be conveyed verbally in five minutes or less.


Governance


Do limit the number of site collections per content database.  I’m adamantly opposed to the “airline booking model” and much prefer what I like to refer to as the “accounting model” of database management.  For example, if you know your maximum allowable site collection quota will be 5GB and you would like to keep your content databases at 100GB, you can logically host no more than 20 site collections per database.  While this can result in a large number of content databases, you avoid site collection proliferation.  With this model you should also take growth into account and set aside 5-10% to support schema changes, etc.  *Remember to size your content databases to the desired size as they are created – SharePoint Products and Technologies will provision the content database at x MB and configure growth to occur incrementally in 1MB chunks.  A smaller number of site collections in a content database not only benefits the efficiency of operations, but also limits the exposure of any locking that may occur within that database to a small subset of users as opposed to a large population.
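As a rough sketch of the “accounting model” above – the URL, database name, and counts are hypothetical, and the stsadm parameters assume WSS 3.0/MOSS 2007 – you can cap the number of site collections a new content database will accept at creation time:

REM Add a content database capped at 20 site collections, warning at 18 (values derived from the 100GB / 5GB example above; names are examples only).
stsadm -o addcontentdb -url http://portal -databasename WSS_Content_01 -sitemax 20 -sitewarning 18

Pre-sizing the data and log files themselves is then done on the SQL Server side (for example with ALTER DATABASE … MODIFY FILE) so the database does not grow in 1MB increments from its initial size.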


Caching and Compression


Do consider and encourage site output caching, BLOB caching, and/or HTTP compression where possible – but be sure to closely monitor processor utilization when using HTTP compression.
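For BLOB caching specifically, a minimal sketch of the web.config change on a Microsoft Office SharePoint Server 2007 Web application follows – the location path, extension list, and maxSize value are examples only and should be adjusted to your own disk layout and content:

<!-- In the Web application's web.config; the BlobCache element ships disabled by default. -->
<!-- maxSize is expressed in gigabytes. -->
<BlobCache location="C:\BlobCache" path="\.(gif|jpg|jpeg|png|css|js)$" maxSize="10" enabled="true" />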


Windows Server 2008


Do consider Windows Server 2008; the Next Generation TCP/IP Stack brings a number of benefits that bolster performance, including receive window auto-tuning, Compound TCP, improved routing path detection and recovery, and more.  There is a very informative whitepaper available, “Enhanced Network Performance with Microsoft Windows Vista and Windows Server 2008”.  In regards to Windows Server 2008 and SharePoint Products and Technologies, check back frequently as I observe our own results at Microsoft.
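As a quick way to inspect (and, where needed, adjust) two of the features mentioned above on a Windows Server 2008 Web server, the TCP global settings can be viewed and set from an elevated Command Prompt – treat the set commands as a sketch and validate them against your own network before rolling them out broadly:

REM Display the current Next Generation TCP/IP Stack global settings.
netsh interface tcp show global

REM Receive window auto-tuning (normal is the default on Windows Server 2008).
netsh interface tcp set global autotuninglevel=normal

REM Enable Compound TCP as the congestion provider.
netsh interface tcp set global congestionprovider=ctcp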


64-bit


Do consider 64-bit; the benefits here are too many to detail, but consider larger data processing chunks, a larger address space, etc.


Do consider proactively addressing garbage collection on 64-bit machines.  Custom code and other external factors can litter the address space with unintended assemblies and permanent memory allocations, and while most of this is addressed by managed memory, you may still experience conditions in which there is a Web server response delay until the buffer can be moved in memory.  Depending on the nature of the code, you should consider monitoring the .NET CLR Exceptions, .NET CLR Memory, and .NET CLR Loading performance counters.  You are then faced with a decision: either proactively manage consumption through scheduled recycling, or monitor garbage collection activity dynamically and recycle when it exceeds a pre-defined threshold.  The former implies configuring an arbitrary memory threshold; the latter implies understanding your individual requirements based on Web server configuration (i.e. resident services, hardware, load, etc.) and is the preferred methodology, since you will in effect maximize the available memory while keeping your Web servers responsive and mitigating the availability issues that result from recycling too frequently against an arbitrary threshold.
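If you do elect a memory-threshold-based recycle on IIS 6.0, the application pool’s private memory limit can be set through the metabase – the application pool name and the 1,700,000 KB (~1.7GB) value below are purely illustrative and should be derived from your own baseline, and % Time in GC is one counter worth watching when deciding where that threshold belongs:

REM Recycle the (hypothetical) SharePoint application pool when its private memory exceeds ~1.7GB (value in KB).
cscript C:\Inetpub\AdminScripts\adsutil.vbs SET W3SVC/AppPools/SharePointAppPool/PeriodicRestartPrivateMemory 1700000

REM Sample garbage collection activity of the worker process every 15 seconds.
typeperf "\.NET CLR Memory(w3wp)\% Time in GC" -si 15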


Wide Area Networks


Consider WAN acceleration to manage traffic generated by satellite or remote offices and/or data replication scenarios.


Adjust IIS timeout settings to accommodate large file uploads by remote users on slow links, VPN, etc.
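As one hedged example of the above, the IIS 6.0 connection timeout (120 seconds by default) can be raised through the metabase – the site identifier 1 and the 360-second value are placeholders; large uploads may also require raising the ASP.NET httpRuntime executionTimeout and maxRequestLength values in the Web application’s web.config:

REM Raise the connection timeout for Web site ID 1 to 360 seconds (identifier and value are examples only).
cscript C:\Inetpub\AdminScripts\adsutil.vbs SET W3SVC/1/ConnectionTimeout 360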


Storage Design and Database Architecture


Storage design and database architecture are other potential performance bottlenecks.  Carefully consider database distribution – when clustering, databases should be distributed across two or more instances – and with the underlying storage, provide as many spindles as possible to your data LUNs.


Authentication


Do consider Kerberos authentication where possible; it significantly reduces the number of round trips per page versus NTLM.
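Kerberos requires that a Service Principal Name be registered for the Web application’s host header against the application pool account before the Web application is switched to Negotiate (Kerberos) – the host names and account below are hypothetical:

REM Register SPNs for the Web application URL (FQDN and NetBIOS forms) against its application pool account.
setspn -A HTTP/portal.contoso.com CONTOSO\svcSPAppPool
setspn -A HTTP/portal CONTOSO\svcSPAppPool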


More reading…


About Performance and Capacity Planning (Windows SharePoint Services)


Tools for Performance and Capacity Planning (Windows SharePoint Services)


Estimate Performance and Capacity Planning Requirements for Windows SharePoint Services Collaboration Environments


Posts Tagged “Performance” on this Blog

Just Published – Whitepaper: Database Maintenance for Microsoft SharePoint Products and Technologies

I’m pleased to announce the general availability of my most recent whitepaper, “Database Maintenance for Microsoft SharePoint Products and Technologies”.  This whitepaper describes the recommended maintenance strategies for the databases that host content and configuration settings for SharePoint Products and Technologies.


Read more…

HTTP Compression, Internet Information Services 6.0, and SharePoint Products and Technologies

A recent discussion on garbage collection management on 64-bit Web servers hosting Microsoft Office SharePoint Server 2007 led into a discussion on rendering performance, particularly steps to reduce overall rendering time at the client.  While monitoring client rendering can be achieved to some degree by measuring TTLB (time to last byte) – purely to ensure pages are served in a timely manner (see ASP.NET Performance Monitoring, and When to Alert Administrators for monitoring recommendations) – there are too many variables that can result in overall performance variations, including browser, hardware, machine state, etc.  The most commonly implemented measures to improve client-side performance are the object and output caching available natively within Microsoft Office SharePoint Server 2007; however, HTTP compression in Internet Information Services 6.0 is often overlooked as a performance improvement mechanism.  This article describes the basic steps to enable HTTP compression in Internet Information Services 6.0.

Step 1 Enable Compression (Global)

  1. Open a Command Prompt and run the following script: cscript C:\Inetpub\AdminScripts\adsutil.vbs set w3svc/filters/compression/parameters/HcDoStaticCompression true to enable compression for all Web applications installed on the current Web server (repeat on each Web server in the farm).  (See later in this article for references to enabling compression on sites and site elements.)

Step 2 Specify File Extensions

  1. Open a Command Prompt and run the following script: cscript adsutil.vbs SET W3SVC/Filters/Compression/Deflate/HcFileExtensions "css js" to enable static compression on the file extensions denoted within the quoted string.  CSS (Cascading Style Sheets) and JS (JavaScript) will provide the most significant performance gains with SharePoint Products and Technologies; for example, see the attached image of the compressed files directory after a single page view.  NOTE The size and location of the compression directory for static compression can be configured by modifying the HcCompressionDirectory, HcDoDiskSpaceLimiting, and HcMaxDiskSpaceUsed metabase properties.  Always back up the IIS Metabase before making any changes; for additional information on backing up the IIS Metabase visit http://www.microsoft.com/technet/serviceproviders/wbh4_5/CMSU_DR_Run_CONC_Back_Up_IIS_Metabase.mspx?mfr=true.

[Image: compressed files directory]
The sample image assumes a single render of a Microsoft Office SharePoint Server 2007 Publishing Page

Step 3 Create a Web Service Extension

Create a Web Service Extension referencing the gzip assembly for compression:

  1. Open Internet Information Services 6.0 Manager, expand the <SERVER_NAME> (local computer) node and right-click the Web Service Extensions node.
  2. Select Add a new Web service extension… from the menu.
  3. Specify a descriptive name for your Web Service Extension, for example, Compression, IIS Compression, etc. in the Extension name: field.
  4. Click Add… under Required files and add a reference to C:\WINDOWS\System32\inetsrv\gzip.dll.
  5. Select the checkbox labeled Set extension status to allowed and click OK on the New Web Service Extension window.

Step 4 Configure the IIS MetaBase

  1. Open C:\WINDOWS\system32\inetsrv\MetaBase.xml in a text editor (Notepad).
  2. Locate the <IISCompressionScheme> element.
  3. Find the HcScriptFileExtensions metabase property and add aspx and asmx to the existing list, ensuring the entries conform to the existing format.  Next, modify the HcFileExtensions metabase property to specify any additional static files to be compressed beyond those configured in the previous steps through adsutil.vbs.  The HcScriptFileExtensions metabase property specifies the dynamic files to be compressed when HTTP compression is enabled, whereas the HcFileExtensions metabase property specifies static files.  NOTE Changes should be applied to both the deflate and gzip compression schemes.
  4. Modify the HcDynamicCompressionLevel metabase property to a value between 0 and 10, 0 being the lowest compression and 10 the maximum.  The HcDynamicCompressionLevel property controls the compression level applied to dynamic responses and defaults to 0.

NOTE You should consider testing the impact of varying compression levels in a laboratory environment, closely monitoring CPU utilization and the potential impact to your Web servers.  A compression level between 7 and 9 typically provides the optimum balance of compression gains versus CPU load.  If you are running IIS 7.0, see http://msdn2.microsoft.com/en-us/library/bb386460(vs.85).aspx for a description of IIS 6.0 to IIS 7.0 metabase property mappings.

Sample MetaBase.xml [Snippet]

<IIsCompressionScheme Location="/LM/W3SVC/Filters/Compression/deflate"
        HcCompressionDll="%windir%\system32\inetsrv\gzip.dll"
        HcCreateFlags="0"
        HcDoDynamicCompression="TRUE"
        HcDoOnDemandCompression="TRUE"
        HcDoStaticCompression="FALSE"
        HcDynamicCompressionLevel="9"
        HcFileExtensions="htm
            html
            css
            js"
        HcOnDemandCompLevel="10"
        HcPriority="1"
        HcScriptFileExtensions="asp
            exe"
>
</IIsCompressionScheme>
<IIsCompressionScheme Location="/LM/W3SVC/Filters/Compression/gzip"
        HcCompressionDll="%windir%\system32\inetsrv\gzip.dll"
        HcCreateFlags="1"
        HcDoDynamicCompression="TRUE"
        HcDoOnDemandCompression="TRUE"
        HcDoStaticCompression="TRUE"
        HcDynamicCompressionLevel="9"
        HcFileExtensions="htm
            html
            css
            js"
        HcOnDemandCompLevel="10"
        HcPriority="1"
        HcScriptFileExtensions="asp
            exe
            axd"
>
</IIsCompressionScheme>

In the sample metabase snippet above both deflate and gzip are enabled; it is not recommended to disable one or the other due to the potential for unintended consequences, such as a failure to compress responses for a particular browser (e.g. compatibility issues).  In some cases you may wish to enable compression only at the site or site element level rather than applying it globally.  To disable global compression, open a Command Prompt and run the following script: cscript C:\Inetpub\AdminScripts\adsutil.vbs set w3svc/filters/compression/parameters/HcDoStaticCompression false.  To enable compression on sites or individual site elements see http://www.microsoft.com/technet/prodtechnol/WindowsServer2003/Library/IIS/25d2170b-09c0-45fd-8da4-898cf9a7d568.mspx?mfr=true.
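As a sketch of the site-level approach (the site identifier 1 below is a placeholder for the identifier of your Web application’s IIS Web site), the per-site DoStaticCompression and DoDynamicCompression metabase properties can be set with adsutil.vbs:

REM Enable static and dynamic compression for an individual Web site (identifier is an example only).
cscript C:\Inetpub\AdminScripts\adsutil.vbs SET W3SVC/1/Root/DoStaticCompression true
cscript C:\Inetpub\AdminScripts\adsutil.vbs SET W3SVC/1/Root/DoDynamicCompression true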

Step 5 Restart the World Wide Web Publishing Service

  1. Open a Command Prompt and enter NET STOP W3SVC and allow the World Wide Web Publishing Service to stop.  When the World Wide Web Publishing Service has stopped, enter NET START W3SVC.

To determine whether appreciable gains were provided by enabling HTTP compression, you should baseline server performance both before and after enabling it, using the Processor\% Processor Time and Network Interface\Bytes Sent/sec performance monitor counters.  Generally, where the Processor\% Processor Time value exceeds 80%, HTTP compression is not recommended.
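A minimal sketch of capturing that baseline from a Command Prompt with typeperf follows (the sample interval and count are arbitrary); the same capture can of course be performed through the Performance Monitor UI or a logged counter set:

REM Sample processor utilization and network throughput every 15 seconds, 240 samples (one hour).
typeperf "\Processor(_Total)\% Processor Time" "\Network Interface(*)\Bytes Sent/sec" -si 15 -sc 240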

Regardless of whether or not you elect to leverage HTTP compression, the important takeaway from this conversation is that SharePoint Products and Technologies performance can be improved beyond the concepts native to the platform, such as site output and BLOB caching, on which I am planning separate posts with prescriptive implementation guidance.

Resources

Enabling HTTP Compression (IIS 6.0)

Using HTTP Compression for Faster Downloads (IIS 6.0)

Customizing the File Types IIS Compresses (IIS 6.0)

Troubleshooting HTTP Compression in IIS 6.0

Announcing the Release of the SharePoint Monitoring Toolkit 1.0

The SharePoint Monitoring Toolkit enables administrators and IT Pros to manage SharePoint Products and Technologies deployments, both large and small, by introducing two (2) new management packs for System Center Operations Manager 2007.


Both the Microsoft Office SharePoint Server 2007 and Windows SharePoint Services 3.0 management packs replace the existing management packs and have been engineered to leverage the rich features of System Center Operations Manager 2007.  Improvements include:



  • Additional rules to include performance rules

  • Improved and new reports and views

  • Backward compatibility dependencies have been removed

  • Increased overall reliability

  • Tuning and event suppression reducing redundancy

  • and more…

To download the Solution Accelerator visit:  http://go.microsoft.com/fwlink/?LinkID=103032 or to download the management packs visit:  http://www.microsoft.com/technet/prodtechnol/scp/catalog.aspx.

Managing Master Merge in Microsoft Office SharePoint Server 2007

Master merge compiles all index data, comprised of both in-memory and disk-based structures, into one disk-based structure to prevent degradation of the search service. SPS 2001 allowed administrators to manipulate the master merge schedule through HKEY_LOCAL_MACHINE\Software\Microsoft\Search\1.0\CatalogNames\SharePointPortalServer\workspace_name\Indexer:ci:MidNightMasterMergeTimeDelta, but only to the limited degree of specifying the number of minutes past 12:00 A.M. at which master merge should occur. Microsoft Office SharePoint Server 2007 does not provide an out-of-the-box mechanism for manipulating the master merge schedule; however, options are available to script a scheduled master merge operation.


In Microsoft Office SharePoint Server 2007, content indexes are propagated from the index server to each query server in the Web farm; the full index is propagated during query server initialization and only incremental changes to the index are propagated on an ongoing basis. Depending on your environment's rate of change, you may seek to control propagation of incremental changes to mitigate potential performance issues resulting from the size of your corpus or the available bandwidth between servers.


The attached script can be used to force a master merge if the percentage of documents updated since the preceding master merge is greater than a specified threshold; for example, a setting of 5% provides room to crawl x number of documents without starting a master merge the next day. The out-of-the-box master merge starts at the conclusion of a crawl when the number of documents updated since the previous master merge is greater than 10%.


To use this script, save the attached code as a .vbs file and run it on each query server in your Web farm – it will perform the same operation for all Shared Service Providers serviced by the Web farm. Execute the script as cscript <filename>.vbs 5 (where the argument is the percentage threshold), or optionally call the script as a scheduled task through a batch or command file.

' Forces a master merge for the specified search application when the percentage
' of documents in the shadow indexes exceeds the supplied threshold.
Sub ScheduleMasterMerge( AppName, Pct )
  Dim cDocsInMasterIndex, cDocsInShadowIndexes, PctActual

  Set globalAdmin = WScript.CreateObject("OSearch.GatherMgr.1", "")
  Set application = globalAdmin.GatherApplications(AppName)
  Set project = application.GatherProjects("Portal_Content")
  cDocsInMasterIndex = project.StatusInfo(3)
  cDocsInShadowIndexes = project.StatusInfo(4)

  If 0 <> cDocsInShadowIndexes + cDocsInMasterIndex Then
    ' Percentage of documents not yet compiled into the master index.
    PctActual = 100 * cDocsInShadowIndexes / (cDocsInShadowIndexes + cDocsInMasterIndex)

    If PctActual > Pct Then
      project.ForceMerge(0)
      WScript.Echo "Successfully scheduled Master Merge."
    End If
  Else
    WScript.Echo "No documents in index. Scheduling failed."
  End If

End Sub

Dim RegPath
Dim Pct
Dim Keys
Const HKEY_LOCAL_MACHINE = &H80000002

' Registry key enumerating the search applications (Shared Service Providers) on this server.
RegPath = "Software\Microsoft\Office Server\12.0\Search\Applications"

' Percentage threshold passed as the first command-line argument.
Pct = CLng(WScript.Arguments(0))

Set oReg = GetObject("winmgmts:{impersonationLevel=impersonate}!\\.\root\default:StdRegProv")
oReg.EnumKey HKEY_LOCAL_MACHINE, RegPath, Keys

' Schedule a master merge for each search application found.
For Each subkey In Keys
  Call ScheduleMasterMerge( subkey, Pct )
Next