Friday, 17 April 2015

Tracking Page Splits

As a DBA, I find performance tuning to be a regular, ongoing task. In my experience it is both reactive and proactive, and the goal is for your proactive efforts to minimise the time spent on the reactive. This has led me to focus more on page splitting and index fill factor, and specifically on how the two interact. There is a lot of information on the web about the benefits and dangers of fill factor and the performance implications of high rates of page splitting, so I will not go into that too much. Instead, I will discuss what I've done to monitor page splits, and then tweak the fill factor on the worst offending tables and indexes.

First off, if you want some good information on these concepts read the following articles:
The implementation of my solution involves the following:
  • Extended Events sessions that track page splits, based on the second post above. They track two things:
    • a running total of splits at the database level
    • a running total of splits at the table/index level
  • Two tables to keep a history of page splits:
    • at the database level
    • at the table/index level
  • An SSIS package that:
    • loads the results into a table in my AdminDB
    • resets the counters by stopping the Extended Events sessions
    • cleans up the XML files
  • Two SQL Agent jobs:
    • one to start the sessions
    • one to collect the data and stop the sessions
  • An SSRS report showing page split changes over time
My main idea is to use this in a targeted manner to identify page splitting hotspots. By no means should these Extended Events sessions, particularly the table/index specific one, be run continuously: a lot of data will be collected, especially if your system experiences high page split volumes. See the Tracking Page Splits with Extended Events blog post for more information.

The idea is that two jobs are set up. The first checks for the existence of the objects and creates them if necessary. The second runs the SSIS package to collect the data into the history tables and stop the sessions. They can then be scheduled to collect page split data over time periods you feel comfortable with, say a few hours up to a full day. You can then interrogate the tables between runs and make any index changes (i.e. to fill factor) that may help relieve the page splitting.
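As an illustration of the kind of change that follows a collection run, lowering the fill factor on a heavily splitting index might look like the sketch below. The table and index names are hypothetical; pick a value based on how quickly the index fills between rebuilds.

```sql
-- Hypothetical example: rebuild a heavily splitting index with a lower fill
-- factor, leaving free space on each leaf page to absorb inserts and updates
-- and so reduce mid-page splits.
ALTER INDEX IX_Orders_CustomerID ON dbo.Orders
REBUILD WITH (FILLFACTOR = 90, ONLINE = ON);  -- ONLINE requires Enterprise Edition
```

Bear in mind that a lower fill factor trades read efficiency (more pages to scan) for fewer splits, which is exactly why the history tables are useful for checking the effect of each change.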

The extended event can be created as follows:
  • Database wide page splits

  • Table/Index specific page splits
  • The target of this event is an XML file for two reasons: first, more data (e.g. the database name) is exposed compared to the histogram target; second, the potentially high volume of data may cause memory pressure if the ring buffer or histogram target is used. I've also set the SSIS package to delete the XML files after the data is collected into the tables below, preventing large files from accumulating on disk.
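The downloadable scripts contain the exact definitions; the sketch below shows the general shape. Following the approach in the post referenced above, mid-page ("bad") splits are identified by filtering the sqlserver.transaction_log event on the LOP_DELETE_SPLIT log operation. The session names and file paths here are placeholders.

```sql
-- Sketch only: session names and file paths are placeholders.

-- Database wide page splits
CREATE EVENT SESSION [PageSplits_DB] ON SERVER
ADD EVENT sqlserver.transaction_log (
    WHERE operation = 11  -- LOP_DELETE_SPLIT: logged when a mid-page split occurs
)
ADD TARGET package0.event_file (SET filename = N'D:\XEvents\PageSplits_DB.xel')
WITH (MAX_MEMORY = 4096KB, EVENT_RETENTION_MODE = ALLOW_SINGLE_EVENT_LOSS,
      STARTUP_STATE = OFF);
GO

-- Table/Index specific page splits: the same event also carries alloc_unit_id,
-- which can later be resolved to a table and index.
CREATE EVENT SESSION [PageSplits_Object] ON SERVER
ADD EVENT sqlserver.transaction_log (
    WHERE operation = 11
)
ADD TARGET package0.event_file (SET filename = N'D:\XEvents\PageSplits_Object.xel')
WITH (MAX_MEMORY = 4096KB, EVENT_RETENTION_MODE = ALLOW_SINGLE_EVENT_LOSS,
      STARTUP_STATE = OFF);
GO
```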
The table scripts are:
  • Database wide page splits

  • Table/Index specific page splits
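The actual table scripts are in the download; as a rough sketch, with illustrative column names, the history tables might look like this:

```sql
-- Hypothetical history tables in AdminDB; column names are illustrative only.

-- Database wide page splits
CREATE TABLE dbo.PageSplits_DB_History (
    CaptureDate   datetime2(0) NOT NULL,
    DatabaseName  sysname      NOT NULL,
    SplitCount    bigint       NOT NULL
);

-- Table/Index specific page splits
CREATE TABLE dbo.PageSplits_Index_History (
    CaptureDate   datetime2(0) NOT NULL,
    DatabaseName  sysname      NOT NULL,
    Table_Name    sysname      NOT NULL,
    Index_Name    sysname      NOT NULL,
    SplitCount    bigint       NOT NULL
);
```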

I have created the above tables in my AdminDB. Some of you may have read my earlier posts about index maintenance and the retention of index usage stats. This will come in handy when evaluating the page splits because you'll be maintaining a history, not only of index usage, but of index maintenance.

You can find the SSIS package on my Google Drive. The queries that collect the data from the Extended Event targets are as follows:
  • Database wide page splits
  • Table/Index specific page splits
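The gist of those queries is to read the .xel files with sys.fn_xe_file_target_read_file and shred the event XML. A simplified sketch for the table/index session follows; the file path is hypothetical and the full queries are in the download.

```sql
-- Read the event_file target and shred the event XML (sketch).
;WITH xe AS (
    SELECT CAST(event_data AS xml) AS event_xml
    FROM sys.fn_xe_file_target_read_file(N'D:\XEvents\PageSplits_Object*.xel',
                                         NULL, NULL, NULL)
)
SELECT
    event_xml.value('(event/@timestamp)[1]', 'datetime2')                     AS event_time,
    event_xml.value('(event/data[@name="database_id"]/value)[1]', 'int')      AS database_id,
    event_xml.value('(event/data[@name="alloc_unit_id"]/value)[1]', 'bigint') AS alloc_unit_id
FROM xe;
-- alloc_unit_id is then resolved to a table and index name by joining
-- sys.allocation_units -> sys.partitions -> sys.indexes in the relevant database.
```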
The SSRS report, also downloadable from my Google Drive, displays a graph of page splits by database, table and index over time:


Finally, here are links to the scripts for the two jobs mentioned above:
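At their core, the job steps amount to starting and stopping the Extended Events sessions around the collection window; something along these lines, with placeholder session names:

```sql
-- Job 1 (sketch): start the sessions at the beginning of the collection window.
ALTER EVENT SESSION [PageSplits_DB]     ON SERVER STATE = START;
ALTER EVENT SESSION [PageSplits_Object] ON SERVER STATE = START;

-- Job 2 (sketch): after the SSIS package has loaded the history tables,
-- stop the sessions so the counters reset on the next run.
ALTER EVENT SESSION [PageSplits_DB]     ON SERVER STATE = STOP;
ALTER EVENT SESSION [PageSplits_Object] ON SERVER STATE = STOP;
```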



I hope that this has been helpful.

1 comment:

Anonymous said...

This works great! The only issue I am running into is on some db's we get the failure:

Msg 515, Level 16, State 2, Line 20
Cannot insert the value NULL into column 'Index_Name', table '@table'; column does not allow nulls. INSERT fails.

I am not sure if this has anything to do with partitioning.