Fun with SQL Server Profiler trace files and PowerShell

Running Profiler traces against multiple servers becomes a painful process when it's time to collate and filter all that data. It would be time-consuming, frustrating and messy if Laerte hadn't written this handy PowerShell script (complete with examples) to help you out.

I was starting to get annoyed: I had all my trace files neatly organized into separate folders, but because I’d checked Enable File Rollover when I set up Profiler, each trace was split across multiple rollover files. To make things trickier, there were three shifts of DBAs, and each shift was using different trace settings. I tried to bring the columns from the various trace files together into one single file. As you might have guessed, this was really hard to do!

As the cherry on top of this frustrating puzzle, some of our servers were very busy, so importing the trace files straight into SQL Server tables was ruled out by the overhead it would have generated; I had to handle the import into SQL Server at a later stage. Although it’s a simple procedure to run fn_trace_gettable for one file, when you have multiple files it quickly becomes complicated to generate a uniform and repeatable script… more so if you have to merge all the different trace files.

It was then that I decided to use PowerShell.

When I began to think of a solution, one of the requirements was to have SQL Server tables generated automatically (the $FileToTable parameter). There was no problem in adding this parameter, but I wanted to centralize all the files in a single table and, as I mentioned, these files could come with different events and columns.

This became a problem when I began to see how many lines of code and how many conditions this solution would require. To be fair, I was thinking in procedural terms, as I had previously written my functions in Visual Basic, and I’d forgotten one of the main features of PowerShell: consistency.

Why should I write all this code, if the output of a cmdlet is an object and can be the input of the next cmdlet? With that in mind… let’s check out the code I came up with, as well as a few examples.

First, let’s cover the parameters; the function works with four parameters, one of which (the first) is mandatory.

  • $TraceFileName – A mandatory string specifying the full SQL Server trace file path, e.g. “C:\Temp\Profiler.trc” or “C:\Temp\*.trc”.

    This parameter specifies which trace file will be read. If you want to use more than one file (and, as I said, each may come with different events and columns), use a wildcard so that all *.trc files in a given location are read. Otherwise, just specify the single file.

  • $FileToTable – An optional Boolean flag instructing the function to insert all the data into SQL Server tables, one table per .trc file.

    A little more detail: this parameter specifies whether the data that is read will be sent to tables in SQL Server; the function sets up a separate table for each file, named “POWERSHELLTRACETABLE_” followed by the name of the trace file. We’ll see below that if this parameter is set to true, we also need to pass the name of the server and database for these tables; if we don’t, the default values will be used.

  • $ServerName – The optional server name string. If this is not specified and $FileToTable is set to true, the default server will be used when importing the .trc files.
  • $DatabaseName – The optional database name string. As with $ServerName, if this is not specified and $FileToTable is set to true, TEMPDB will be used when importing the .trc files.

Get-SQLProfiler Script Code

You can download this script from the top of this article or from my SkyDrive. Now, let’s do some tests! I created a module containing just this function and import it from my PowerShell profile, just to make our tests easier. So, to load the module, I add the following line to my PowerShell profile:
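
Something along these lines; the path is just where I happen to keep my modules, so adjust it to suit your own layout:

    # Load the module so that Get-SQLProfiler is available in every session
    Import-Module "C:\PowerShell\Modules\Get-SQLProfiler\Get-SQLProfiler.psm1"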

Reading One Trace File, Without Inserting into a SQL Table

Let’s say we have a SQL Server Profiler trace file at C:\Temp\ProfilerTraces\TSQL.trc which looks something like this:

[Figure 1. SQL Server trace file.]

This file contains the events and columns of the SQL Server Profiler TSQL template, and if we execute:
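
A sketch of the call (Format-Table is only there to keep the console output tidy):

    # Read the single trace file and output one object per trace line
    Get-SQLProfiler -TraceFileName "C:\Temp\ProfilerTraces\TSQL.trc" |
        Format-Table -AutoSize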

…we’ll get this result:

[Figure 2. The sample trace file, processed by the Get-SQLProfiler function.]

Every object the function outputs carries LineNumber and TraceFile properties, the former being (obviously) the line number, which increases incrementally, and the latter being the name of the trace file. The remaining properties match the columns in the trace file; LineNumber and TraceFile are always there, even when I’m merging files with different columns and events.

Let’s display just the LineNumber, TraceFile, EventClass and TextData properties, filtering for line numbers less than or equal to 5:
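
For example, using the standard Where-Object and Select-Object cmdlets:

    # Keep only the first five lines and a handful of properties
    Get-SQLProfiler -TraceFileName "C:\Temp\ProfilerTraces\TSQL.trc" |
        Where-Object { $_.LineNumber -le 5 } |
        Select-Object LineNumber, TraceFile, EventClass, TextData |
        Format-Table -AutoSize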

[Figure 3. The trace file, filtered and displaying just a subset of properties.]

Reading One Trace File and Inserting into a SQL Table

We’d get the same output as in the previous example if the $FileToTable parameter were set to $true, but we would now also have a SQL Server table containing this data:
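
The call would look something like this (I’m passing $true explicitly; the exact form depends on whether the parameter is declared as a Boolean or a switch):

    # Read the trace file and also push the data into a SQL Server table
    Get-SQLProfiler -TraceFileName "C:\Temp\ProfilerTraces\TSQL.trc" -FileToTable $true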

As we’ve not specified the server and database, a warning message is displayed, informing us that the function is using default values:

[Figure 4. The Get-SQLProfiler function displays a warning when it is forced to use default values.]

And the resulting SQL Server table will look something like this:

[Figure 5. The SQL Server table created by Get-SQLProfiler, containing the requested data.]

Reading Multiple Trace Files and Inserting into a SQL Table

Let’s say we have a set of SQL Server Profiler rollover trace files (like the ones I described at the start of this post). If you want to merge your results, simply enter the path where the trace files are stored:
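
For example, assuming the rollover files all live in C:\Temp\ProfilerTraces (the server and database names are just placeholders):

    # The wildcard makes Get-SQLProfiler read every .trc file in the folder
    Get-SQLProfiler -TraceFileName "C:\Temp\ProfilerTraces\*.trc" -FileToTable $true `
        -ServerName "MyServer" -DatabaseName "TraceDB"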

The figure below shows the point at which the file being read changes:

[Figure 6. Reading multiple .trc files using Get-SQLProfiler.]

As I mentioned, Get-SQLProfiler will create one SQL table for each trace file. However, if you want all the trace files placed in a single table, save the result of the function to an XML (or txt) file, which you can then import with Invoke-Sqlcmd. I have an example demonstrating this below.

Reading Multiple Trace Files With Different Events and Columns

We can also use Get-SQLProfiler to merge two trace files with different columns. In the example here, I merge the files TSQL_SP.trc and TSQL_Locks.trc.
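
Assuming both files sit in the same folder, a wildcard that matches just those two traces is enough to merge them:

    # TSQL_SP.trc and TSQL_Locks.trc contain different events and columns,
    # but they still come out as a single stream of objects
    Get-SQLProfiler -TraceFileName "C:\Temp\ProfilerTraces\TSQL_*.trc" |
        Format-Table -AutoSize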

[Figure 7. Merging files with different columns using Get-SQLProfiler.]

Aha, I can hear you thinking:

“Well, that’s all well and good Laerte, but if I merge multiple files with different setups, how can I now find the available properties in the Get-SQLProfiler output?”

Simple: apply a filter for a single row, such as LineNumber = 1, because LineNumber and TraceFile will always be present in the Get-SQLProfiler output:
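
For example, with Format-List * showing every property that those first-line objects carry:

    # One row per trace file is enough to see which properties are available
    Get-SQLProfiler -TraceFileName "C:\Temp\ProfilerTraces\TSQL_*.trc" |
        Where-Object { $_.LineNumber -eq 1 } |
        Format-List *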

[Figure 8. Determining which properties are represented in the merged data.]

Now you just need to look at the complete list, choose the fields you’re interested in, and pass them to Select-Object in the pipeline. Remember that when merging multiple files with the $FileToTable option set to $true, Get-SQLProfiler creates several tables, one for each file.

“But Laerte, I wanted to centralize everything into one table.”

I confess that I wanted to do this through a parameter when I first started writing this solution, and was intimidated by how much code I thought it would require. Yet, as I said, PowerShell can resolve this situation with just two more lines of code, and without having to bloat my function. First I create a table with an XML column; for my tests, I’ll put it in tempdb:
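
A minimal sketch of that table; the table and column names (PowerShellTraces and TraceData) are my own choices for this example:

    # Create a table with a single XML column to hold the merged trace data
    $createTable = "
    CREATE TABLE dbo.PowerShellTraces
    (
        TraceId   INT IDENTITY(1,1) PRIMARY KEY,
        TraceData XML NOT NULL
    );
    "
    Invoke-Sqlcmd -ServerInstance "MyServer" -Database "tempdb" -Query $createTable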

Now let’s get our PowerShell to work; we just save our merged output to XML (and we can still choose which properties we want to focus on), and then we can import the resulting file into SQL Server:
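
A sketch of the whole round trip; I keep the XML document in memory here rather than writing a .xml file to disk first, but the idea is exactly the same (server and database names are placeholders):

    # Merge the trace files, keep just the properties we want, and turn
    # the lot into a single XML document (ConvertTo-Xml returns it as a string)
    $xmlText = Get-SQLProfiler -TraceFileName "C:\Temp\ProfilerTraces\*.trc" |
        Select-Object LineNumber, TraceFile, EventClass, TextData |
        ConvertTo-Xml -As String -NoTypeInformation

    # SQL Server rejects an nvarchar value whose XML declaration says utf-8,
    # so drop the declaration; then double up single quotes for the T-SQL literal
    $xmlText = ($xmlText -replace '^<\?xml.*?\?>\s*', '') -replace "'", "''"

    # Insert the document into the XML column created above
    Invoke-Sqlcmd -ServerInstance "MyServer" -Database "tempdb" `
        -Query "INSERT INTO dbo.PowerShellTraces (TraceData) VALUES (N'$xmlText');"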

Now, with a simple XQuery, we can list all of the data held in the XML:
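
Assuming the XML was produced by ConvertTo-Xml as in the previous step (so each trace line is an Object element containing Property children), a generic query like this lists every property of every line, whichever columns each original file had:

    # Shred the stored XML back into rows, one row per property of each trace line
    $query = "
    SELECT  Prop.value('@Name', 'varchar(128)') AS PropertyName,
            Prop.value('.',     'varchar(max)') AS PropertyValue
    FROM    dbo.PowerShellTraces
    CROSS APPLY TraceData.nodes('/Objects/Object/Property') AS T(Prop);
    "
    Invoke-Sqlcmd -ServerInstance "MyServer" -Database "tempdb" -Query $query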

[Figure 9. Querying the merged data of multiple trace files.]

The beauty of this solution is that I don’t need to add any more parameters to handle this, as the consistency of PowerShell does it all for me; I just need to pipe and save. Now we can read our trace files, save them to XML, filter them by any property, and then merge and import them into a SQL Server table… and do so effortlessly.

However, being who I am, I’ve kept tinkering with the module, and I’ll briefly walk through some of the final adjustments I’ve made for the (at the time of writing) latest version, which you can download at the top of this article.

The first change I made was to remove the parameters that told Get-SQLProfiler to insert results into a SQL Server table, as I wanted to see whether it was more useful to insert the data into one common table only after applying filters (or manipulating the data any way you want). Remember, if you want a separate table for each .trc file, just use the fn_trace_gettable T-SQL function.

So, now that I was filtering and manipulating my data before moving it into a table, I created another function, Save-InfoToSQLTable, to actually take care of the insert. With this latest version of the module, you can filter, sort, and do essentially whatever you want to your data, and then insert your curated dataset into a SQL Server table.

Let’s see some examples of how this works:

Read All Traces found in C:\Profiler\Compras\ and filter for dbo.fc_pco_round_000 in TextData:
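
Something like this; I’m using -like with wildcards so the match is a plain substring test:

    # Read every trace in the folder and keep only the lines that reference the function
    Get-SQLProfiler -TraceFileName "C:\Profiler\Compras\*.trc" |
        Where-Object { $_.TextData -like "*dbo.fc_pco_round_000*" }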

Create the SQL Server table and save the filtered data into it (Table Name created by the function):
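
A sketch of how that call might look; the Save-InfoToSQLTable parameter names are my assumptions, so check Get-Help Save-InfoToSQLTable in the downloaded module for the exact syntax:

    # Pipe the filtered trace lines straight into a table;
    # with no table name supplied, the function picks the name itself
    Get-SQLProfiler -TraceFileName "C:\Profiler\Compras\*.trc" |
        Where-Object { $_.TextData -like "*dbo.fc_pco_round_000*" } |
        Save-InfoToSQLTable -ServerName "MyServer" -DatabaseName "TraceDB"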

Create the SQL Server table and save the filtered data into it (Table Name specified via a parameter):
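
The same pipeline, this time naming the table ourselves (again, the parameter name is an assumption):

    Get-SQLProfiler -TraceFileName "C:\Profiler\Compras\*.trc" |
        Where-Object { $_.TextData -like "*dbo.fc_pco_round_000*" } |
        Save-InfoToSQLTable -ServerName "MyServer" -DatabaseName "TraceDB" -TableName "ComprasTrace"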

Saving the data into a SQL Server table which has already been created:
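
Along the same lines, pointing the function at a table that already exists; the switch name below is purely hypothetical:

    Get-SQLProfiler -TraceFileName "C:\Profiler\Compras\*.trc" |
        Where-Object { $_.TextData -like "*dbo.fc_pco_round_000*" } |
        Save-InfoToSQLTable -ServerName "MyServer" -DatabaseName "TraceDB" `
            -TableName "ComprasTrace" -UseExistingTable   # hypothetical switch name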

Saving the data into a SQL Server table which has already been created with a UserName and Password:
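
And finally with SQL authentication; -UserName and -Password are, once more, my guesses at the parameter names:

    Get-SQLProfiler -TraceFileName "C:\Profiler\Compras\*.trc" |
        Where-Object { $_.TextData -like "*dbo.fc_pco_round_000*" } |
        Save-InfoToSQLTable -ServerName "MyServer" -DatabaseName "TraceDB" `
            -TableName "ComprasTrace" -UserName "TraceUser" -Password "P@ssw0rd"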

Before we finish, there are some considerations which you should bear in mind:

  • This module only works with the 32-bit (x86) version of PowerShell.
  • If you’re using large files, I suggest you use T-SQL and the fn_trace_gettable function instead. In my tests, T-SQL was superior in terms of upload speed.

I see the use case for this module as being a situation where you have small files (e.g. during a rollout) and want to filter on various conditions, as in the examples above. Once you’ve filtered the data, you can save it for easy access at a later date.

Thanks to my friend Shay Levy who, as always, took the time to share his immense knowledge; to Jeffrey Hicks, who offered a lot of tips on mandatory parameters in functions; to my SQL Server Day buddy Felipe Ferreira for his XQuery advice; and, of course, to Chad Miller, the master of PowerShell with SQL Server.