{"id":2106,"date":"2015-11-03T00:00:00","date_gmt":"2015-11-03T00:00:00","guid":{"rendered":"https:\/\/test.simple-talk.com\/uncategorized\/routine-sql-dml-testing-for-the-unenthusiastic-tester\/"},"modified":"2021-05-11T15:57:20","modified_gmt":"2021-05-11T15:57:20","slug":"routine-sql-dml-testing-for-the-unenthusiastic-tester","status":"publish","type":"post","link":"https:\/\/www.red-gate.com\/simple-talk\/databases\/sql-server\/tools-sql-server\/routine-sql-dml-testing-for-the-unenthusiastic-tester\/","title":{"rendered":"Routine SQL DML Testing for the Unenthusiastic Tester"},"content":{"rendered":"<div class=\"spaced-bottom padded--tight scheme--lightest-grey\">\n<img decoding=\"async\" class=\"media__image\" style=\"float: left; vertical-align: middle;\" src=\"https:\/\/assets.red-gate.com\/external\/SimpleTalk\/database-devops-cog-92.png\" \/> <\/p>\n<p class=\"padded-top--tight\"><strong>DevOps, Continuous Delivery & Database Lifecycle Management<\/strong><br \/>Continuous Integration<\/p>\n<\/div>\n\n<p>I find testing to be a very tedious business: All developers do. Testing SQL databases is something you like doing if you find the process of testing applications too exciting. The answer for the rest of us is, of course, to mechanise as much of the test process as possible.<\/p>\n<p>Here is a simple form of SQL DML test for SQL Select statements. I think of it as a type of integration test. Where you are doing Continuous Integration, it is a good way of ensuring that nothing serious is broken. By &#8216;serious&#8217;, I&#8217;d include poor performance as well as the wrong result.<\/p>\n<p>The idea is that you just develop, in your favourite IDE such as SSMS or VS, a batch of SQL queries or execute statements. You check them \u00a0as necessary to make sure they&#8217;re right. You then save the file in a directory. You do as many tests as you like, and use as many subdirectories as you like.<\/p>\n<p>Here is an imaginary test. 
Don&#8217;t worry about the SQL Statements, they are just there to demonstrate<\/p>\n<pre class=\"theme:ssms2012 lang:tsql\">\t--the first query\r\n\tPRINT '(first)'\r\n\tPRINT '-SaveResult'\r\n\tSELECT\u00a0 p1.ProductModelID\r\n\tFROM\u00a0\u00a0\u00a0 Production.Product AS p1\r\n\tGROUP BY p1.ProductModelID\r\n\tHAVING\u00a0 max(p1.ListPrice) &gt;= ALL \r\n\t\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0  ( SELECT avg(p2.ListPrice)\r\n\t\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 FROM\u00a0\u00a0 Production.Product AS p2\r\n\t\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 WHERE\u00a0 p1.ProductModelID = p2.ProductModelID );\r\n\t\u00a0\r\n\t--the second query\r\n\tPRINT '(Second)'\r\n\tSELECT\u00a0 sum(s.TotalDue), count(*), right(convert(CHAR(11), s.OrderDate, 113), 8), TerritoryID\r\n\tFROM\u00a0\u00a0\u00a0 Sales.SalesOrderHeader AS s\r\n\tGROUP BY right(convert(CHAR(11), s.OrderDate, 113), 8), TerritoryID\r\n\t\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 WITH ROLLUP\r\n\tORDER BY min(s.OrderDate)\r\n\t\u00a0\r\n\t\u00a0 \r\n\t--and the third query \r\n\tPRINT '(final)'\r\n\tPRINT '-noMoreResult'\r\n\tSELECT\u00a0 p.FirstName + ' ' + p.LastName AS customer, SalesOrderNumber AS CurrentOrder, \r\n\t\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 first_value(SalesOrderNumber) OVER ( PARTITION BY s.CustomerID ORDER BY SalesOrderNumber\r\n\t\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 ROWS BETWEEN UNBOUNDED PRECEDING\r\n\t\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 AND CURRENT ROW ) AS FirstOrder, \r\n\t\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 last_value(SalesOrderNumber) \r\n\t\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 OVER ( PARTITION BY s.CustomerID ORDER BY 
SalesOrderNumber\r\n\t\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 ROWS BETWEEN CURRENT ROW\r\n\t\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 AND UNBOUNDED FOLLOWING ) AS LastOrder, \r\n\t\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 s.TotalDue, convert(CHAR(11), s.OrderDate, 113) AS DateOrdered\r\n\tFROM\u00a0\u00a0\u00a0 Sales.SalesOrderHeader s\r\n\t\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 INNER JOIN Sales.Customer AS c ON s.CustomerID = c.CustomerID\r\n\t\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 INNER JOIN Person.Person AS p ON c.PersonID = p.BusinessEntityID\r\n\tWHERE\u00a0\u00a0 orderDate &gt; dateadd(MONTH, -1, ( SELECT max (orderDate) FROM Sales.SalesOrderHeader ))\r\n\tORDER BY OrderDate\r\n\t<\/pre>\n<p>You&#8217;ll notice some print statements here. They&#8217;re only necessary if you want to do something different to the default behaviour. When you bang the button to run the test, every query in a batch will have its own timings, results, I\/O information and execution plans, so when they&#8217;re saved to disk, they need a unique name.<\/p>\n<p>That <code>PRINT '(final)'<\/code> not only gives the query a name but also makes it easier to pick out the messages that belong to each query. The test harness saves the result to disk, using that name both as the filename and within the XML.<\/p>\n<p>Sometimes, you&#8217;ll have a query with an enormous result that you don&#8217;t want saved to disk. That <code>PRINT '-noMoreResult'<\/code> deals with that: as the name suggests, no further results are saved from then on in the batch, unless you then do <code>PRINT '-SaveResult'<\/code>. Yes, there are ways of being more controlling if you need to be.<\/p>\n<p>A lot of the time, you will want to make sure that the result that is returned is correct. This is why the result is automatically saved as an XML file. 
If you leave a file with the correct result in the directory, then the PowerShell script obligingly compares the result with that and produces a report that tells you what the differences are.<\/p>\n<p>That&#8217;s it, basically.<\/p>\n<h1>Why do it?<\/h1>\n<p>Throughout development, you must continuously check that certain key queries produce the correct data, and do so quickly. They should perform well not only on the test or QA server but also under realistic loads, and they must use effective execution plans.<\/p>\n<p>This means testing a number of relevant SQL expressions or batches against perhaps several databases, normally replicas of the same database at the same version, but stocked with different amounts of data. I do it to see whether the result is what it should be, how long it took, and what CPU and I\/O resources were used. If a problem crops up, I will want the query plan that was used. These SQL queries could be simple or complex, and they are intended to check out the vital functions to make sure you never pass on a release that is severely broken. I do it on every build, so it has to be able to work unattended. The benefit is that if something breaks, I can be pretty sure of what did it: it is what I&#8217;ve done in the day before the test. Because you have the execution plans, you can email an expert for help as well.<\/p>\n<p>There are plenty of test frameworks that will do this sort of test, but nothing I&#8217;ve come across does it all in a way that is unobtrusive and easy to use. Above all, I want it to be easy to change the SQL. (Actually, I like changing the PowerShell too, but that is another story.)<\/p>\n<h1>What does it do?<\/h1>\n<p>With PowerShell, you get some timing metrics for free: the Measure-Command cmdlet gives you the end-to-end time for a whole batch, but this isn&#8217;t sufficient. 
We also need to get a rough idea of the server CPU usage and timing, the query execution plan, and maybe even a more accurate measure of the time taken to execute the command on the server. Basically, you don&#8217;t want to be bothered with the results of all this unless something goes wrong. You will also want to specify a number of server instances\/databases to run the test against.<\/p>\n<p>For me, the most important thing is to be able to add a number of queries to a file and have them executed against a particular database and server.<\/p>\n<h1>How does it do it?<\/h1>\n<p>Here is a simple test harness to show the guts of the script. The actual script does rather more, but this sample should work if you fill in the server, database credentials and the path to the test. Here, I leave out the distracting routine stuff of getting each database and each batch, and just show the important parts. As you can see from the script, it is a simple task that executes the batch and checks the results, monitoring the messages that come back for errors and for the special statements that switch certain features on or off. For comparing results, I use the .NET library for comparing XML files. You will need to fill in the details (look for <code>'These below need to be filled in!'<\/code>) before it will work.<\/p>\n<pre class=\"theme:vs2012 lang:ps decode:true \">#make sure no errors slip through\r\nset-psdebug -strict; $ErrorActionPreference = \"stop\"\r\n\r\ntry { Add-Type -path \"${env:ProgramFiles(x86)}\\XmlDiffPatch\\Bin\\xmldiffpatch.dll\" }\r\n #load xmldiffpatch to test results\r\ncatch #oops, he hasn't installed it yet\r\n{\r\n  write-warning @'\r\nThis routine currently compares results to make sure that the results\r\nare what they should be. It uses XMLDiff, a .NET tool. It can be downloaded\r\nfrom here. \r\nIt only does so if you leave a file with the CorrectResult suffix in\r\nthe filename. If you don't want this facility, remove it! 
\r\n'@;\r\n  exit;\r\n}\r\n\r\n$xmlDiff = New-Object Microsoft.XmlDiffPatch.XmlDiff;\r\n# Create the XmlDiff object\r\n$xmlDiff = New-Object Microsoft.XmlDiffPatch.XmlDiff(\r\n  [Microsoft.XmlDiffPatch.XmlDiffOptions]::IgnoreChildOrder);\r\n#poised to test results against what they should be\r\n#here is the SQL batch for testing. It a real routine this would be pulled off disk\r\n# but we need to keep this test simple\r\n$SQL =@\"\r\n--the first query\r\nprint '(first)'\r\nprint '-SaveResult'\r\nSELECT p1.ProductModelID\r\n   FROM Production.Product AS p1\r\n   GROUP BY p1.ProductModelID\r\n   HAVING MAX(p1.ListPrice) &gt;= ALL\r\n    (SELECT AVG(p2.ListPrice)\r\n   FROM Production.Product AS p2\r\n   WHERE p1.ProductModelID = p2.ProductModelID);\r\n\r\n--the second query\r\nprint '(Second)'\r\nSELECT sum(s.TotalDue), count(*),\r\n right(convert(CHAR(11),s.OrderDate,113),8),TerritoryID\r\nFROM Sales.SalesOrderHeader AS s\r\n GROUP BY right(convert(CHAR(11),s.OrderDate,113),8), TerritoryID WITH ROLLUP\r\n ORDER BY min(s.OrderDate)\r\n\r\n \r\n--and the third query \r\nprint '(final)'\r\nprint '-noMoreResult'\r\nSELECT\r\n  p.FirstName+ ' '+p.LastName AS customer,\r\n  SalesOrderNumber AS CurrentOrder,\r\n  FIRST_VALUE(SalesOrderNumber)\r\n    OVER(PARTITION BY s.CustomerID\r\n      ORDER BY SalesOrderNumber\r\n      ROWS BETWEEN UNBOUNDED PRECEDING\r\n      AND CURRENT ROW) AS FirstOrder,\r\n  LAST_VALUE(SalesOrderNumber)\r\n    OVER(PARTITION BY s.CustomerID\r\n      ORDER BY SalesOrderNumber\r\n      ROWS BETWEEN CURRENT ROW\r\n      AND UNBOUNDED FOLLOWING) AS LastOrder,\r\n  s.TotalDue,\r\n  convert(CHAR(11),s.OrderDate,113) AS DateOrdered\r\nFROM Sales.SalesOrderHeader s\r\n  INNER JOIN Sales.Customer AS c\r\n    ON s.CustomerID = c.CustomerID\r\n  INNER JOIN Person.Person  AS p\r\n    ON c.PersonID = p.BusinessEntityID \r\nWHERE orderDate &gt; DateAdd(MONTH,-1,(SELECT max(orderDate) FROM Sales.SalesOrderHeader)) \r\nORDER BY 
OrderDate\r\n\r\n\"@\r\n\r\n$ErrorActionPreference = \"Stop\" # nothing can be retrieved\r\n#--------------------These below need to be filled in! -----------------------------\r\n$pathToTest = \"$env:USERPROFILE\\MyPath\\\"\r\n$databasename = \"AdventureWorks\" #the database we want\r\n$serverName = 'MyServer' #the name of the server\r\n$credentials = 'integrated Security=true' # fill this in before you run it!\r\n# if SQL Server credentials, use 'user id=\"MyID;password=MyPassword\"'\r\n#--------------------These above need to be filled in! -----------------------------\r\n\r\n#now we declare our globals.\r\n$connectionString = \"Server=$serverName;DataBase=$databasename;$credentials;\r\n\tpooling=False;multipleactiveresultsets=False;packet size=4096\";\r\n#connect to the server\r\n$message = [string]''; # for messages (e.g. print statements)\r\n$Name = '';\r\n$LastMessage = '';\r\n$SQLError = '';\r\n$previousName = '';\r\n$SavingResult = $true;\r\n$ThereWasASQLError = $false;\r\ntry #to make the connection\r\n{\r\n  $conn = new-Object System.Data.SqlClient.SqlConnection($connectionString)\r\n  $conn.Open()\r\n}\r\ncatch #can't make that connection\r\n{\r\n  write-warning @\" \r\nSorry, but I can't reach $databasename on the server instance $serverName. \r\nMaybe it is spelled wrong, credentials are wrong or the VPN link is broken.\r\nI can't therefore run the test.\r\n\"@;\r\n  exit\r\n}\r\n# This is the beating heart of the routine. It is called on receipt of every\r\n# message or error\r\n$conn.add_InfoMessage({#this is called on every print statement or message \r\n    param ($sender, #The source of the event\r\n      $event) #the errors, message and source\r\n    if ($event.Errors.count -gt 0) #there may be an error\r\n    {\r\n      $global:SQLError = \"$($event.Errors)\"; #remember the errors\r\n      $global:ThereWasASQLError = ($global:SQLError -cmatch '(?im)\\.SqlError: *\\w')\r\n      #you may think that if there is an error in the array... 
but no there are false alarms\r\n    };\r\n    $global:LastMessage = $event.Message; #save the message\r\n    $global:message = \"$($message)`n $($global:LastMessage)\";#just add it\r\n    switch -regex ($global:LastMessage) #check print statements for a switch\r\n    {\r\n      '(?im)\\((.{2,25})\\)' #was it the name of the query?\r\n      {\r\n        $global:Name = $matches[1] -replace '[\\n\\\\\\\/\\:\\.]', '-'; #get the name in the brackets\r\n        $null &gt; \"$pathToTest$($name).io\"; #and clear out the io record for the query\r\n        break;\r\n      }\r\n      '(?im)-NoMoreResult' { $global:SavingResult = $false; break; } #prevent saving of result\r\n      '(?im)-SaveResult' { $global:SavingResult = $true; break; } #switch on saving of result\r\n      default\r\n      { #if we have some other message, then record the messge to a file\r\n        if ($name -ne '') { \"$($event.Message)\" &gt;&gt; \"$pathToTest$($name).io\"; }\r\n      }\r\n    }\r\n  }\r\n  );\r\n  $conn.FireInfoMessageEventOnUserErrors = $true; #collect even the errors as messages.\r\n  #now we do the server settings to get IO and CPU from the server. \r\n  # We do them as separate batches just to play nice\r\n  @('Set statistics io on;Set statistics time on;', 'SET statistics XML ON;') |\r\n  %{ $Result = (new-Object System.Data.SqlClient.SqlCommand($_, $conn)).ExecuteNonQuery(); }\r\n  #and we execute everything at once, recording how long it all took\r\n  try #executing the sql\r\n  {\r\n    $timeTaken = measure-command { #measure the end-to-end time\r\n      $rdr = (new-Object System.Data.SqlClient.SqlCommand($SQL, $conn)).ExecuteReader();\r\n    }\r\n  }\r\n  catch\r\n  {\r\n    write-warning @\" \r\nSorry, but there was an error with executing the batch against $databasename\r\non the server instance $serverName. 
\r\nI can't therefore run the test.\r\n\"@;\r\n    exit;\r\n  }\r\n  if ($ThereWasASQLError -eq $true)\r\n  {\r\n    write-warning @\" \r\nSorry, but there was an error '$SQLError' with executing the batch against $databasename\r\non the server instance $serverName. \r\nI can't therefore run the test.\r\n\"@;\r\n  }\r\n  if (-not(test-path \"$pathToTest\")) \r\n      {new-item \"$pathToTest\" -type directory}\r\n  #now we save each query, along with the query plans\r\n  do #a loop\r\n  {\r\n    if ($name -eq $previousName) #if we have no name then generate one that's legal\r\n    {\r\n      $Name = ([System.IO.Path]::GetRandomFileName() -split '\\.')[0]\r\n    }#why would we want the file-type?\r\n    #the first result will be the data so save it\r\n    $datatable = new-object System.Data.DataTable\r\n    $datatable.TableName = $name\r\n    $datatable.Load($rdr)#pop it in a datatable\r\n    if ($SavingResult) { $datatable.WriteXml(\"$pathToTest$($name).xml\"); }\r\n    #and write it out as XML so we can compare it easily\r\n    else #if we aren't saving the result delete any previous tests\r\n    {\r\n      If (Test-Path \"$pathToTest$($name).xml\")\r\n      {\r\n        Remove-Item \"$pathToTest$($name).xml\"\r\n      }\r\n    }\r\n    $datatable.close; #and close the datatable\r\n    if ($rdr.GetName(0) -like '*showplan')#ah we have a showplan!!\r\n    {\r\n      while ($rdr.Read())#so read it all out quickly in one gulp\r\n      {\r\n        [system.io.file]::WriteAllText(\"$pathToTest$($name).sqlplan\", $rdr.GetString(0));\r\n      }\r\n    }\r\n    $previousName = $name #and remember the name to avoid duplicates\r\n    #now we wonder if the DBA has left an XML file with the correct result?   
\r\n    if (test-path \"$pathToTest$($name)CorrectResult.xml\")\r\n    { #there is a correct result to compare with!\r\n      $CorrectResult = [xml][IO.File]::ReadAllText(\"$pathToTest$($name)CorrectResult.xml\")\r\n      $TestResult = [xml][IO.File]::ReadAllText(\"$pathToTest$($name).xml\")\r\n      if (-not $xmlDiff.Compare($CorrectResult, $TestResult))#if there were differences....\r\n      { #do the difference report\r\n        $XmlWriter = New-Object System.Xml.XmlTextWriter(\"$pathToTest$($name).differences\", $Null)\r\n        $xmlDiff.Compare($CorrectResult, $TestResult, $XmlWriter)\r\n        $xmlWriter.Close();\r\n        $message = \"$message`nDifferences found in the result of query '$name'\"\r\n      }\r\n      else #remove any difference reports with the same name\r\n      {\r\n        If (Test-Path \"$pathToTest$($name).differences\")\r\n        {\r\n          Remove-Item \"$pathToTest$($name).differences\"\r\n        }\r\n      }\r\n    }\r\n  }\r\n  while ($rdr.NextResult())# and get the next if there is one\r\n  $rdr.Close()\r\n  #now save all the messages for the batch including the errors.\r\n  $message &gt; \"$($pathToTest)all.messages\"\r\n  #and add the end-to-end timing.\r\n  \"End-To-End time was $($timeTaken.TotalMilliseconds) Ms\" &gt;&gt; \"$($pathToTest)all.messages\"<\/pre>\n<p>And an inspection of the directory shows that the files are there.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.red-gate.com\/simple-talk\/wp-content\/uploads\/imported\/2302-clip_image001.png\" alt=\"2302-clip_image001.png\" width=\"485\" height=\"195\" \/><\/p>\n<p>The execution plans can be clicked on and inspected if you have SQL Server Management Studio or Visual Studio installed. 
If not, you can drag the files and drop them on <a href=\"http:\/\/sqltuneup.sqlservercentral.com\/\">SQL Server Central&#8217;s SQL Tuneup page<\/a>.<\/p>\n<p>The .io files and all.messages files are text files, but this is a script, so you can name them whatever suits you.<\/p>\n<p>The &#8216;final.io&#8217; file has the following information:<\/p>\n<pre>\tSQL Server Execution Times:\r\n\t\u00a0\u00a0 CPU time = 0 ms,\u00a0 elapsed time = 0 ms.\r\n\t\u00a0\r\n\t\u00a0SQL Server Execution Times:\r\n\t\u00a0\u00a0 CPU time = 0 ms,\u00a0 elapsed time = 0 ms.\r\n\tTable 'Worktable'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.\r\n\tTable 'Person'. Scan count 1, logical reads 106, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.\r\n\tTable 'Customer'. Scan count 1, logical reads 123, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.\r\n\tTable 'SalesOrderHeader'. Scan count 2, logical reads 1372, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.\r\n\t\u00a0\r\n\t\u00a0SQL Server Execution Times:\r\n\t\u00a0\u00a0 CPU time = 266 ms,\u00a0 elapsed time = 263 ms.\r\n<\/pre>\n<p>The all.messages file has all the messages from all the individual SQL Select statements, but also the following:<\/p>\n<pre>\t\u00a0End-To-End time was 40 Ms<\/pre>\n<p>If you want the PowerShell to detect differences in the result, you just put in an XML file with the correct result (usually from a previously verified run). 
Here is a run with an error in it, showing the file with the correct result.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.red-gate.com\/simple-talk\/wp-content\/uploads\/imported\/2302-clip_image002.png\" alt=\"2302-clip_image002.png\" width=\"554\" height=\"227\" \/><\/p>\n<p>The XML files will need an XML editor to view them. If there is a difference between what there should be and what there is, there will be an XML difference report (in this case, in the file <b>Second.differences<\/b>).<\/p>\n<pre>\t&lt;?xml version=\"1.0\"?&gt;&lt;xd:xmldiff version=\"1.0\" srcDocHash=\"17444750020386653172\" options=\"IgnoreChildOrder \" fragments=\"no\" xmlns:xd=\"http:\/\/schemas.microsoft.com\/xmltools\/2002\/xmldiff\"&gt;&lt;xd:node match=\"2\"&gt;&lt;xd:node match=\"6\"&gt;&lt;xd:node match=\"1\"&gt;&lt;xd:change match=\"1\"&gt;154205.5859&lt;\/xd:change&gt;&lt;\/xd:node&gt;&lt;\/xd:node&gt;&lt;xd:node match=\"1\"&gt;&lt;xd:node match=\"1\"&gt;&lt;xd:change match=\"1\"&gt;170471.7903&lt;\/xd:change&gt;&lt;\/xd:node&gt;&lt;\/xd:node&gt;&lt;xd:node match=\"4\"&gt;&lt;xd:node match=\"1\"&gt;&lt;xd:change match=\"1\"&gt;189921.3994&lt;\/xd:change&gt;&lt;\/xd:node&gt;&lt;\/xd:node&gt;&lt;\/xd:node&gt;&lt;\/xd:xmldiff&gt;\r\n\t\u00a0\r\n<\/pre>\n<h1>The Full Script<\/h1>\n<p><a href=\"https:\/\/www.red-gate.com\/simple-talk\/wp-content\/uploads\/imported\/2302-PoShSQLTester.html\">Here is the full script<\/a> (it is too big to embed in the article). Anyone who reads my PowerShell articles will know that I prefer to use either the registered server group or the central management servers, rather than specify connections individually. I set up a test group, usually with all supported levels of SQL Server.<\/p>\n<p>I have the practical difficulty that I sometimes have the same physical server under two different names, and so I correct for that to get just the unique physical servers, along with their connection strings. 
For good measure, I have a list of servers to avoid (the ones at a version level I&#8217;m not interested in supporting).<\/p>\n<pre class=\"theme:ssms2012 lang:tsql\">\tget-childitem 'SQLSERVER:\\sqlregistration\\' -recurse | where { $_.GetType() -notlike '*ServerGroup*' } |\r\n\t\u00a0 Select servername, connectionstring | where-object\u00a0 { $avoidList -NotContains $_.servername } |\r\n\t\u00a0 sort-object -property servername -unique |\r\n\t\u00a0 foreach-object {\r\n\t\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 and so on ...\r\n<\/pre>\n<p>For this version, I&#8217;ve set it to pull out all your registered servers, but I use a version that looks in a subgroup, so I change the path to the list of test databases. You can use it either way, because you have the <b>$AvoidList<\/b> to prevent servers that you don&#8217;t want to test against from getting into the list.<\/p>\n<p>For each server, I then run all the tests, serially. I get these from whatever files I can find, including subdirectories. 
I then filter out any files I don&#8217;t want.<\/p>\n<pre class=\"theme:ssms2012 lang:tsql\">\tGet-ChildItem $pathToTest -Recurse -directory\r\n\tForeach ($folder in (Get-ChildItem $pathToTest -Recurse -Filter '*.sql'|Where-Object {$_.Extension -eq '.sql' }))\r\n\t\u00a0\u00a0\u00a0{\r\n\t\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0if (($Folder.Name -notin $ExcludeFiles) -and ($excludeDirectories -notContains $folder.Directory.name))\r\n\t\u00a0\u00a0\u00a0\u00a0\u00a0 { \r\n\tAnd so on ...\r\n<\/pre>\n<p>The result files are contained in subdirectories named after the instance that ran the test and the name of the result.<\/p>\n<p>Because it is so easy to get the registered servers using the SQL Server PowerShell provider, I use that, so that is something else you&#8217;ll need to set up before you run the code.<\/p>\n<h1>Wrapup<\/h1>\n<p>This routine doesn&#8217;t replace any of the standard tests, but it can catch errors that would otherwise cause problems for these tests, or maybe break the CI build. I find it handy because it takes almost no effort to run, doesn&#8217;t leave artifacts in my code, and gives me, saved to file, everything I need to see what the problem might be (including any SQL error message). I can even archive the data and spot trends. 
Naturally, I&#8217;ve given you just enough to customise a script to your particular requirements, and I hope that I&#8217;ve kickstarted someone to use PowerShell to the max for database development.<\/p>\n<div class=\"scheme--lightest-grey spaced-bottom\">\n  <img decoding=\"async\" class=\"media__image padded--tight\" style=\"float: right; vertical-align: middle;\" src=\"https:\/\/assets.red-gate.com\/external\/SimpleTalk\/database-devops-cog-160.png\"><\/p>\n<div class=\"padded\">\n<p class=\"padded-top--tight\"> <strong>DevOps, Continuous Delivery & Database Lifecycle Management<\/strong><br \/> <a href=\"\/collections\/database-lifecycle-management-patterns-practices-library\/?utm_source=simpletalk&amp;utm_medium=publink&amp;utm_campaign=dlm&amp;utm_content=pandparticle&amp;utm_term=bottombanner\">Go to the Simple Talk library<\/a> to find more articles, or visit <a href=\"https:\/\/www.red-gate.com\/solutions?utm_source=simpletalk&amp;utm_medium=publink&amp;utm_campaign=dlm&amp;utm_content=pandparticle&amp;utm_term=bottombanner&amp;__hstc=81186632.9b3ff88e8dc727f9f1f608ffe1681edd.1467735864337.1482415357732.1482419905332.123&amp;__hssc=81186632.36.1482419905332&amp;__hsfp=3698596899\">www.red-gate.com\/solutions<\/a> for more information on the benefits of extending DevOps practices to SQL Server databases.<\/p>\n<\/div>\n<\/div>\n\n","protected":false},"excerpt":{"rendered":"<p>There are more exciting things in life than unit testing SQL Statements, checking the results, timings, and execution plans. 
Phil Factor demonstrates a PowerShell-based technique taking the tedium out of testing SQL DML.&hellip;<\/p>\n","protected":false},"author":154613,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[143532],"tags":[4635,4150,4213],"coauthors":[6813],"class_list":["post-2106","post","type-post","status-publish","format-standard","hentry","category-tools-sql-server","tag-powershell","tag-sql","tag-sql-tools"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/posts\/2106","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/users\/154613"}],"replies":[{"embeddable":true,"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/comments?post=2106"}],"version-history":[{"count":12,"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/posts\/2106\/revisions"}],"predecessor-version":[{"id":90951,"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/posts\/2106\/revisions\/90951"}],"wp:attachment":[{"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/media?parent=2106"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/categories?post=2106"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/tags?post=2106"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/www.red-gate.com\/simple-talk\/wp-json\/wp\/v2\/coauthors?post=2106"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}