SQL Change Automation with PowerShell Scripts: getting up-and-running

Provides a PowerShell automation script that builds a SQL Server database from the scripts in source control, documents the database, and then deploys it as a NuGet package, using the package to update the schema of a live database.

Guest post

This is a guest post from Phil Factor. Phil Factor (real name withheld to protect the guilty), aka Database Mole, has 30 years of experience with database-intensive applications.

Despite having once been shouted at by a furious Bill Gates at an exhibition in the early 1980s, he has remained resolutely anonymous throughout his career.

He is a regular contributor to Simple Talk and SQLServerCentral.

I’ll demonstrate a PowerShell automation script that takes the contents of a GitHub source control directory for a database, checks that it is possible to build the database, and then documents it. Finally, it creates a NuGet package containing the code and the documentation, and uses it to update the database schema of an existing database so that it matches the source control version that we just built and validated. For database automation work, the PowerShell script needs to adopt good practices for passing credentials securely, and robust error checks, so that you have more confidence that things went well.

I’m on the liberal wing of opinion about how to do database delivery as part of the application. No sooner does one person decide on the royal road to stress-free database delivery than someone else comes up with a wildly different idea that is just as successful. Tools that assist with this process have become a lot more flexible to use, and SQL Change Automation is typical of this trend, having evolved from the DLM Automation suite by adding support for migration-based development.

This is the first in a series of articles to illustrate how to use the PowerShell cmdlets of SQL Change Automation to achieve parts of the deployment process. I’ll assume you have everything installed, such as PowerShell, the SqlServer PowerShell module and SQL Change Automation, and that you’ve tried out the basic tutorials in the documentation; they are all worth working through, and once you have them running successfully, you’ll be ready for more.

PowerShell scripting for SQL automation tasks

When you are automating a process, the scripts that you use, whether Bash, command-line or PowerShell, share several principles. Unlike a demonstration example of the use of a cmdlet, or an illustration of a process in an article, a real script must run unattended, and in it you mustn’t mix data, especially passwords, with process. The script must not report progress and errors only to the screen, because a process that runs on the scheduler doesn’t have a screen. Even if there is a screen, the information provided is too ephemeral, and the significance of a strange message seen by the early-morning shift operator may be missed. No, it is far better that progress, information and errors should be logged to a file, along with the time and date.
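
As a minimal sketch of the idea, a logging function like this one stamps each entry with the date and time; the function name, file location and message format are my own illustrative choices:

# A minimal logging helper: every message goes to a file with a timestamp.
# The default log location and the message format are illustrative, not
# anything SCA requires.
function Write-Log {
    param (
        [string]$Message,
        [string]$Level = 'INFO',
        [string]$LogPath = "$env:USERPROFILE\Logs\BuildAndSync.log"
    )
    # Create the log directory on first use, rather than failing.
    $dir = Split-Path -Path $LogPath -Parent
    if (-not (Test-Path $dir)) { New-Item -ItemType Directory -Path $dir | Out-Null }
    "$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss') [$Level] $Message" |
        Add-Content -Path $LogPath
}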

A script should either run to completion or fail leaving the state as it was. Whether it completes or fails, it should mop up (tear down) afterwards. Where possible, it should never just ‘bomb out’.

To present a script like this in an article would make the listing very long and extremely dull, but to present a script without any indication of how a production script should look is misleading. I’ve tried to sketch in some of the requirements and show how one might deal with them, but the main objective is to get you past the pain threshold of getting something working.

What the PowerShell automation script does

The PowerShell script uses the SCA PowerShell cmdlets, along with the SqlServer PowerShell module, to do part of a continuous build process. This script keeps all the information we need in a single PowerShell System.Collections.Hashtable called $config. This keeps things neat, and allows us to store the data separately from the code, when we get to the point that we need to have one script for several tasks, each with its own data values.
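
To give an idea of the shape of it, a $config hashtable might look like this; every value here is a placeholder to be replaced with your own servers, paths and package details:

# All the script's data, kept separate from the process. Every value here
# is a placeholder.
$config = @{
    ProjectPath    = 'C:\Projects\Northwind'   # the SQL Compare scripts folder
    PackagePath    = 'C:\Packages'             # where the NuGet file will go
    LogPath        = "$env:USERPROFILE\Logs\Northwind.log"
    PackageId      = 'Northwind'
    PackageVersion = '1.0.0'                   # a dummy, overwritten later
    TargetServer   = @{ Instance = 'MyServer'; Database = 'Northwind'; Username = 'deploy' }
}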

The project will be in a local github ‘project’ directory with the object-level SQL source files in it. So that we don’t tax ourselves unduly, we’ll use Northwind. A copy of this project will be attached to the article. SCA will also work happily with a single build script, or an SCA offline schema model, but we’ll use a SQL Compare scripts folder.

Our aim is to test that the project can be built into a database. We could also run some tSQLt unit tests to make sure that the source is good, and maybe also a SQL CodeGuard check that the source contains no code smells, but I thought that at this point we should keep the complexity down.

Checking for database connections

We’ll start by checking that all the servers can be reached, with the connection information provided. The script uses the New-DatabaseConnection cmdlet to get an SCA connection object, which we store for later use, and the Test-DatabaseConnection cmdlet to check the connection.
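
A sketch of that check, with placeholder server names and assuming SQL Server authentication (Write-Log is the logging helper sketched earlier, and $TargetPassword is recovered from secure storage, as described in the next section):

# Build an SCA connection object and confirm the server can be reached.
# The server, database and credentials are placeholders.
$target = New-DatabaseConnection -ServerInstance 'MyServer\MyInstance' `
              -Database 'Northwind' -Username 'deploy' -Password $TargetPassword
try {
    Test-DatabaseConnection $target -ErrorAction Stop  # errors if unreachable
    $config.TargetConnection = $target   # keep the object for the later cmdlets
}
catch {
    Write-Log -Message "Cannot connect: $($_.Exception.Message)" -Level 'ERROR'
    exit 1   # a complete job-stopper, so stop here
}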

Checking connections is a more elaborate process than you might expect because we can’t, in real life, assume that we can use Windows authentication, because of Azure, remote servers outside the domain, and other such complications. Connection information will therefore include credentials, which must be stored securely.

We can’t ever keep passwords for SQL Server authentication as part of the static script data. At this stage, we ask for any passwords that aren’t already known, and then store them as secure strings on file, in the user area, protected by the workstation security. We can make the connection checks, and abort on failure, rather than wasting time starting the process only to discover a problem that is a complete job-stopper.
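
As a sketch of the technique, using standard PowerShell DPAPI-protected secure strings (the file-naming convention is my own):

# Fetch the password as a secure string from a file in the user area,
# prompting for it only on first use.
$passwordFile = "$env:USERPROFILE\Northwind-deploy.pwd"
if (Test-Path $passwordFile) {
    $securePassword = Get-Content $passwordFile | ConvertTo-SecureString
}
else {
    $securePassword = Read-Host 'Password for deploy' -AsSecureString
    $securePassword | ConvertFrom-SecureString | Set-Content $passwordFile
}
# Recover the plain text only at the point of use, via a PSCredential.
$credential = New-Object System.Management.Automation.PSCredential ('deploy', $securePassword)
$TargetPassword = $credential.GetNetworkCredential().Password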

Checking for valid paths

Next, we verify that the file locations you specify exist. It is always best to check that paths are valid and react accordingly. If it is the path to the project, then you’d probably want to abandon the build right there, whereas files for logs and data can be created before use.
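
A minimal version of those checks might look like this:

# The project path must already exist; directories for logs and packages
# can simply be created if they are missing.
if (-not (Test-Path $config.ProjectPath -PathType Container)) {
    Write-Log -Message "No project found at $($config.ProjectPath)" -Level 'ERROR'
    exit 1
}
foreach ($dir in @((Split-Path $config.LogPath -Parent), $config.PackagePath)) {
    if (-not (Test-Path $dir)) { New-Item -ItemType Directory -Path $dir | Out-Null }
}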

Checking the database source

To check the source, SCA builds the empty database using Invoke-DatabaseBuild. It will happily use a LocalDB version of SQL Server to do this, but I’d rather not do it that way, because the build server needs to be the same installation (version and features) as the one the database is destined for. So, instead, we supply a SQL Server instance to act as host for a temporary version of the database. Invoke-DatabaseBuild builds the database and, if it succeeds without any problems, we can proceed. Whatever happens, any problems are logged.
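
In outline, with 'MyBuildServer' standing in for a real instance that matches the target’s version and features, the build step looks like this (note that -TemporaryDatabaseServer takes a connection string, not a connection object):

# Validate the source by building it on a temporary database.
try {
    $validatedProject = Invoke-DatabaseBuild $config.ProjectPath `
        -TemporaryDatabaseServer 'Data Source=MyBuildServer' -ErrorAction Stop
    Write-Log -Message 'Build succeeded'
}
catch {
    Write-Log -Message "Build failed: $($_.Exception.Message)" -Level 'ERROR'
    exit 1
}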

Writing the database documentation

Once we know that the database is valid, in the sense that it can be built from source without breaking, the next step is to produce the database documentation, with the New-DatabaseDocumentation cmdlet, which uses the code from SQL Doc. It takes the extended properties in the database and uses a dependency tracker to provide more useful documentation of the database than is possible by just inspecting the source code.
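
The call itself is brief; as before, the temporary server is a placeholder for an instance that SCA can build on:

# Generate the documentation from the validated project.
$documentation = New-DatabaseDocumentation $validatedProject `
                     -TemporaryDatabaseServer 'Data Source=MyBuildServer'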

Creating the NuGet Package for deployment

Next, we use New-DatabaseBuildArtifact to produce the build artefact, a NuGet package containing the database source and all the documentation, which we can then view, or install on a webserver. The NuGet package is exported to a file directory, because we don’t have a NuGet server. The NuGet file can then be used by SCA to update a database to the same version.
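
A sketch of the packaging step, reusing the names from the $config hashtable:

# Wrap the validated source and its documentation in a NuGet package, and
# write the .nupkg file to a directory, since we have no NuGet server.
$buildArtifact = New-DatabaseBuildArtifact $validatedProject `
    -PackageId $config.PackageId -PackageVersion $config.PackageVersion `
    -Documentation $documentation
Export-DatabaseBuildArtifact $buildArtifact -Path $config.PackagePath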

Updating the SQL Server database to the same version

Now that we have a valid build, which we have used to create the NuGet package, we can synchronise a target database to the same version and save the script that did it, using Sync-DatabaseSchema. For a build, you’ll need to create the empty database on the target server, and then sync it, to update it to match the version in source control.

If this is successful, the script is saved, which will show the team exactly what was changed to migrate the database from the previous version to the current version. They may also want to change it, and it can be used as the basis of a migration script that converts a database from one specific version to another. The advantage of doing a sync is that we can sync with a development server that already has the test data, and ancillaries such as SSIS packages, in place, saving time.
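
A sketch of the step, assuming (as was the case with the DLM Automation cmdlets) that the result object exposes the change script as an UpdateSql property:

# Update the target to match the build, and keep the script that did it.
$syncResult = Sync-DatabaseSchema -Source $buildArtifact -Target $config.TargetConnection
$syncResult.UpdateSql |
    Set-Content "$($config.PackagePath)\$($config.PackageId)-update.sql"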

Importing the project configuration information

I don’t demonstrate this in the script, because it would be a distraction, but you can read PSON files as PowerShell. You save a file like the one shown in Listing 1, but with the real parameter values filled in.

Listing 1: BuildData.ps1
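
Something like this, with placeholder values throughout:

# BuildData.ps1 -- the build data as PSON, kept separate from the process.
# Every value here is a placeholder.
@{
    ProjectPath    = 'C:\Projects\Northwind'
    PackagePath    = 'C:\Packages'
    LogPath        = "$env:USERPROFILE\Logs\Northwind.log"
    PackageId      = 'Northwind'
    PackageVersion = '1.0.0'
    TargetServer   = @{ Instance = 'MyServer'; Database = 'Northwind'; Username = '' }
}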

Then, you read it into your script and do whatever checks you please.

Listing 2: Running checks for missing build data
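
A sketch of such checks, assuming the convention (my own) that an unfilled value is left as ‘xxx’; an empty Username is legitimate, since it signals Windows authentication:

# Evaluate the PSON file to get the hashtable back, then complain about
# any value that was left unfilled.
$config = Invoke-Expression (Get-Content '.\BuildData.ps1' -Raw)
$unfilled = $config.GetEnumerator() | Where-Object { $_.Value -eq 'xxx' }
if ($unfilled) {
    $unfilled | ForEach-Object {
        Write-Log -Message "No value supplied for $($_.Key)" -Level 'ERROR'
    }
    exit 1
}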

One thing that I’ve left out is the PowerShell routine for taking the database version number from source control, or from whatever tool maintains it. I have a dummy version number in the script, but this ought to be overwritten with the source-control-specific value (git rev-parse HEAD in Git).
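
A sketch of how the dummy value might be overwritten; using the short commit hash as a pre-release suffix is just one possible scheme, and how you map commits to package versions is a project decision:

# Overwrite the dummy version with a value derived from source control.
$commit = (git rev-parse --short HEAD).Trim()
$config.PackageVersion = "1.0.0-$commit"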

The PowerShell automation script for building and deploying a database

Let’s start with a simple version of the script, one that performs the same basic functions as the more complicated (but also more robust) version I’ll show shortly.

Listing 3: The simplified PowerShell script
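
A minimal sketch of such a script, with placeholder names throughout and that embedded password:

# A simplified build-document-package-sync pipeline. Server names, paths,
# credentials and package details are all placeholders.
Import-Module SqlChangeAutomation -ErrorAction Stop

$project     = 'C:\Projects\Northwind'        # SQL Compare scripts folder
$tempServer  = 'Data Source=MyBuildServer'    # host for the temporary build
$packagePath = 'C:\Packages'

# Target of the deployment: SQL Server authentication with an embedded
# password, which the fuller script replaces with stored secure strings.
$target = New-DatabaseConnection -ServerInstance 'MyServer' -Database 'Northwind' `
              -Username 'deploy' -Password 'NotAGoodIdea!'
Test-DatabaseConnection $target

# 1. Validate the source by building it.
$validatedProject = Invoke-DatabaseBuild $project -TemporaryDatabaseServer $tempServer

# 2. Document the validated database.
$documentation = New-DatabaseDocumentation $validatedProject -TemporaryDatabaseServer $tempServer

# 3. Package the source and documentation as a NuGet file on disk.
$artifact = New-DatabaseBuildArtifact $validatedProject -PackageId 'Northwind' `
                -PackageVersion '1.0.0' -Documentation $documentation
Export-DatabaseBuildArtifact $artifact -Path $packagePath

# 4. Update the target database to match the build.
Sync-DatabaseSchema -Source $artifact -Target $target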

Obviously, I’ve just assumed a SQL Server login with a (gulp) embedded password in the script and we’re a bit short of error checking. We will need to do better, which sadly will make it all seem rather more complicated. However, as we move on to do more powerful things, the value of the framework we use here will kick in.

In putting values into the script, you will need some caution. Do not put the log file in a Git directory, because Git might lock the file and prevent the script from reporting errors. If you are using Windows authentication for a server, put an empty string into the user name field.

Listing 4: The full PowerShell code (SCABuildAndSync.ps1)

Conclusions

We’ve looked at just a small aspect of SCA, the PowerShell cmdlets, and tried just a small part of the functionality. There is a lot of power in there. Most of the work in the script is in checking things such as server connections, paths and build scripts. The actual SCA tasks, represented by the SCA cmdlets, will be as easy as Lego to add in, once there is a viable framework for doing so. To prove this point, I’ll next use the same principles demonstrated here to update an entire test cell, creating the databases wherever necessary and stocking them with masked test data. I’ve already added the module that gives us access to SMO, so I’ll be showing you some examples of how we’ll use that to extend SCA’s functionality.
