Since this article was written, a number of changes have taken place in Redgate's product development. Some of the references in the article are now out of date, so here's a summary of what's changed:
- We now have a TFS Build Template that simplifies the setup of Redgate tools on TFS considerably. This can be downloaded as part of DLM Automation – http://www.red-gate.com/products/dlm/dlm-automation-suite/.
- Redgate no longer sells Deployment Manager. As an alternative, everything described here can be set up using Octopus Deploy – http://octopusdeploy.com/.
- The SQL Automation Pack, mentioned in the article, contains command line tooling that is now included in DLM Automation.
If you have any questions about automating the deployment of Database Projects using the latest versions of our database lifecycle management tools, please contact the DLM team for more information or an online demo: email@example.com.
Generally, when users implement continuous delivery for their databases with Redgate tools (SQL Source Control, SQL Automation Pack, Deployment Manager – see here for an overview), they use Redgate's own script folder format, working in SSMS, so that the database can be source controlled and managed as part of the delivery process.
However, we also have users working in Microsoft's Database Projects format, using SQL Server Data Tools (SSDT) in Visual Studio. Can these users implement continuous delivery for their database changes using the Database Projects format? Even if some of the team are using SSMS, and others Visual Studio? Yes they can! Hopefully this post will help any users trying to implement this strategy.
Pre-requisites – I’m using the following for my setup:
- TFS 2013 both for my source control and build server, hosted on a Windows Azure VM,
- Visual Studio Pro 2013 on client machine,
- (localdb) for my dev database environment,
- Windows Azure SQL Database for my CI and “Live” databases,
- SQL Server Data Tools installed on TFS Build agent,
- Redgate SQL Source Control v184.108.40.206,
- Redgate SQL Automation Pack v220.127.116.11,
- Redgate Deployment Manager v18.104.22.168 (though updated weekly, so always changing!)
I won’t be going through the setup of these pre-requisites – setting up TFS 2013 is reasonably straightforward, as is, for example, provisioning a Windows Azure SQL Database through the Azure management portal. The Redgate tools can all be downloaded from the links in the first paragraph, and each have setup instructions.
There are a few other installations that need to be carried out along the way (for example, installing .NET 3.5 on the TFS build agent if it's absent) – I've included these in the steps below, to make clear why and where they're needed. There are also a number of gotcha errors that can occur – I've mentioned these where appropriate. We're currently working on significantly improving the TFS setup and install process – please get in touch at firstname.lastname@example.org if you'd be interested in helping with this research.
Finally, this process can be just as easily implemented with different source control systems (for example, Git, Mercurial, SVN), build servers (e.g. TeamCity, Jenkins, Bamboo) and SQL instances – please get in touch if you hit problems with these configurations.
Using Visual Studio and SSMS with Database Projects
First of all I’m going to create a very simple database project in Visual Studio, with a single table.
- Open Visual Studio 2013:
- Create a new Team Project (NB: I’m using Team Projects here, linked to my TFS server), and call the new project RG:
Enter the project name:
- Click through the rest of the Wizard, using Team Foundation Version Control, and Finish to create the project on your TFS server.
- Next, create a new database on (localdb) to work as your development environment. In Visual Studio, open the SQL Server Object Explorer (if it's not visible, click SQL Server Object Explorer in the VIEW menu). Under the SQL Server node there should be an instance called (localdb)\Projects. Right-click the Databases folder under this node and select “Add New Database”:
- Create a new database called “RGDBProject” and click OK:
- Then, right-click your new database and select “Create New Project”:
- In the next box, enter “RGDBProject” as the project name and check both boxes for “Create directory for solution” and “Add to source control”:
- After pressing OK, select Team Foundation Version Control for the version control system:
- In the next step, choose your previously created “RG” Team Project as the location for storing the solution and project:
- Once the database is imported, click Finish.
- Next, the local database project needs to be checked in to TFS. In the Source Control Explorer, expand the folder for the Team Project RG, then, on the RGDBProject folder, right-click and select “Check in Pending Changes”:
- Add a commit comment and click Check in:
- The next step is to link to this source controlled version of the database from SSMS. NB: Here I will create a new database in SSMS and link to the source control repository. This is because I am modelling two different developers working on the same DB – one in Visual Studio on his/her database, the other in SSMS on his/her own dedicated database.
- Prior to linking the database, however, an amendment needs to be made to the SQL Source Control config file to work with TFS 2012 or higher (this also applies if you're using Visual Studio Online). Please follow the instructions at http://documentation.red-gate.com/display/SOC3/Using+Team+Foundation+Server+2012+or+Visual+Studio+Online to make this change.
- After doing this, open SSMS and connect to (localdb)\Projects:
- Create a new database by opening a New Query window, and entering the command:
CREATE DATABASE RGDBProject_dev;
- Then, in SSMS Object Explorer, right-click on this database and click on “Link database to source control..”:
- In the next screen, enter the location of your TFS server, then Browse.. next to the Database Folder box to select the location of the folder containing the Database project files – make sure you select the appropriate, lower level folder here, as shown:
- Click “Select”, then “Link” in the “Link to source control” pane. This will show the following warning message – support for SSDT projects in SQL Source Control is currently beta functionality:
- Click OK. We’ve now linked our local development database to the database project RGDBProject, as shown by the green database icon:
- The linking process pulls the schema in to your database. However, we haven’t actually got any objects in our database yet! To add a new test table, open a New Query window and run the following:
CREATE TABLE [dbo].[Customers](
    [ID] [nchar](10) NULL,
    [Name] [varchar](50) NULL,
    [Address] [varchar](255) NULL
) ON [PRIMARY]
- This now needs to be checked in to source control (as shown by the blue blob by the table in Object Explorer). In the SQL Source Control tab, click on Check in, enter a comment and commit:
- When you now return to Visual Studio (or rather, when your colleague opens Visual Studio to work on this project), because you've added a new object you need to check out the latest version to your machine, by clicking the Check Out button in Source Control Explorer:
Creating a Build Step in TFS to run Continuous Integration
Now that we’ve got our database under source control – in SSMS and Visual Studio – the next step is to create a CI process that kicks off whenever check-ins are made.
Setting up the build process in TFS is a little long-winded. We're working hard right now on improving the setup process (such that this post will become obsolete!). If you're interested in helping with this work, possibly testing early versions, please get in touch: email@example.com.
Still, here goes:
- First we need to install the Redgate CI tools. Go to http://www.red-gate.com/products/sql-development/sql-automation-pack/ and download the free trial. Once installed, run the SQL Automation Pack help screen:
- In this screen, click on the middle “Open Folder” button under SQL CI. This will open the following folder:
- Separately, open Source Control Explorer in Visual Studio and do a “Get Latest Version” on the project:
We now need to copy all of the files from the sqlCI folder (opened in step 2 above) into the root folder of the project we've just retrieved. This will leave you with something like the following:
These files now need to be added to source control and checked in. The quickest way I found to do this was from the Team Explorer window. Under the Pending Changes tab, at the bottom, under Excluded Changes, is a link showing Detected: 265 add(s):
Clicking on this opens the following dialogue and allows you to promote the files to source control:
Once promoted (by clicking the Promote button), these files can be checked in by clicking the Check in button in the Team Explorer window.
Now we need to create a Build step in TFS. Most of this is based on section 3 of the post here: http://documentation.red-gate.com/display/SAP1/Setting+up+continuous+integration+with+TFS2012 – however, because one or two tweaks are needed, I've reproduced a lot of it below. First, in Team Explorer (in Visual Studio), click on Builds:
Then click on New Build Definition. This shows the following screen, with the build definition name pre-populated. For now, click the “Disabled” check-box – we’ll re-enable this later:
Then click on Trigger to go to the next step and here click “Continuous Integration – Build each check in”:
At the next step (Source Settings), enter the following:
Next the Build Defaults. I’ve used a drop folder on the TFS Server called “\\localhost\Drops” (which I created on the server first):
Next, the Process step. Here, you need to use an alternative build process template. This is so that build numbers can be correctly passed from TFS through to Deployment Manager package names. There’s a blog post here about this: http://thefutureofdeployment.com/calling-rgpublish-tfs/, with links to some default templates for TFS 2010 and 2013. I’ve also added the template that I’m using to Dropbox here:
Once you have an alternative build process template, this needs to be picked up in your build definition. Click on the “Local Path:” link at the top of the Source Control Explorer window (RHS) to open the folder for the project, and navigate to the folder containing the .sqlproj file. Copy the build process template file to this location as shown below:
Then, this file needs to be added to source control and checked in. As before, I do this via the Team Explorer window. Under the Pending Changes tab, at the bottom, under Excluded Changes, is a link showing Detected: 1 add(s). Clicking on this opens the following dialogue and allows you to promote the file to source control:
Once promoted, the file needs to be checked in (by clicking Check in). Once checked in, we need to return to the build setup (that we skipped out of, earlier) and select this file. In the build setup, in the Process step, click on the New… button to add in the template:
In the next pane, select the relevant file as shown below:
Returning to the build steps, this file should now be selected:
We now need to change the “Items to Build” to point to our Redgate sqlCI project file. Click on the button next to the “Items to Build” text (above Build $/RG/RGDBProject.. ). This will show the following:
Remove this default item (the RGDBProject.sln file), and then click the Add.. button. In the resulting pane, select the sqlCI.proj MSBuild Project file from the appropriate folder. NB: you need to select “MSBuild Project files” from the “Items of type” drop-down:
Click OK, then OK to set this in the build step. Press CTRL-S to save the build template.
Setting up the mock “Staging” and “Production” Environments
In addition to the (two) development environments set up above, I’m also going to create a “Staging” environment – which is going to be automatically updated by the CI process every time a change is checked-in – and a “Production” environment which I’ll update using one-click deployments from Deployment Manager.
For this, I'm using a SQL Database on Azure, though you can use any environment accessible by the TFS build agent. Note that if your TFS server is hosted elsewhere, on a different domain (or on an Azure VM, as mine is), it won't be able to see a database instance that's local to your own machine.
On your instance (or Azure SQL Database), create two new databases called DBProject_Staging and DBProject_Prod. For this, I used the following commands in SSMS against my SQL Database instance:
CREATE DATABASE DBProject_Staging
GO
CREATE DATABASE DBProject_Prod
GO
Setting up Deployment Manager
I use Redgate’s Deployment Manager to manage the deployments from Staging to Production. NB: This is, obviously, a very over-simplified version of the setups and environments most customers have – often there are dev machines, integration machines, QA boxes, pre-production and production environments.
The following process installs and sets up Deployment Manager. I’ve installed the software on the same Azure VM that’s hosting my TFS server – this is generally no problem.
- Download and install the latest version of Deployment Manager. NB: because Deployment Manager updates are released every Wednesday, the version number is changing all the time! This also means that the screenshots below can change over time – but the principles remain the same.
I’ve not gone through the full process here of installing the product. There’s an install guide here. I’ve set DM up with a single user, “administrator”, and I’m not using Windows Authentication.
Once installed, note down the URL for Deployment Manager – you’ll need this later. For example, mine is installed on http://tfs:81/.
- Once installed, we first need to set up our environments. In the screen below, click on Create Environment – here we're going to add the environments containing our Staging and Production databases – i.e. we'll be creating two environments:
- Fill in the name (and description) for the Staging environment and click Create:
- Once the environment has been created, click on the Add SQL Server button next to your new environment. Enter the server info below – I’m adding my Azure SQL Database and using SQL Server Authentication:
- Once you've clicked Add, repeat steps 2-4 to also add the Production environment and target machine (I'm using the same target machine, so just repeat the database details here). You'll end up with a screen like the following:
- The next step is to create a Project (for the application/database we’re working on). Click on the Projects tab at the top, then “Create Project”, and enter a name for the project:
- Once you’ve clicked Create, you will have done the basic setup needed in Deployment Manager. The next step (setting up database deployment steps) requires additional information about the NuGet package names that you’ll be deploying. We’ll configure this later, once we’ve set up the CI process below to actually create the packages.
Configuring the Redgate SQL CI process
We've created the build definition, but we still need to configure the sqlCI.targets file used by the MSBuild task. This file contains all of the configuration data for the CI process: where the databases are, which tasks to run (unit tests, data generation, upgrade tests, and so on), and whether or not to create deployment packages.
- From Source Control Explorer, select and open the file sqlCI.targets:
- Set the following parameters:
- Database folder path: RGDBProject\dbo
- Database server name: your database server and instance, for example <azurereference>.database.windows.net (for an Azure SQL Database)
- Target database name: DBProject_Staging
- Serial key: your Automation License serial key, or leave it blank for a trial
- Windows authentication: set this to true to use a Windows account to access the SQL Server (the Windows account that the build agent runs under will be used); set it to false and fill in the SQL user name and SQL password options to use a SQL account instead
- NuGet package name: RGDBProject
- Build number (Package version): $(BuildNumber)
- Output folder path: \\localhost\Drops\Output
- Package repository URL: http://<Deployment Manager server set above>/nuget/
- Deployment Manager URL: http://<Deployment Manager server set above>/
- Deployment Manager API key: this can be found in your DM profile page – click on your username top-right, then copy the API key from the bottom of this page
- SQL Compare command line arguments: you can include /include:StaticData here, which will additionally allow you to source control and deploy static data if you are using SQL Source Control in SSMS. This isn't possible when working purely in Visual Studio with database projects.
- Save and close this file, and check in to source control repository.
- Now that we’ve configured the sqlCI.targets file, we can enable the build. Edit the build definition:
- Click on the General tab, check the “Enabled” option, then CTRL-S to save:
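For orientation, the settings listed above live as values inside the sqlCI.targets file itself. The fragment below is purely illustrative – the element names are placeholders I've invented for this sketch (the actual names in your version of the file may well differ), so edit the real file in place rather than pasting this in:

<PropertyGroup>
  <databaseFolderPath>RGDBProject\dbo</databaseFolderPath>
  <databaseServerName>yourserver.database.windows.net</databaseServerName>
  <targetDatabaseName>DBProject_Staging</targetDatabaseName>
  <windowsAuthentication>false</windowsAuthentication>
  <nugetPackageName>RGDBProject</nugetPackageName>
  <packageVersion>$(BuildNumber)</packageVersion>
  <outputFolderPath>\\localhost\Drops\Output</outputFolderPath>
</PropertyGroup>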
Trying the Build – and Debugging
This section works through getting a build running on every database check-in. Because a few errors can occur along the way (if the build process isn't set up correctly, among other causes), I've included the error messages, along with solutions, to help anyone who hits the same problems. If your environment is correctly set up from the start you won't see these errors – and you can pre-empt them entirely by working through the resolutions first.
- Try the build. Right-click the build definition and Queue New Build… then click Queue on the subsequent panel:
- This will start the build (manually – we’ll sort out automatic queuing later). Double-clicking on the build in the Team Explorer will show the current status:
- ERROR 1 – BUILD SEEMS TO RUN, BUT NO DATABASES ARE BUILT. DIAGNOSIS – EXECUTABLES IN SQL AUTOMATION PACK HAVEN’T BEEN ADDED TO SOURCE CONTROL (BECAUSE EXE FILES AREN’T ADDED BY DEFAULT)
This initial build looks as though it’s worked! But if you check the DBProject_Staging database, nothing has been added. Looking at the log file (NB: There’s a link shown to the log file in Visual Studio but because that link isn’t accessible from my client machine, I view the log files on the TFS server), the error that stands out is:
‘sqlCI.exe’ is not recognized as an internal or external command
To fix this, we need to add sqlCI.exe and the other executables to source control. Open Source Control Explorer in Visual Studio, right-click the project and select “Add items to folder”:
In the next screen, select everything (CTRL-A), then click Next:
In the next screen, still a number of items won’t have been added. Click on the Excluded items tab at the top and again, CTRL-A to select all items. Then click the “Include item(s)” link at the bottom-right:
Click Finish. You then still need to check these items in to source control, by clicking on Pending Changes in Team Explorer and clicking Check in.
Finally, you shouldn’t need to re-queue the build at this specific point – the check-in above should re-run the build for you!
- ERROR 2 – BUILD FAILS WITH ERROR “EXEC: RedGate.SQLCI.Engine.Error: Updating database failed with exit code -2146232576”. DIAGNOSIS – YOU NEED THE .NET 3.5 FRAMEWORK INSTALLED ON THE TFS BUILD AGENT. You'll see an error something like the following:
This not-very-helpful error message (we're working on it!) indicates that the .NET 3.5 framework isn't installed on the TFS build agent (NB: both .NET 3.5 and .NET 4.5 are required – the former for the sqlCI command line tools, the latter for Deployment Manager, if it's installed on the same server).
For my server, I install the .NET 3.5 framework in the “Add Roles and Features” screen:
Once installed, re-queue the build.
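As an aside, on Windows Server 2012 and later you can install the same .NET 3.5 feature from an elevated PowerShell prompt instead of the Server Manager UI (you may need to add the -Source parameter pointing at your installation media if the feature files aren't cached locally):

Install-WindowsFeature NET-Framework-Core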
- At this stage, my build worked (went green). But – did it do anything? Opening up my DBProject_Staging database on Azure, I see the following:
It worked! The Customers table has been automatically created from the Database Project file in source control.
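If you'd rather check from a query window than the object tree, a quick query against the Staging database confirms the deployment (the table name comes from the script we checked in earlier):

SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_SCHEMA = 'dbo' AND TABLE_NAME = 'Customers'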
Making Changes to the Schema then Deploying to Production
So, the initial release has gone through. But the key functionality we're adding is the ability to continuously integrate changes from both of our devs (one in SSMS, one in Visual Studio) to the Staging environment.
First we'll test what happens when the SSMS user makes a change to the schema, then the VS user. I'll also show what happens when a user tries to change something that breaks the update process.
- Open SSMS and connect to (localdb)\Projects. Expand the folder for the database RGDBProject_dev (your dev environment):
- At this point we want to add a new column to the table “Customers”. However, if you try to do this by right-clicking the Columns folder, then New Column, you’ll see the following error:
– it’s not possible to make designer changes to the database in SSMS.
- So instead, I run the following script to add a column:
ALTER TABLE dbo.Customers ADD [Email] NVARCHAR(255)
- This change then needs to be checked in to source control – click on “Commit Changes” on the SQL Source Control tab, enter a comment and click Commit:
- We've committed this change to source control. This should now kick off a build on the TFS server, which will update the DBProject_Staging database:
Again, this seems to have worked!
- The next stage is to use Deployment Manager to then make selected deployments to Production. If we log in to Deployment Manager, then click on the “Projects” tab, this allows us to configure the steps needed to deploy (both to Staging and Production) – click on “Steps” on the left and side in the following screen:
- Click on “Add database package step”:
- Set the database package step up as follows and click Add:
- We’ll create the first release to Staging manually. On the Projects tab, click the link to the project RGDBProject:
- Then click Create release. On the next page, wait for the “Version number” field to be populated with the most recent build from the NuGet feed (created by your CI process!) and enter some release notes before clicking Create:
- Once the release is created, it needs to be deployed to the Staging environment. Click Deploy this release:
- Leave the defaults in the next page (to deploy to Staging), and click Deploy release:
- Hopefully, if all has gone well, this release will be successfully deployed:
- If you return to the dashboard, you’ll see this showing the status of the Staging environment:
(which matches with what’s in the database).
- At this point it’s worth noting – here we’ve done a manual release to Staging – something that was also being handled by the CI process. The next stage is to also deploy this release to Production.
- Click on the release in the dashboard (here, the green box showing the release number 20140214.7):
- This opens the following screen – click on “Deploy to”:
- In the next screen, select the Production environment and click Deploy release:
- Once this has finished, return to the Dashboard to see the releases deployed to both environments:
This is as much as I'll cover in this post. As I say, this setup process is a little long – we're working on making it considerably easier – and if you'd like to help with that process, let me know at firstname.lastname@example.org.
There’s also much more that can be done, including:
- Source controlling and CI/deployment of static/reference data,
- Migrations override scripts that allow you to cover scenarios where the automated deployment process can’t pick up breaking changes – for example, table renames, column renames, adding NOT NULL columns and so on.
- Auto-deployment to environments in Deployment Manager.
And so on. But hopefully this is enough to get you started running a continuous delivery process for Database Projects, using Redgate tools.
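As an illustration of the migrations point above: adding a NOT NULL column to a table that already contains rows is exactly the kind of change a compare-based deployment can't complete on its own. A hand-written migration script handles it in stages – note the column name here is invented purely for the example:

ALTER TABLE dbo.Customers ADD [Country] VARCHAR(50) NULL
GO
UPDATE dbo.Customers SET [Country] = 'Unknown' WHERE [Country] IS NULL
GO
ALTER TABLE dbo.Customers ALTER COLUMN [Country] VARCHAR(50) NOT NULL
GO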