Deploying cross-database dependencies to different environments

The Redgate DLM Automation team that we’re part of recently took time out from development to explore some of the issues our customers face when automating deployment of database changes. As part of that exercise, we took a closer look at cross-database and cross-server dependencies, which can cause problems when deploying databases to multiple environments that may be configured differently. This blog post shares some of our findings.

To better understand the issues, we created a test application we could work with; this was a system for a fictional company called Lightning Works. We used DLM Automation and Octopus Deploy to deploy two databases, a website, and associated services. The two databases were our Foreign Exchange database, Forex, and the LightningWorks database used to hold the application data.

The cross-database dependency problem appeared in the procedure UpdateRatesFromForex in the LightningWorks database. This was because it had a hard-coded reference to a function in the Forex database:
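In sketch form, assuming the function is GetLatestExchangeRate (as described later) and with illustrative table and column names, the hard-coded reference looked something like this:

```sql
-- LightningWorks database: a hard-coded reference to the Forex database.
-- Table and column names here are illustrative, not from the original app.
CREATE PROCEDURE dbo.UpdateRatesFromForex
    @CurrencyCode CHAR(3)
AS
BEGIN
    UPDATE dbo.Rates
    SET    Rate = Forex.dbo.GetLatestExchangeRate(@CurrencyCode)
    WHERE  CurrencyCode = @CurrencyCode;
END;
```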

On our local development machine, these databases were on the same server, so there was no problem. Unfortunately, in Quality Assurance (QA) they were on separate servers, and executing the procedure failed. We came up with several possible solutions:

  • Use linked servers and synonyms
  • Use linked servers and always reference using a four-part identifier
  • Pull out the dependency from the database and into the application

We agreed that pulling out the dependency was the best of these solutions. It would separate the databases, making them easier to maintain and develop independently – but we also agreed that this is not always possible.

We then considered using linked servers and a four-part identifier. The four-part identifier would need to be set so that in each environment, the same server name would point to a different server. This could be done using a different data source name configuration in each environment. However, we decided that this solution would not give us the flexibility that using synonyms offered. This is because, when using synonyms, only the synonym needs to know details of the cross-database dependency, rather than having the four-part identifier duplicated everywhere the cross-database dependency might occur.
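The difference can be sketched like this (server and object names here are placeholders, not from the original system):

```sql
-- Option 1: a four-part identifier, repeated at every call site.
SELECT * FROM QAForexServer.Forex.dbo.ExchangeRates;

-- Option 2: a synonym. The location is recorded once, and call sites
-- use the local name, so only the synonym changes per environment.
CREATE SYNONYM dbo.ExchangeRates
    FOR QAForexServer.Forex.dbo.ExchangeRates;

SELECT * FROM dbo.ExchangeRates;
```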

Once we’d decided to use synonyms, we had to think how to include linked servers and synonyms in the deployment process. We decided to start by making these changes manually. Then, once we had a solution that worked, we would look at automating these changes.

The first thing we did was to add a synonym, dbo.GetLatestExchangeRate, in LightningWorks on our development server. This synonym pointed at the function GetLatestExchangeRate in the Forex database, and we used it in the UpdateRatesFromForex procedure instead of the hard-coded cross-database reference to the function. This worked locally, so we deployed it to QA for testing. In QA, we added a linked server with the ‘RPC Out’ option set to ‘True’, which allowed us to remotely call the GetLatestExchangeRate function, and we manually updated the synonym to use the QA server instead of the development server. When we ran UpdateRatesFromForex, however, it failed with an error.
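The manual setup we’d made by this point can be sketched as follows (server names are placeholders):

```sql
-- On the development server (both databases local): create the synonym.
CREATE SYNONYM dbo.GetLatestExchangeRate
    FOR Forex.dbo.GetLatestExchangeRate;

-- In QA (databases on separate servers): add a linked server with RPC Out.
EXEC sp_addlinkedserver @server = N'QAForexServer', @srvproduct = N'SQL Server';
EXEC sp_serveroption @server = N'QAForexServer',
                     @optname = N'rpc out', @optvalue = N'true';

-- Repoint the synonym at the remote object. Synonyms can't be altered,
-- so updating one means dropping and recreating it.
DROP SYNONYM dbo.GetLatestExchangeRate;
CREATE SYNONYM dbo.GetLatestExchangeRate
    FOR QAForexServer.Forex.dbo.GetLatestExchangeRate;
```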

We went to the MSDN synonyms page, which says “You cannot reference a synonym in a DDL statement”. To fix the issue in QA, we went back to our development database and changed the GetLatestExchangeRate function, turning it into a stored procedure with an output parameter and making the corresponding change to the caller. This also required changes to the application, which relied on the same function. To keep track of these changes, we pushed them through source control and deployed to QA. Another option would have been to build some dynamic SQL and use OPENQUERY, which would not have required any changes to the Forex database.
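Sketched with illustrative table and column names, the reworked object in Forex becomes a procedure like this:

```sql
-- Forex database: the scalar function rewritten as a stored procedure
-- returning the rate via an output parameter. The ExchangeRates table
-- and its columns are illustrative, not from the original schema.
CREATE PROCEDURE dbo.GetLatestExchangeRate
    @CurrencyCode CHAR(3),
    @Rate         DECIMAL(18, 6) OUTPUT
AS
BEGIN
    SELECT TOP (1) @Rate = Rate
    FROM dbo.ExchangeRates
    WHERE CurrencyCode = @CurrencyCode
    ORDER BY RateDate DESC;
END;
```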

The modified UpdateRatesFromForex procedure now looks like this:
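A sketch, again with illustrative table and column names:

```sql
-- LightningWorks database: the procedure now calls the Forex stored
-- procedure through the synonym (which requires RPC Out on the linked
-- server) and reads the result from an output parameter.
ALTER PROCEDURE dbo.UpdateRatesFromForex
    @CurrencyCode CHAR(3)
AS
BEGIN
    DECLARE @Rate DECIMAL(18, 6);

    -- dbo.GetLatestExchangeRate is the synonym for the procedure in Forex.
    EXEC dbo.GetLatestExchangeRate @CurrencyCode, @Rate OUTPUT;

    UPDATE dbo.Rates
    SET    Rate = @Rate
    WHERE  CurrencyCode = @CurrencyCode;
END;
```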

After we deployed this change to QA, calling the UpdateRatesFromForex procedure succeeded. However, all connections to the linked server were made as sysadmin. This goes against the principle of least privilege, which states that you should have only the permissions you need to do your job. This is of particular relevance in industries such as financial services, where segregation of duties is paramount.

We decided that our next job was to reduce the permissions used by connections from the LightningWorks server to the Forex server. We did this by creating a new user in the Forex database, with an associated SQL login, GetLatestRates, and adding it to a new role, GetRates. The GetRates role only had permission to run the GetLatestExchangeRate procedure.
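These security objects can be sketched as follows (the password here is the default placeholder we mention below):

```sql
-- On the Forex server: a login, user and role that can do nothing
-- except execute the one procedure LightningWorks needs.
CREATE LOGIN GetLatestRates WITH PASSWORD = 'p@ssw0rd';
GO
USE Forex;
GO
CREATE USER GetLatestRates FOR LOGIN GetLatestRates;
CREATE ROLE GetRates;
ALTER ROLE GetRates ADD MEMBER GetLatestRates;
GRANT EXECUTE ON dbo.GetLatestExchangeRate TO GetRates;
```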

Our first attempt to deploy this to QA failed. This was because DLM Automation ignores user role membership by default, so the new user wasn’t part of the GetRates role in QA. To fix this, we had to remove the “-IgnoreUsersRolesAndPermissions” option from our call to DLM Automation.

With this change made, we deployed again to QA. The password for the GetLatestRates login was set to the default of ‘p@ssw0rd’, so we manually updated it to something more secure. We then set up the linked server to use this new user, which reduced security concerns around the dependency.

So far so good. We had it working manually, but the chance of a deployment failing due to human error was a concern. We now wanted to work out how to automate these changes, so we listed the changes we had to make:

  • Set a secure password for the GetLatestRates login
  • Create a linked server with the correct server name and database name on the server containing the LightningWorks database
  • Set the RPC Out option to ‘True’ for the linked server
  • Edit the linked server to use the GetLatestRates login when connecting to Forex
  • Update the synonym to use the correct server and database name in each environment

We then wrote simple scripts locally that would make these changes. We tested the scripts and made them part of our deployment process in Octopus Deploy. These scripts needed slight changes specific to each environment, so we used Octopus Deploy variables to set different values depending on the environment. These variables are referenced in the scripts using “#{VariableName}”.

On Forex, we ran the script:
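In sketch form, with an assumed Octopus variable name of ForexLoginPassword (our choice for this sketch, not from the original project):

```sql
-- Forex server: replace the default password with a per-environment
-- secret. #{ForexLoginPassword} is substituted by Octopus Deploy at
-- deployment time.
ALTER LOGIN GetLatestRates WITH PASSWORD = '#{ForexLoginPassword}';
```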

On LightningWorks, we ran the script:
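In sketch form, with assumed Octopus variable names (ForexServer, ForexLoginPassword) and a fixed linked server name so that call sites never change between environments:

```sql
-- LightningWorks server: recreate the linked server, its login mapping
-- and the synonym for the current environment. The Octopus variables
-- are substituted before the script runs.
IF EXISTS (SELECT 1 FROM sys.servers WHERE name = N'ForexServer')
    EXEC sp_dropserver @server = N'ForexServer', @droplogins = 'droplogins';

EXEC sp_addlinkedserver @server = N'ForexServer',
                        @srvproduct = N'',
                        @provider = N'SQLNCLI',
                        @datasrc = N'#{ForexServer}';
EXEC sp_serveroption @server = N'ForexServer',
                     @optname = N'rpc out', @optvalue = N'true';

-- Connect to Forex as the restricted GetLatestRates login.
EXEC sp_addlinkedsrvlogin @rmtsrvname = N'ForexServer',
                          @useself = 'false',
                          @rmtuser = N'GetLatestRates',
                          @rmtpassword = '#{ForexLoginPassword}';

-- Repoint the synonym at this environment's Forex server.
IF OBJECT_ID(N'dbo.GetLatestExchangeRate', N'SN') IS NOT NULL
    DROP SYNONYM dbo.GetLatestExchangeRate;
CREATE SYNONYM dbo.GetLatestExchangeRate
    FOR ForexServer.Forex.dbo.GetLatestExchangeRate;
```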

We deleted the manual changes we made to QA and ran a deployment to QA with the new scripts. This was successful. We finally deployed to production, where the app and databases worked as expected using the cross-database dependency.

Our solution of using linked servers and synonyms had worked. But having to make all these changes had confirmed our original thought: pulling out the dependency from the database into the application would have been a much better choice. There are a number of reasons for this:

  • We had to make changes to Forex. But this may not always be possible, especially if other applications rely on it
  • Adding a cross-database dependency from LightningWorks required changes to the users and permissions in Forex
  • The deployment process was brittle and would need updating every time we changed the synonyms or linked server users
  • We had to run our update scripts after the main deployment of the databases had finished

Had we not used synonyms, and instead used four-part identifiers, we could have removed the deployment step that updates the synonyms, making the process slightly less brittle. We now think synonyms and four-part identifiers are equally valid solutions, but we chose synonyms for the added flexibility.

Based on our investigation, we recommend that if you have a cross-database dependency that is causing a problem during deployment, you should try to pull out the dependency into the application. If this isn’t possible, then using linked servers is another option. But be aware that we were only able to do this because we had access to both databases in all environments. This meant we were free to make changes to users, stored procedures or any other problem points. If you don’t have this level of access, pulling out the dependency may be your only option.

If you have come across this problem before, or you think there is a better way to solve the problem, please leave a comment.

Tools in this post

DLM Automation

Continuous integration and automated deployments for your SQL Server database.
