
DevOps, Continuous Delivery & Database Lifecycle Management
Automated Deployment
To maintain a stable of SQL Server databases, you need to maintain, for each of those databases, an up-to-date build script, object-level source, database-level configuration and server-level configuration. This would form the basis of a Database Configuration Management Archive (DCMA) of SQL DDL, in source control. Such an archive could then serve several purposes, such as monitoring any configuration or code change, recording releases, incidents, problems, known errors, and changes, or underlying an automated deployment mechanism. If you have a database task to perform, it is good to have the right script to hand.
I will show a way of getting this underway, using PowerShell, Server Management Objects (SMO) and a third-party tool, SQL Compare Pro. In previous articles, I’ve already covered most of the basics, using purely PowerShell and SMO. However, SQL Compare has the great advantage of being able to make semantic comparisons between the scripts in source control and the live database, to alert on changes detected in the live database, and to update the source-controlled scripts appropriately. Its speed of working is a bonus.
Background
I want you to imagine that you are inheriting a database mess. You are now responsible for a number of instances of SQL Server; production, staging and development, and you can only guess at the state of database source control, ‘versioning’, or configuration management. Where databases are still being actively supported in-house, the database developers may have no detectable source control, use a source-control system you can’t access, or use a variety of source-control systems with an incomplete record of what is currently in the database. There probably exists a mass of third-party databases, abandoned work and databases whose purpose has been forgotten, due to staff changes. The third-party databases are, most likely, occasionally being updated remotely, sometimes without warning, and you only get to hear about it when something in your downstream reporting breaks. You want to know when things change on production servers.
First Steps
So what do you do? OK, you’re first likely to run sp_Blitz to get a grip on the scale of the work ahead, but at the same time you’ve also got to ensure that essential database documentation is in place and can be maintained in as painless a way as possible. You need to create a Database Configuration Management Archive.
As an immediate ‘backstop’, you need to place build-level, and object-level scripts neatly within one secure repository, using a source control system. This is not a development source-control system, but a reference to all the production systems, with a record of all changes and when they happened. It is a configuration and source management system. Ideally, releases from development will be reflected in the DCMA, but at this stage we’re not taking chances.
To make a start, you need to create a central archive where the current source is kept, along with build scripts and other configuration information. This makes it reasonably easy to commit changes appropriately, especially if you are made aware of those changes. The various source control systems require that each object be scripted separately, in a separate subdirectory for each type of object. A build script is also useful, and it is necessary to have the static data that is required to build a functioning database in a hurry. You need to keep this up-to-date so that all DDL changes in all the databases are recorded in the source control system that you use.
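To give a rough idea of the layout such an archive ends up with, the following sketch creates one subdirectory per type of object for a database. The root location and the subdirectory names here are illustrative assumptions; SQL Compare’s scripting creates its own set of folders, so treat this only as a picture of the structure.

```powershell
# Sketch of the per-object-type layout of a scripts directory.
# The root path and folder names are illustrative assumptions.
$root = Join-Path (Join-Path ([IO.Path]::GetTempPath()) 'CurrentDatabases') 'AdventureWorks'
'Tables', 'Views', 'Stored Procedures', 'Functions', 'Security' |
    ForEach-Object { $null = New-Item -ItemType Directory -Force -Path (Join-Path $root $_) }
(Get-ChildItem $root -Directory).Name # list the subdirectories just created
```

Each object then lives in its own .sql file inside the appropriate subdirectory, which is the granularity that source control systems need in order to track individual changes.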
Once this has been put in place, you can then refine the system, disentangle the knots, and categorize the scripts by the separate applications and projects. However, for configuration management, you’ll probably still want a categorization based on the server, since security requirements and configuration are likely to vary between servers. You are also faced with the fact that certain resources, such as SQL Agent, are server-based and shared between databases and applications, so you will always need server-level source and configuration management.
A lot of this is possible with SMO alone, but it would require that every scheduled execution of the script-generation process would result in a complete new set of scripts, whether they’d changed or not. We will need to supercharge the process using SQL Compare Pro, since it takes on a daunting amount of the processing. Together, SMO and SQL Compare Pro will cover most bases, along with quite a bit of PowerShell scripting.
Using a third-party tool
In a series of articles on Simple-Talk, I’ve illustrated how to use PowerShell and SMO together to script out everything that is necessary to store in a Database Configuration Management Archive (DCMA), and so I won’t repeat it here; see the references at the end of the article.
I use SQL Compare Pro as well, because it is faster, does full semantic comparisons, and generates synchronization scripts. Because earlier versions of the tool only compared live databases, it isn’t so well known that SQL Compare Pro is just as happy finding the differences between script directories, SQL backups or ‘SQL Compare Snapshots’ (not the same as SQL Server snapshots). This means that you don’t even need to access live production servers, but can merely use backups to check for DDL changes. SQL Compare Pro can also take instructions from an XML command file, or from a project file, which comes in very handy when your repository becomes more complex because you can specify all the details of the process on a per-database level. Once you have this system in place, it is relatively easy to enhance it to meet your particular requirements.
Using a command-line tool with PowerShell
There really isn’t a problem in using a DOS command-line file in PowerShell. You can, in fact, use the command tail (args) just as you would in a batch file. However, I would advise you to use the ‘splat’ technique instead because you can lay it out with plenty of comments, and there is less chance of the command tail being misinterpreted by the executable file.
Here is a simple example of using SQL Compare Pro with the ‘splat’ technique. If you run this code in PowerShell, setting the appropriate path to the SQL Compare executable, as well as instance, database and directory names, then SQL Compare will script out the referenced database into the appropriate directory, creating subdirectories for each type of database object.
& 'C:\Program Files (x86)\Red Gate\SQL Compare 10\sqlcompare.exe' @(
    '/server1:MyServerInstance',                  #The SQL Server instance
    '/database1:AdventureWorks',                  #The database to script out
    '/Makescripts:D:\MyDatabases\AdventureWorks') #and the directory to put it in
SQL Compare actually allows you to have your parameters within an XML file that it reads in. It is a good safe way of doing it, but it is unlikely you’d need to do that at this stage of automation.
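As a taste of what that looks like, the sketch below writes an XML argument file whose elements mirror the command-line switches, and shows (commented out) how it would be passed with the /Argfile switch. The exact element layout is an assumption here; check the SQL Compare command-line documentation for the schema your version expects.

```powershell
# Write a sketch of an XML argument file for SQL Compare.
# Element names mirroring the switches is an assumption; verify against the docs.
$argFile = Join-Path ([IO.Path]::GetTempPath()) 'AdventureWorksCompare.xml'
@'
<?xml version="1.0"?>
<commandline>
  <server1>MyServerInstance</server1>
  <database1>AdventureWorks</database1>
  <makescripts>D:\MyDatabases\AdventureWorks</makescripts>
</commandline>
'@ | Set-Content -Path $argFile
# It would then be run with something like:
# & 'C:\Program Files (x86)\Red Gate\SQL Compare 10\sqlcompare.exe' "/Argfile:$argFile"
```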
Creating and maintaining a scripts directory
First, we need to create a new version control repository, which will be the home for our Database Configuration Management Archive. I use GitHub, but most source control systems work in a similar way. I created a new git repo called CurrentDatabases, located at D:\CurrentDatabases. We’ll script out AdventureWorks, which is on a SQL Server instance called MySQLServerInstance, and which uses Windows authentication.
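For readers unfamiliar with driving git from PowerShell, creating the repository amounts to something like the sketch below. It assumes git is on the path, and uses a temporary directory rather than D:\CurrentDatabases so it runs anywhere; the commit identity is a placeholder.

```powershell
# Minimal sketch: create the repository that will hold the DCMA.
# Assumes git is installed; the path and commit identity are placeholders.
$repo = Join-Path ([IO.Path]::GetTempPath()) 'CurrentDatabases'
$null = New-Item -ItemType Directory -Force -Path $repo
Set-Location $repo
git init | Out-Null
Set-Content -Path (Join-Path $repo 'README.md') -Value 'Database Configuration Management Archive'
git add README.md
git -c user.name='DCMA' -c user.email='dcma@example.com' commit -m 'Initial commit' | Out-Null
git log --oneline # show the initial commit
```

Thereafter, each scripted-out database goes into its own subdirectory of the repo, and each scheduled run commits whatever has changed.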
At the start, we’ll need SQL Compare’s ability to create, from a database, a script directory that contains object-level scripts for all the objects, tables, views, functions and so on.
Making the initial object-level scripts
$SQLCompare = 'C:\Program Files (x86)\Red Gate\SQL Compare 10\sqlcompare.exe' # full path
$MyServerInstance = 'MySQLServerInstance'           #The SQL Server instance
$MyDatabase = 'AdventureWorks'                      #The name of the database
$MyDatabasePath = "D:\CurrentDatabases\$MyDatabase" #Where to put the scripts
#------
$AllArgs = @("/server1:$MyServerInstance",
             "/database1:$MyDatabase",
             "/Makescripts:$MyDatabasePath")
& $SQLCompare $AllArgs
Now we have the directory with all the individual object scripts in place, we can then compare this with the live system.
Synchronizing object-level scripts with a database
This command will update any object scripts that have changed. Make a small modification to one of the AdventureWorks objects and run the following command; SQL Compare will update the appropriate DDL script in our DCMA, but at this stage it won’t alert you that a change has happened.
$SQLCompare = 'C:\Program Files (x86)\Red Gate\SQL Compare 10\sqlcompare.exe' # full path
$MyServerInstance = 'MySQLServerInstance'           #The SQL Server instance
$MyDatabase = 'AdventureWorks'                      #The name of the database
$MyDatabasePath = "D:\CurrentDatabases\$MyDatabase" #Where the scripts are
$AllArgs = @("/server1:$MyServerInstance",
             "/database1:$MyDatabase",
             "/scripts2:$MyDatabasePath",
             '/synchronize', '/quiet')
& $SQLCompare $AllArgs
We like to have the individual object scripts, but would like the entire build script as well, so that we can create a new copy of the database easily.
Creating a single build script
We do this by comparing the database to a blank database on the same server (which is what the model database really is) and producing a script that, if executed within a new database, creates the entire database.
$SQLCompare = 'C:\Program Files (x86)\Red Gate\SQL Compare 10\sqlcompare.exe' # full path
$MyServerInstance = 'MySQLServerInstance'           #The SQL Server instance
$MyDatabase = 'AdventureWorks'                      #The name of the database
$MyDatabasePath = "D:\CurrentDatabases\$MyDatabase" #Where to put the script
$AllArgs = @("/server1:$MyServerInstance",
             "/database1:$MyDatabase",
             "/server2:$MyServerInstance",
             "/database2:model",
             "/scriptfile:$MyDatabasePath\$MyDatabase.sql",
             '/quiet')
if (Test-Path "$MyDatabasePath\$MyDatabase.sql")
    { Remove-Item "$MyDatabasePath\$MyDatabase.sql" }
& $SQLCompare $AllArgs #Create a complete build script
Comparing a live database with a script directory
If you wish to check a database against a script directory that has been created by SQL Compare, you can do so as follows. This merely tells you whether there have been changes.
$SQLCompare = 'C:\Program Files (x86)\Red Gate\SQL Compare 11\sqlcompare.exe' # full path
$MyServerInstance = 'MyServerInstance'              #The SQL Server instance
$MyDatabase = 'AdventureWorks'                      #The name of the database
$MyDatabasePath = "E:\MyScriptsDirectory\$MyDatabase" #Where the scripts are
$AllArgs = @("/server1:$MyServerInstance",
             "/database1:$MyDatabase",
             "/scripts2:$MyDatabasePath",
             '/assertidentical')
& $SQLCompare $AllArgs
if ($?) { 'Databases the same' }
elseif ($LASTEXITCODE -eq 79) { 'Databases not identical' }
else { 'we had an error!' }
If it detects changes it produces a return code of 79.
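The exit-code handling pattern is easy to try without SQL Compare installed. In this sketch, a child PowerShell process stands in for sqlcompare.exe and exits with code 79 (the stand-in is purely for illustration; substitute the real executable and arguments):

```powershell
# Demonstrate the $?/$LASTEXITCODE pattern with a stand-in for sqlcompare.exe.
# The child PowerShell process simply exits with 79, as SQL Compare does when
# /assertidentical finds differences.
$exe = (Get-Process -Id $PID).Path # path of the current PowerShell host
& $exe -NoProfile -Command 'exit 79'
if ($?) { 'Databases the same' }
elseif ($LASTEXITCODE -eq 79) { 'Databases not identical' }
else { "Unexpected error code $LASTEXITCODE" }
```

Note that `$?` is only true when the native command exits with code 0, so the identical/not-identical/error branches are cleanly separated.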
Updating a script directory from a live database (updating only the changed files)
Lastly, you might want to synchronize the database with what is in the repository and produce a report of changes, overwriting the current report.
$SQLCompare = 'C:\Program Files (x86)\Red Gate\SQL Compare 10\sqlcompare.exe' # full path
$MyServerInstance = 'MySQLServerInstance'           #The SQL Server instance
$MyDatabase = 'AdventureWorks'                      #The name of the database
$MyDatabasePath = "D:\CurrentDatabases\$MyDatabase" #Where the scripts are
$AllArgs = @("/server1:$MyServerInstance",
             "/database1:$MyDatabase",
             "/scripts2:$MyDatabasePath",
             '/q', '/synch',
             "/report:$($MyDatabasePath).html",
             "/reportType:Simple", "/rad", "/force")
& $SQLCompare $AllArgs
Of course, if you were maintaining a single build script as well as object-level scripts, you’d have to regenerate that too, by overwriting it, as described above in the Creating a single build script section.
There is a great deal more you could do, of course, using SQL Compare, and/or PowerShell. Just by tweaking the parameters to SQL Compare, you can, for example, compare to uncompressed backups rather than the live system, create SQL Compare Snapshots, or save the ‘project’ settings from the SQL Compare User-Interface, and run SQL Compare from the project, specifying it in the command line.
Once we are scripting this in PowerShell we can then add functionality. As part of this process, I like to script out the server scripts and the agent scripts, as I explained here in Database Deployment: The Bits – Agent Jobs and Other Server Objects. If you are maintaining a test cell, for example, you might choose to keep all the test databases in line with the current source version, using SQL Compare. Here, we are more interested in aligning the version in the repository with what is in the live database.
With these basic operations we have the start of a rudimentary Database Configuration Management System.
The Script
The script will work through a list of servers and script all the databases into directories, excluding the system databases and any others we don’t want scripted, as specified in a $DatabaseExclusionList. If a database has yet to have its initial script placed in the archive, the script creates it. Otherwise, it checks by comparing what is already in the scripts with what is in the database. If they are the same, then there is nothing to do beyond logging that the check was done, along with the date and time; otherwise it does whatever is required when a change is detected. As an example, the script updates what is in the DCMA source archive, which is probably going to be necessary anyway. It also creates an HTML report of the changes detected which, in a live system, would be emailed. (I give a reference on a way to do this at the end of the article.)
Firstly, you need to have a list of servers with which the script will work. I like to maintain my servers in SSMS as either local server groups, or Central Management servers, if I need to share the list with a team. The way of getting at them is pretty similar. However, there is no harm, if it suits you, in having a list of servers as a PowerShell array, or just in a simple ASCII file, preferably stored in source control, and read at run-time by the PowerShell script. With PowerShell, you are in control.
In this example, I’ll use the servers registered in my Local Server Groups. To make things more complicated, I’ve set these up in groups for a number of purposes such as the backup regime, so the actual servers are duplicated. I’ll be de-duplicating them within the scripts because we only want to do them once.
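The de-duplication itself is a one-liner in PowerShell. Here is a self-contained sketch with hypothetical server names standing in for what the registered server groups would yield:

```powershell
# A duplicated server list, as registered server groups can yield when the
# same server appears in several groups (the names are hypothetical).
$registeredServers = 'ServerA\SQL1', 'ServerB', 'ServerA\SQL1', 'ServerC', 'ServerB'
$uniqueServers = $registeredServers | Sort-Object | Get-Unique
$uniqueServers # each server now appears exactly once
```

Get-Unique only removes adjacent duplicates, which is why the Sort-Object must come first; the same pipeline appears in the full script below against the SQLSERVER: registration provider.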
This script would be on the Windows Scheduler and would run at a time interval that suits your circumstances.
<# Before you run this script, you need to set all these variables to
   something meaningful for your system. #>
#this should be the full path to your copy of SQL Compare
$SQLCompare = 'C:\Program Files (x86)\Red Gate\SQL Compare 11\sqlcompare.exe' # full path
#needed if you use SQL Server authentication
$DefaultUserName = 'JohnDoe'; # the default name for any SQL Server authentication
#this will be where you put all the source code for all the databases
$PathToScripts = 'D:\CurrentDatabases' # base of config management file structure
#the list of all the databases you DON'T want to process
$DatabaseExclusionList = ('master', 'model', 'msdb', 'tempdb') # any databases we don't want
#the list of all the servers you don't want to do
$ServerExclusionList = ('') # any servers we don't want
#you may use a different language
$messagePrompt = 'Please can we have the username and password for'
#
Trap {
  # Handle the error
  $err = $_.Exception
  write-error $err.Message
  while ($err.InnerException)
  {
    $err = $err.InnerException
    write-error $err.Message
  };
  # End the script.
  break
}
set-psdebug -strict
$ErrorActionPreference = "stop"
# load the SQLPS module if it isn't already loaded
If (!(Test-Path SQLSERVER:)) { Import-Module "sqlps" -DisableNameChecking }
If (!(Test-Path SQLSERVER:)) { throw "Cannot load SQLPS" }
If (!(Test-Path $SQLCompare)) { throw "Cannot find the specified version of SQL Compare" };
If (!(Test-Path $PathToScripts))
{
  $null = New-Item -ItemType Directory -Force -path $PathToScripts #maybe create path
};
#now we get all the servers/instances out of the database server group
#in a working system this would probably be a server list from the CMS
get-childitem 'SQLSERVER:\sqlregistration\Database Engine Server Group' -recurse |
  where { $_.GetType() -notlike '*ServerGroup*' } | #exclude directories!
  % { $_.ServerName } | sort-object | get-unique | #you may have aliases
  foreach-object {
    #for each server, we make a connection
    #we need this to get the list of databases to do
    $conn = new-object Microsoft.SqlServer.Management.Common.ServerConnection $_
    #if this was windows authentication, then the connection is actually made
    #there is an SMO 'GetConnectionObject' method on the objects returned by
    #get-childitem, but I haven't figured out how to use it AND make servername unique
    $RGuser = $null; $RGPassword = $null #set these back to null since we re-use them
    if ($conn.TrueName -eq $null) #then the automatic connection wasn't made
    { #if so we require a SQL Server login
      $theCredentials = Get-Credential -UserName "$DefaultUserName" -Message "$messagePrompt $_"
      if ($theCredentials -ne $null)
      { #unless he cancelled
        $conn = new-object Microsoft.SqlServer.Management.Common.ServerConnection ($_, $theCredentials.UserName, $theCredentials.password)
        $RGuser = "/username1:$($theCredentials.UserName)" #we need this for the application
        $RGPassword = "/password1:$($theCredentials.GetNetworkCredential().Password)"
      }
    }
    $srv = New-Object Microsoft.SqlServer.Management.SMO.Server $conn
    $TrueServerName = $conn.TrueName #this will be null if the connection still hasn't been made
    $TheServerInstance = $conn.ServerInstance
    if ($TrueServerName -ne $null -and $ServerExclusionList -notcontains $TrueServerName)
    #if we definitely have a connection and it is a server we really want to 'sourcify'
    {
      $operation = 'did something to'; #for reporting purposes
      $InstancePath = "$PathToScripts\$($TrueServerName -replace '[\\\/\:\.]', '-')\"
      # we put it in a directory using the true server name
      $srv.databases.name |
        Where-Object { $DatabaseExclusionList -notcontains $_ } | #skip any database in the exclusion list
        foreach-object { # for each database, barring exclusions
          $DatabasePath = "$($InstancePath)$($_ -replace '[\\\/\:\.]', '-')" #make a legal pathname
          $DatabaseName = $_ # $_ is a bit fleeting so remember it!
          # place them in directories under each server
          # each database has its own directory
          if (Test-Path $DatabasePath) #then compare with the database (update file only if a change)
          {
            "Checking for changes at $(Get-Date -Format F)" >> "$($DatabasePath).log"
            $operation = 'Checked for changes in';
            $AllArgs = @("/server1:$TheServerInstance", "/database1:$_",
              "/scripts2:$DatabasePath", '/q', '/assertidentical')
            #SQL Compare returns exit code 0 if the objects compared are identical, 79 if not
          }
          else #just script it out, creating the folder
          {
            $null = New-Item -ItemType Directory -Force -path $DatabasePath #create path if it doesn't exist
            "Writing First Archive at $(Get-Date -Format F)" >> "$($DatabasePath).log" #write to the log
            $operation = "Created a new archive in $DatabasePath for";
            $AllArgs = @("/server1:$TheServerInstance", "/database1:$_", "/Makescripts:$DatabasePath")
          }
          if ($RGuser -ne $null) { $AllArgs += $RGuser } #if a different user was specified
          if ($RGPassword -ne $null) { $AllArgs += $RGPassword } #if a password was specified
          "$operation $DatabaseName on $TrueServerName"
          & $SQLCompare $AllArgs >> "$($DatabasePath).log"
          if ($?)
          {
            'Operation successful' >> "$($DatabasePath).log" #write to the log
            "Successfully $operation $($_) at $(Get-Date -Format F)" >> "$($InstancePath)Compare.log"
          }
          else
          { #Hmm. An error code was passed back. Should we look at it?
            $TheExitCode = $LASTEXITCODE
            switch ($LASTEXITCODE)
            { #what's the fuss, mate?
              1 { $e = 'General error code' }
              3 { $e = 'Illegal argument duplication' } # Some arguments must not appear more than once in a command line
              8 { $e = 'Unsatisfied argument dependency' } <# There is an unsatisfied argument dependency
                  or violated exclusion when the command line is run. For example:
                  /arg2 depends on /arg1 but you have specified /arg2 without specifying /arg1
                  /arg2 can't be used with /arg1 but you have used both #>
              32 { $e = 'Value out of range' } # The numeric value supplied for an argument is outside the range of valid values
              33 { $e = 'Value overflow' } # The value supplied for an argument is too large
              34 { $e = 'Invalid value' } # The value supplied for an argument is invalid
              35 { $e = 'Invalid license' } # Software license or trial period has expired
              61 { $e = 'Deployment warnings' } <# SQL Compare encountered serious warnings that apply to
                  the deployment. If you're using SQL Compare, you can ignore these warnings by specifying
                  /AbortOnWarnings:None. If you're using SQLCI, you can ignore these warnings by specifying
                  /additionalCompareArgs=/AbortOnWarnings:None. For more information about /AbortOnWarnings,
                  see Switches used in the command line. #>
              62 { $e = 'High level parser error' } <# SQL Compare encountered high level errors when parsing
                  a scripts folder. Use /IgnoreParserErrors to force SQL Compare to continue without exiting.
                  The Error Parsing Scripts dialog box in the SQL Compare user interface provides additional
                  information to help you resolve script parser errors. #>
              63 { $e = 'Databases identical' } <# The databases being compared are identical or no objects
                  have been included. To suppress this error use /Include:Identical #>
              64 { $e = 'Command line usage error' } <# The command line was used incorrectly. For example,
                  an incorrect flag, or incorrect syntax may have been used. #>
              65 { $e = 'Data error' } # Data required by SQL Compare is invalid or corrupt
              69 { $e = 'Resource unavailable' } # A resource or service required to run SQL Compare is unavailable
              70 { $e = 'An unhandled exception occurred' } # See the log for more details
              73 { $e = 'Failed to create report' } # The report was not created
              74 { $e = 'I/O error' } <# This is returned if SQL Compare attempts to write to a file that
                  already exists, and the /force switch has not been set. #>
              77 { $e = 'Insufficient permission' } # The action can't be completed because the user does not have the necessary permission
              79 { $e = 'Databases not identical'; 'Changes detected' >> "$($DatabasePath).log"
                  <# This is returned when the /assertidentical switch is used and the source and target are
                  not identical. If we have compared the database with the version in the archive and there
                  are changes, then we need to do something, and this will depend on circumstances. You'd
                  likely want to send out an email here, and redo the single build script if you are
                  maintaining one. #>
                  # we choose just to update the source code, and generate a report showing the differences
                  "Synchronising source with database at $(Get-Date -Format F)!" >> "$($DatabasePath).log"
                  $AllArgs = @("/server1:$TheServerInstance", "/database1:$DatabaseName",
                    "/scripts2:$DatabasePath", '/q', '/synch',
                    "/report:$($DatabasePath).html", "/reportType:Simple", "/rad", "/force")
                  if ($RGuser -ne $null) { $AllArgs += $RGuser } #if a different user was specified
                  if ($RGPassword -ne $null) { $AllArgs += $RGPassword } #if a password was specified
                  "Synchronized $DatabaseName on $TheServerInstance"
                  & $SQLCompare $AllArgs >> "$($DatabasePath).log"
                }
              126 { $e = 'SQL Server error' } # Execution failed because of an error
              130 { $e = 'Ctrl-Break' } # Execution stopped because of a Ctrl-Break
              400 { $e = 'Bad request' } <# The command line arguments can't be executed. For example,
                  you may have provided two mutually exclusive switches. #>
              402 { $e = 'Not licensed' } <# There is no acceptable license. If you have one, use
                  /activateSerial:<SerialKey>. If you don't have a license, please contact
                  sales@red-gate.com for either a trial extension key or to purchase a license. #>
              499 { $e = 'Activation cancelled by user' } # The cancel button was pressed during activation
              500 { $e = 'Unhandled exception' }
              Default { $e = "application returned unknown error code: '$LASTEXITCODE'" }
            }
            if ($TheExitCode -ne 79)
            {
              "SQL Compare error $LASTEXITCODE : $e at $(Get-Date -Format F)" >> "$($DatabasePath).log"
              "SQL Compare error $LASTEXITCODE : $e accessing $($_) at $(Get-Date -Format F)" >> "$($InstancePath)Compare.log"
            }
          }
        }
    }
  }
I’ve tried to keep this script reasonably simple for publication, so as not to provide too many distractions from the principles of the operation. There are complications that you might need to code for, such as specifying a case-sensitive comparison for databases that are case-sensitive. You might also need to maintain exclusion lists or whitelists for each server. There is a great variety of ways of reporting the progress of the scripts, and I’ve just provided a sample. I like to get an alert when databases have changed, but it is easy, I hope, to see where to insert this. Another thing that is easy to add is the regeneration of the entire build script, as well as the object-level sources, for a new entry in the repository, updating it whenever there is a change.
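One of those complications, per-server exclusion lists, only needs a hashtable keyed by server name in place of the single $DatabaseExclusionList. The server and database names below are hypothetical:

```powershell
# Sketch: per-server database exclusion lists held in a hashtable.
# Server and database names are hypothetical placeholders.
$ExclusionsByServer = @{
    'ProdServer1' = @('master', 'model', 'msdb', 'tempdb', 'ThirdPartyAudit')
    'StagingSrv'  = @('master', 'model', 'msdb', 'tempdb')
}
# inside the server loop, look up the list for the current server:
$TrueServerName = 'ProdServer1'
$DatabaseExclusionList = $ExclusionsByServer[$TrueServerName]
$DatabaseExclusionList -contains 'ThirdPartyAudit' # excluded on this server only
```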
Conclusions
In the past, I’ve always found it a chore to maintain a repository of all the source code for all the databases in my care, as part of a Database Configuration Management System. It meant much boring repetitive work with SQL Server Management Studio (or Enterprise Manager before then) or using VBA scripts with DMO/SMO via COM automation. PowerShell makes it all so much easier.
From the perspective of the DBA, a configuration repository is rather different from development source control, and has a broader perspective on the server. It understands the setting and context of the database, all the different security settings and permissions, and the role of each installed database in the database lifecycle (e.g. test, integration, staging and production).
With many databases being outside the realm of the developers, provided as turnkey systems by third parties, from other distant parts of the organization, or from long gone development teams, we need a separate resource, though with full communication with the development teams, where possible. A great deal can be achieved purely with PowerShell and SMO but for the full works, including alerting on metadata changes, and comparison to backups rather than live servers, I like to use a database metadata comparison tool, such as SQL Compare Pro.
When work needs to be done, especially in a hurry, it is good to be able to get one’s hands quickly on the build scripts for the code, to understand what is going on with a database and its settings. If integrated with a source control system, it is even better to be able to see what has changed over time. For the various processes of database release and deployment, it is very handy to keep permissions separate from code, to fit better with the differing requirements of the different environments of database servers, and a repository like this is a great place to keep the scripts that do this.
References
For background on a database CMS, see
- Database Configuration Management for SQL Server by William Brewer.
For details of some of the routines you might wish to add to your system see
- Database Deployment: The Bits – Agent Jobs and Other Server Objects for the way of scripting out some of the more important server objects
- Database Deployment: The Bits – Database Version Drift for checking for database metadata changes without requiring a database schema comparison tool such as SQL Compare.
- Registered Servers and Central Management Server Stores for more on using these for storing lists of servers.
- Building a Daily Systems Report Email With PowerShell for an explanation of how to send an email report via PowerShell
Some Jargon
- Database Configuration Management Archive
- A central place where a team can access essential materials and scripts, whether SQL, PowerShell, XML, lists or documentation, for managing databases at various parts of the database lifecycle. It should be able to track changes to any script and have access control.
- Database build script
- This builds an entire database from one script, allowing the build to take place in dependency order using one batch, and giving the potential to wrap it in a transaction.
- Database Object-level script
- This can refer to a single script for a database object such as a table, procedure or view but also refers to a directory of such scripts, with each type contained in separate subdirectories
- Object-level script Directory
- A file directory, containing subdirectories for each type of object-level script
- Database Configuration Management
- The art of maintaining and keeping current the copies of an application database for various purposes such as development, staging, production, and testing. This may involve assisting the developers to perform rapid releasing and deployment, recording releases, incidents, problems and known errors, ensuring that databases adhere to the agreed production control procedures, detecting unauthorized changes, and ensuring rapid recovery from disasters.
- Database-level configuration
- This will include the scripts for setting all the database-level configuration items such as collation, as well as server-specific variables for the database such as the drives in which to place logs or data
- Server-level configuration
- This will have the script for the server settings and the shared server-level resources such as the SQL Agent and Logins
- Semantic source comparisons
- One can compare text and detect changes in the text of the source. Semantic comparisons will only register a change if it is a meaningful change that alters the resulting object.
- Versioning
- This means adding the discipline of attaching unique IDs to a version of a system so that the correct level of all the components can be included in the build of an application. Database versioning means attaching an ID to a database that is present, and readable, in code and accessible by querying the live system.
- Database Synchronization
- This really means making one database, or script directory, like another version of the same database, or script directory. Although the changes can flow either way, there seem to be no systems that flow both ways at once with mutual sharing of new features.
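To make the ‘semantic source comparisons’ entry above concrete, here is a crude PowerShell illustration. The whitespace-and-case normalization is a toy stand-in for what a real tool does by parsing the DDL, and the view definitions are made up:

```powershell
# Two versions of the same view: differing only in case and whitespace.
$v1 = 'CREATE VIEW dbo.ActiveOrders AS SELECT OrderID FROM dbo.Orders WHERE Status = 1'
$v2 = "create view dbo.ActiveOrders`r`n  as select OrderID from dbo.Orders where Status = 1"
$textuallyEqual = ($v1 -ceq $v2) # a text diff flags these as different
# toy 'semantic' normalization: collapse whitespace and lower-case everything
$normalize = { param($s) ($s -replace '\s+', ' ').ToLowerInvariant().Trim() }
$semanticallyEqual = ((& $normalize $v1) -eq (& $normalize $v2))
"Textually equal: $textuallyEqual; semantically equal: $semanticallyEqual"
```

A genuine semantic comparison parses the SQL rather than normalizing strings, so it also ignores comments, optional schema prefixes and the like; this sketch merely shows why the distinction matters for avoiding false change alerts.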
