As working patterns changed dramatically in 2020, we have seen an increasing number of customers and community members shifting components of their environments into the cloud.
The 2020 State of Database Monitoring Report began collecting responses in April 2020, in the midst of the global COVID-19 pandemic. Respondents reported an increased use of cloud platforms, but also indicated that migrating to and monitoring cloud environments are the most difficult challenges they face over the next year.
Whether you have implemented database DevOps fully or are just starting to think about how DevOps might improve your database development and delivery, it is essential to think through what can easily be transformed when moving to the cloud, and what is best left to be “lifted and shifted.”
Here are some essential practices to consider when planning a cloud migration strategy.
Don’t lift-and-shift legacy development database patterns by default
One of the biggest pitfalls with database development is having legacy shared database development environments, which are difficult to refresh and prone to causing errors in development. As I wrote in the post, “Future Proofing: Database DevOps Now and In Two Years,” it’s highly desirable to move toward a model where developers can provision databases on demand for short-term use, following an infrastructure-as-code pattern.
A couple of factors are also relevant here:
- It’s often easier to experiment with and modernize development environments (compared to production)
- Cloud migration projects often have a time period where both older and newer development environments may coexist, allowing developers to validate that the new environment will work for them without downtime
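As a minimal sketch of that on-demand, infrastructure-as-code pattern: a developer’s short-lived database can be declared in a compose file, provisioned with `docker compose up -d`, and discarded with `docker compose down` when the branch is done. The image tag, port, and password below are illustrative, and any containerized database engine works the same way.

```yaml
# docker-compose.yml -- a disposable, per-developer SQL Server instance.
# All values here are illustrative; use your own image tag and a proper
# secrets mechanism rather than an inline password.
services:
  dev-db:
    image: mcr.microsoft.com/mssql/server:2019-latest
    environment:
      ACCEPT_EULA: "Y"
      MSSQL_SA_PASSWORD: "ChangeMe_Dev0nly"   # dev-only throwaway credential
    ports:
      - "1433:1433"
    # No volume is mounted: tearing the container down discards the
    # database, which is exactly the short-term behavior we want here.
```

Because the definition lives in source control alongside the application, every developer gets an identical, refreshable environment instead of sharing one long-lived server.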
We hear from many of our customers who are planning cloud migrations that they plan to take advantage of more automation in the cloud for scalability. This is a valuable area in which to explore changes, no matter which cloud or clouds you use.
Use version control to your advantage
If your database code is already in a Version Control System (VCS), that centralized “source of truth” for code changes will save you a lot of headaches during a cloud migration. In this case, the main things to consider are whether you plan to migrate the VCS itself, and the network access needed from all of your environments.
For databases which do not have their source code in a VCS, you have some additional risks and considerations:
- Migrating development environments often leads to confusion about which environment is in use and which code has or should be shipped to production. It is not uncommon for accidental use of an old development environment to cause problematic deployments.
- Migrating production environments often leads to changes in performance due to differences in hardware, infrastructure, and other changes. For critical systems, this often requires hotfixes shortly after a migration — without the use of source control it can be difficult to track these changes and related issues. In the worst cases, it is hard to know what changed after migration and why.
You can mitigate these risks with a couple of actions:
Automate regular daily snapshots of your production database code and store these snapshots in a Version Control System. While this doesn’t provide you with the full benefits of using a VCS to drive development and delivery, it can be invaluable in documenting what the state of the code was up until migration and then in the days following the migration. This can save you many expensive hours when troubleshooting incidents. (Note: this can also be a good initial step towards using a VCS for your database code.)
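As a sketch of that first action: assuming a scheduled job has already scripted each database object to a `.sql` file in a Git working copy (the scripting tool itself is not shown, and `snapshot_schema` is a hypothetical helper name), the commit step can be as small as:

```shell
# snapshot_schema DIR
# Commits any changed .sql files in DIR (a Git working copy) as a dated
# snapshot. DIR is assumed to be refreshed nightly by your schema-scripting
# tool; that tool is not shown here.
snapshot_schema() {
    dir=$1
    stamp=$(date +%Y-%m-%d)
    (
      cd "$dir" || return 1
      git add -A
      # Commit only when the scripted schema actually changed, so the
      # history stays a clean record of real schema drift.
      if ! git diff --cached --quiet; then
          git commit -q -m "Production schema snapshot $stamp"
      fi
    )
}
```

Run from a nightly scheduler, this leaves you with a dated, diffable history of production code in the days before and after the migration.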
Perform database cutovers in a way that does not allow further activity on the “old” databases. For production databases, this ensures that no more data can be written to the database following a migration, which is often essential. For non-production databases, this ensures that anyone who missed an email doesn’t continue to use development systems that risk drifting from production.
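For SQL Server specifically, one way to enforce that cutover on the legacy side is to block writes as the final migration step. This is a sketch, and the database names are illustrative:

```sql
-- Final cutover step on the legacy server: block all further writes.
-- WITH ROLLBACK IMMEDIATE disconnects in-flight sessions, so schedule
-- this inside the agreed migration window.
ALTER DATABASE [Sales] SET READ_ONLY WITH ROLLBACK IMMEDIATE;

-- For retired development databases, taking them fully offline stops
-- accidental use altogether:
ALTER DATABASE [Sales_Dev] SET OFFLINE WITH ROLLBACK IMMEDIATE;
```

Read-only keeps the old production data queryable for post-migration troubleshooting, while offline makes it unambiguous that a development environment is retired.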
Identify the right tooling to support you wherever you move
Some good news here: the tooling you currently use may work with more cloud environments than you think.
For example, we at Redgate haven’t historically marketed ourselves as a “cloud tools” company in a significant way, but our productivity and Database DevOps tooling natively has excellent compatibility with Azure, AWS, and Google Cloud for both PaaS and IaaS databases.
Our teams at Redgate have always wanted our DevOps tooling to be portable, so our CI/CD functionality is built to be deployable via scripts in the environment of your choice. This means it can be run by a build server in Google Cloud just as well as it can be run on-premises, in Azure, or in AWS. We continue to invest in innovations to improve portability, such as the ability to run SQL Compare on Linux via a Docker container.
Similarly, our productivity tooling is built to work in the same client IDEs which developers and DBAs use in any environment.
In other words, it’s no longer necessary to shop specifically for “cloud tooling.” Cloud and hybrid environments are now so ubiquitous that you can speak with your favorite vendors about their compatibility and support for the clouds you plan to use.
Want to learn more?
- Redgate Advocate Grant Fritchey has published a host of cloud content to the Redgate Advocate Playlist on YouTube
- Watch “The latest Database DevOps techniques in AWS” to see the latest and greatest tech from Redgate and Octopus Deploy
- The webinar “Monitoring in the Cloud: Managing Migrations and Performance” talks more about migrations and focuses on monitoring your estate