The Two Ways Containers Will Revolutionize Database DevOps


Table of Contents

  • Revolution 1 – Database Provisioning as a Service
  • Revolution 2 – Portable Database DevOps Automation Tooling for CI/CD
  • What will the Results be for CIOs and Shareholders?

Revolution 1 – Database Provisioning as a Service

Containers will make good on a dream that engineers have had for a long time when it comes to working with databases: developers want the ability to easily and quickly instantiate writeable databases for experimentation and validation on-demand, then dispose of the databases when they are no longer needed.

This isn’t a new idea, so you might wonder…

Why didn’t this happen earlier with SAN automation and virtualization?

I know two things to be true:

  1. I’ve been hearing about self-service database provisioning for at least 10 years from SAN vendors and virtualization sales teams
  2. Both medium-sized businesses and enterprises spend a lot of money on fancy SAN storage and virtualization

Yet hardly anyone has true self-service database provisioning. The dream was not described as “create a ticket and maybe you get something after a couple of weeks.” And yet, that’s what most people have, or worse: they are expected to use the same old stale shared development environments because they can’t even get something in a couple of weeks.

My observations lead me to believe that the primary reason SAN and virtualization technology weren’t able to deliver the promise of self-service environments is the strong silos between IT and development groups. SAN and virtualization tech are almost exclusively sold to IT groups. In many enterprises these still function largely as cost centers, where purchases are about fulfilling the basic needs of customers within the organization. These needs are typically the tech version of the “base” of Maslow’s hierarchy of needs: the minimum is implemented to keep teams alive.

Note that I didn’t write that the minimum was purchased.

That’s the sad bit about this. When I talk to folks, I often find that their organizations have licensed and implemented SAN and virtualization tech capable of all manner of self-service scenarios; however, it’s not available for development teams to use due to existing processes, red tape, and resistance to change.

The cost of implementing self-service technology isn’t simply the tech itself, after all. It also takes a significant amount of time and, most importantly, influence to modify entrenched patterns and overcome the human resistance to change.

 

Why do containers change this? They aren’t inherently stateful.

There are a lot of great things about containers, but they don’t automatically solve the problem of provisioning writeable database environments. In other words, while you can very quickly spin up a container running a database engine such as Postgres or SQL Server, you need a bit of extra magic for that instance to contain a database with actual data. Sure, you can use old-fashioned restore techniques to bring a copy of a backup online in that container, but with a database of any size that means waiting around for the restore to finish.
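To make that concrete, here is a rough sketch of the gap, assuming Docker is available locally; the container name, password, and backup file are placeholders, and the restore step is where the waiting happens:

```shell
# Start a disposable Postgres engine: up in seconds, but it contains no data yet
docker run -d --name scratch-db \
  -e POSTGRES_PASSWORD=example \
  -p 5432:5432 \
  postgres:16

# To get a realistic database you still need an old-fashioned restore,
# which takes as long as your backup is big (prod_copy.dump is a placeholder)
docker exec -i scratch-db pg_restore \
  -U postgres --create -d postgres < prod_copy.dump
```

The first command is near-instant regardless of data size; the second scales with the size of the backup, which is exactly the problem that cloning-style provisioning aims to remove.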

The magic with provisioning on-demand doesn’t involve waiting. Databases on-demand truly pay off when you can spin up environments rapidly enough to:

  • Perform an experiment with a fresh branch in source control quickly, using minimal resources — maybe even pulling down someone else’s branch and applying it to a realistic new environment
  • Create databases quickly for CI/CD pipelines to automatically deploy database code to as part of Pull Request workflows
  • Quickly reset QA and Test environments whenever the order of releases changes or a hotfix passes by, in order to validate that all changes work properly with a modified deployment order
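The second scenario above, an ephemeral database for a Pull Request pipeline, can be sketched as a minimal CI step, assuming Docker on the build agent; `BUILD_ID` and the deploy/test scripts are stand-ins for whatever your pipeline provides:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Spin up a throwaway Postgres instance scoped to this pipeline run
docker run -d --name "ci-db-$BUILD_ID" \
  -e POSTGRES_PASSWORD=ci -p 5432:5432 postgres:16

# Tear the instance down no matter how the job ends
trap 'docker rm -f "ci-db-$BUILD_ID"' EXIT

# Wait until the engine accepts connections
until docker exec "ci-db-$BUILD_ID" pg_isready -U postgres; do
  sleep 1
done

# Deploy the branch's database code and run tests against it
# (these two scripts are hypothetical placeholders for your own tooling)
./deploy-database.sh && ./run-tests.sh
```

Every pipeline run gets its own instance, and the `trap` guarantees cleanup, so nothing shared or long-lived accumulates between runs.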

Containers don’t enable this by themselves, but they are flexible enough that cloud services and new tooling from companies like Redgate open up new options for quickly creating containers with databases in them.

The key differences in what is happening now with containers are that:

  1. The movement to use containers is largely developer driven
  2. C-level executives are very interested in container technology, and are looking to developers and DevOps culture to unlock the potential
  3. The ability to quickly provision databases in the cloud has raised the stakes

Late last year, I attended a Gartner Enterprise Architecture Summit in Florida. Out of curiosity, I went to a session called “Containers for the CIO”; after all, the title alone seems like an anti-pattern. Do CIOs really need to know about such specific technical implementation details as containers? The room was large enough for ~250 people, every seat was taken, and people were standing around the walls as well, because it turns out CIOs want to know about containers and Kubernetes, whether they need to know the details or not.

But unlike the virtualization revolution of the past, the container revolution is largely recognized as needing significant investment from both development and operations to succeed.

And this is the difference which matters. This is what it will take for true smart provisioning for the database to become common.

 

Steve Jones shows what the future looks like…

Redgate’s Steve Jones discusses the future of database development and shares a demo of what this looks like in containers in this talk from the Tech Community Day 2020 event.

 

What on-demand database options does Redgate provide?

Redgate has proven cloning technology for SQL Server databases and we have an early preview of cloning for Oracle databases available as well. This cloning technology isn’t container-specific — instead it allows you to quickly provision a database on an existing SQL Server instance.

Meanwhile, Redgate’s Foundry has developed and is currently evolving Spawn technology with a set of customers in our concierge program. Spawn is a cross-platform, developer-centric service for provisioning databases in containers. The concierge program allows us to experiment and iterate quickly on the platform at the heart of the service.

These parallel lines of research and development allow us to provide world-class service to our existing customers, while also developing very high-quality solutions for the customers of the near future.

 

Revolution 2 – Portable Database DevOps Automation Tooling for CI/CD

There is another major use for containers in database DevOps which isn’t related to provisioning the databases themselves: containers will become the primary way that we access the tooling to perform automation tasks within DevOps pipelines.

 

Why containers are useful for build and deployment

One of the big benefits of containers has to do with easy portability for applications. Containers can empower you to avoid long installation processes and incessant patching and upgrade sessions. Why install, when you can simply pull the latest container (or the version you standardize on), and run the application right away?

Containers mean that DevOps processes can evolve to the point where build and deployment infrastructure itself is short lived and doesn’t require server resources to exist long term with tricky maintenance.

 

Examples: Redgate Change Automation, Flyway, and SQL Compare Docker containers

Redgate currently provides containers for convenient build and deployments in a couple of areas:

  • Redgate Change Automation for Oracle has a Docker image which provides support for building and testing database code, and for preparing and performing releases
  • Flyway has a Docker image of the command line, allowing convenient execution of migration scripts
  • SQL Compare, Redgate’s industry leading comparison engine for the Microsoft Data Platform, has a Docker container for performing convenient comparisons between databases, script directories, and/or version control repositories.
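As an illustration of the Flyway case, a migration can run from the official `flyway/flyway` image with nothing installed on the agent; the connection URL, user, and password below are placeholders:

```shell
# Mount local migration scripts into the container and run them;
# --rm deletes the container as soon as the command finishes,
# so no build infrastructure lingers afterwards
docker run --rm \
  -v "$PWD/sql:/flyway/sql" \
  flyway/flyway \
  -url=jdbc:postgresql://db.example.com:5432/appdb \
  -user=deploy -password="$DB_PASSWORD" \
  migrate
```

Pinning a specific image tag (for example `flyway/flyway:10`) instead of the default latest tag is the usual way to standardize the tooling version across pipelines.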

While many of our existing customers, particularly those in the Microsoft Data Platform space, prefer the convenient graphical plug-ins we’ve built for popular orchestration tools like Azure DevOps, Jenkins, and TFS, we are seeing rising customer interest in using containers to manage automation throughout the CI/CD pipeline.

 

What will the Results be for CIOs and Shareholders?

I mentioned above that I saw a packed room for a Gartner talk on “Containers for CIOs.”

What’s in it for them?

CIOs have been undergoing a transformation for some years now. They are moving away from the role of heading up the old IT groups we talked about, whose role was to keep the organization alive for the minimum spend possible.

As startups prove time and again that they can disrupt existing industries and win market share away from established enterprises with a clever use of technology and a slick marketing campaign, CIOs and CTOs are now being challenged to foster innovation and to deliver value faster.

Containers are of significant interest because they help engineers meet this challenge, for the reasons described above in this post. In the world of databases, container technology is the foundation that will empower developers around the globe to use realistic environments to develop quality changes, fast.

It’s an exciting time to work in database DevOps: many of the seeds planted long ago are finally coming to fruition. Container technology and advances in cloud computing are big drivers of these changes, along with the rise in popularity of DevOps processes.