The Future of Database DevOps
I work as a Director at ThoughtWorks in the database and DevOps space. I’ve been here for over 20 years, and I vaguely remember my first project at ThoughtWorks in 1999, when we had just started using Agile software development practices. The basic challenge we faced was how to move database changes at the same pace as application code and keep them in sync so that deployments would work.
At the time, we had to invent all the tools, processes, and techniques we needed. Since then, database DevOps tools and processes have matured and patterns have been established, all in the name of delivering software by bridging the gap that can open up between developers and data professionals.
But what about the next 20 years? What big trends will we see change the way we work with databases and DevOps in the future? I think there are four that are worth exploring:
- Rapid digital transformation
- The changing technology landscape
- NoSQL and polyglot persistence
- Applying continuous delivery to machine learning and AI
Rapid digital transformation
Even before Covid, the move to digital transformation was already happening, with every company wanting to become digitally native, recognizing that it’s no longer just for the Amazons or Microsofts of the world.
That said, Covid has changed customer behavior drastically and this will remain post-pandemic. Employee working patterns have changed, remote working is the norm, and it’s led to a lot of companies going through a digital transformation phase where they’re trying to come up with products and business processes that better serve customers during these challenging times.
This digital transformation can only be achieved by collaboration between business and IT, developers and data people, development and infrastructure teams, and this is where DevOps really helps. At its core, DevOps is simply a practice in which the development side and the operations side collaborate to achieve successful software delivery and operations.
The other factor that is forcing a lot of digital transformation is the disruption driven by innovative business models. Newer types of companies and start-ups are trying to upend the business of established companies, with different technologies and ways of achieving their goals. This is prompting many to think about how they should change their business model, which again leads to transformation, disruption, and an evaluation of current IT portfolios.
So, it’s imperative that we not only learn the DevOps way of doing things, but also get better at it over time. Nobody can take one step and say they’ve ‘done DevOps’. DevOps is a continuum that people must learn how to keep doing again and again, based on changing needs, architectures, and business requirements.
We must measure its success as well, and one key benchmark people turn to is increased speed to market. That’s been amplified by the pandemic, and businesses that could get things to users faster have had a better chance of reacting to changing conditions and staying competitive. It’s also amplified the rate of transformation that’s going on, and a digital divide has opened up. It’s blindingly obvious that businesses which react and have a better speed to market have survived and probably gained market share, while those that are not able to do so are struggling.
In the years ahead, improving software delivery performance is going to continue to be critical so that businesses can move from concept to cash as fast as possible. It’s all about how fast a business can come up with an idea, and how fast IT can convert that into working software that customers are using.
The changing technology landscape
The technology landscape around us is changing all the time, and that’s motivating us to learn and discover new ways of doing things that we haven’t seen before. There’s been an explosion in architectural styles and there are so many choices now, like microservices, NoSQL, real-time technologies, the Internet of Things, and the cloud in all its flavors.
We need to manage this change and, to do that, professionals like you and me need to learn about these technologies, understand them, and then introduce them into the organization. If you’re the kind of person who wants to be ahead of the curve, you’ll be looking at what’s going on in the industry outside your company and be passionate about some of these practices.
Learn them, do lunch-and-learns for your own team or company, and then try to introduce those practices in your company or in your team so that you can also become better at DevOps. Remember too that retaining and upskilling people is an important part of the digital transformation that’s going on. You can’t simply hire people with the new skillsets – people with domain understanding and knowledge of the current systems are vital.
If we want to focus on one area first, the cloud is a logical place to start, with the 2021 State of Database DevOps report from Redgate showing that about 87% of organizations now have a hybrid cloud strategy. That means some part of their enterprise architecture is in the cloud, which in turn means you’ll have to deal with the cloud, and change your practices, processes, and tooling.
We also know that some cloud offerings reduce the workload on database people by taking care of operational tasks like backups. This raises the question of how else you can provide value to your organization. That’s something we as professionals need to work out, and how well you learn and how much value you bring back to your organization is going to be critical.
NoSQL and polyglot persistence
The third trend I see is around NoSQL and polyglot persistence. In the Redgate report I mentioned earlier, 70% of people said they are using two different types of databases, and many also said they have a mix of on-premises and cloud servers.
It used to be that we picked SQL Server, Oracle, IBM, or one of four or five other options, but now the choice is practically unlimited. Even though the top four databases in use are still relational, NoSQL databases are making a push and seeing higher usage.
It’s not just that either. The different architectural styles coming into play, like microservices or event-driven architectures, are introducing the notion of using the right database for the right problem. In the same enterprise, we may use a relational database alongside a key-value store, a column-oriented database, a graph database, an in-memory database, or a time series database.
With so much choice to take into consideration, we need to understand the trade-offs involved in picking a particular database and how those trade-offs affect the enterprise. There are new discussions that need to happen, like how we manage this kind of data, how we ensure its quality, and how we account for the CAP theorem. We need to be aware of these issues and able to have those conversations.
The NoSQL movement has also given us a way to think about heterogeneous tools. The way we do database DevOps for relational databases, for example, is not the same for NoSQL databases, so we should be thinking about the new processes, tools, and technologies that are needed.
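To make that concrete, here’s a minimal sketch of what a versioned, repeatable change might look like for a document database, assuming a MongoDB instance reachable through pymongo. The collection and field names are illustrative rather than taken from any particular tool, but the point stands: even ‘schemaless’ stores need their changes scripted, versioned, and applied through the same pipeline as everything else.

```python
# A minimal, hypothetical "migration" for a document store, assuming pymongo
# and a local MongoDB. Database, collection, and field names are illustrative.
from pymongo import MongoClient


def migrate_customer_documents(uri: str = "mongodb://localhost:27017") -> None:
    db = MongoClient(uri)["shop"]

    # Backfill a new field on existing documents: the NoSQL equivalent of
    # ALTER TABLE ... ADD COLUMN with a default value.
    db.customers.update_many(
        {"loyalty_tier": {"$exists": False}},
        {"$set": {"loyalty_tier": "standard"}},
    )

    # Indexes still matter: create one to support the new query pattern.
    db.customers.create_index("loyalty_tier")


if __name__ == "__main__":
    migrate_customer_documents()
```

A script like this would sit in version control next to the application code and run as a deployment step, just as a SQL migration would for a relational database.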
Applying continuous delivery to machine learning and AI
The three trends I’ve talked about so far can be achieved by focusing on continuous delivery, which was codified in the book of the same name by Jez Humble and David Farley back in 2010. The notion is that value is only delivered to customers when things are in production, so releases need to happen in a very iterative, regular, and continuous way. It’s the cycle of plan, code, build, test, release, deploy, operate, and monitor.
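To ground that cycle in the database world, here’s a minimal sketch of the kind of migration runner those early projects had to invent and that mature tools now provide: ordered change scripts that live in version control with the application code and are applied exactly once to each environment. It uses SQLite purely to keep the example self-contained; the folder layout and version table are assumptions, not any specific tool’s convention.

```python
# A minimal sketch of a migration runner: apply each SQL script in order,
# once per database, recording what has already been applied.
import sqlite3
from pathlib import Path

MIGRATIONS_DIR = Path("migrations")  # e.g. 001_create_orders.sql, 002_add_index.sql


def apply_pending_migrations(db_path: str = "app.db") -> None:
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version TEXT PRIMARY KEY)")
    applied = {row[0] for row in conn.execute("SELECT version FROM schema_version")}

    for script in sorted(MIGRATIONS_DIR.glob("*.sql")):
        if script.name in applied:
            continue  # already run against this database
        conn.executescript(script.read_text())
        conn.execute("INSERT INTO schema_version (version) VALUES (?)", (script.name,))
        conn.commit()
        print(f"applied {script.name}")

    conn.close()


if __name__ == "__main__":
    apply_pending_migrations()
```

Running something like this in the build or deploy stage is what keeps the database moving at the same pace as the code.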
The basic question at the end should be: are we delivering value, and how can we deliver that value faster? If we constantly keep asking that question, we will come up with innovative ways of solving the challenges we face in our organization.
The next place to apply the DevOps and continuous delivery mindset is in the machine learning and AI space. Every company now wants to extract some value from its data by monitoring it, learning from it, and using predictive analytics to gain more insight.
That’s a new space where I think DevOps for data, as well as continuous delivery, can be applied so that we can give the business consistent performance and a consistent release cadence, and so that the models we release, the data we hold, and the code we write around them are useful to the business. Not just in the areas we currently work in, but in new areas as well, so that we can transform ourselves and respond to the market conditions around us.
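As a small illustration of what continuous delivery can mean for models, here’s a sketch of a ‘quality gate’ that only releases a newly trained model if it performs at least as well as the one currently in production. It assumes scikit-learn is available and uses a bundled dataset; the function names and the metrics file are illustrative, not part of any established pipeline.

```python
# A hypothetical release gate for ML models: train a candidate, compare it to
# the recorded production metric, and only "release" if it is at least as good.
import json
from pathlib import Path

from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

METRICS_FILE = Path("production_metrics.json")  # metric of the model currently live


def train_candidate():
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LogisticRegression(max_iter=5000).fit(X_train, y_train)
    return accuracy_score(y_test, model.predict(X_test)), model


def release_if_better() -> None:
    baseline = (
        json.loads(METRICS_FILE.read_text())["accuracy"] if METRICS_FILE.exists() else 0.0
    )
    candidate_accuracy, model = train_candidate()
    if candidate_accuracy >= baseline:
        # In a real pipeline this is where the model artefact would be published.
        METRICS_FILE.write_text(json.dumps({"accuracy": candidate_accuracy}))
        print(f"released: accuracy {candidate_accuracy:.3f} (baseline {baseline:.3f})")
    else:
        print(f"held back: accuracy {candidate_accuracy:.3f} < baseline {baseline:.3f}")


if __name__ == "__main__":
    release_if_better()
```

The gate itself is trivial; the value comes from running it automatically on every change to the data, the features, or the training code, exactly as we do for application and database changes.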
If you want to discover more insights from the 2021 State of Database DevOps Report, download your free copy.
You can also find out a lot more about DevOps by catching up on the recent online Redgate Summit, ‘The Future of Database DevOps’. The keynote from Pramod Sadalage formed the basis for this blog post, and you can watch him along with a range of other industry experts in a series of on-demand informative sessions.