How to convince your boss you need a Test Data Management solution

If you’re a developer, you may well be familiar with the scenario. You’re writing code and you want to test your changes against a copy of the production database. One that’s representative of the size and distribution characteristics and retains referential integrity so that, when your changes are released, they won’t result in a failed deployment that will take hours to resolve.

But there’s a problem. The database copy you’re working with isn’t up to the job. It kind of works, but it’s not reliable, and you’re still not sure whether the tests you ran really reflect what will happen when your changes go live.

You’re not alone. Poor quality data that isn’t accurate or complete is a common issue for developers working with databases. While database copies may well be provided, they can be out of date, incomplete, or subsets of the original with random data substituted for the original values.

There’s no one to blame here, incidentally. Provisioning database copies can take hours, sometimes days. It’s time-intensive, difficult, and there are often worries about sharing Personally Identifiable Information (PII) which needs to be replaced, masked or de-identified.

It is a sign, however, that your team needs a Test Data Management (TDM) solution in place. TDM changes the game by introducing a structured process for provisioning database copies that are accurate, realistic and truly representative. At its best it automates the classification and masking of sensitive data, and uses virtualization and container technologies to provision lightweight copies a fraction of the size of the original.
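To make the masking step concrete, here is a minimal sketch of what deterministic PII masking can look like. This is an illustration only, not how any particular TDM product works: the table, column names and helper functions are hypothetical, and a real solution would classify sensitive columns automatically rather than rely on hand-written rules.

```python
import hashlib
import random

# Hypothetical rows copied from a production table; name and email are PII.
rows = [
    {"id": 1, "name": "Ada Lovelace", "email": "ada@example.com", "order_total": 120.50},
    {"id": 2, "name": "Alan Turing",  "email": "alan@example.com", "order_total": 88.00},
]

def mask_email(email: str) -> str:
    """Replace the local part with a deterministic hash, so the same input
    always masks to the same output and joins across tables still line up."""
    local, _, domain = email.partition("@")
    digest = hashlib.sha256(local.encode()).hexdigest()[:8]
    return f"user_{digest}@{domain}"

FAKE_NAMES = ["Taylor Smith", "Jordan Lee", "Casey Brown"]

def mask_row(row: dict) -> dict:
    masked = dict(row)
    masked["name"] = random.choice(FAKE_NAMES)   # realistic but fake name
    masked["email"] = mask_email(row["email"])   # deterministic masking
    return masked                                # non-PII columns kept as-is

masked_rows = [mask_row(r) for r in rows]
```

The deterministic hash is the important detail: because the same production value always masks to the same test value, foreign-key relationships and joins survive the masking, which is part of what keeps the copy truly representative.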

It makes the provisioning of database copies easy – so easy that they can be provisioned and refreshed in seconds. It also makes testing proposed changes fast, consistent, accurate, repeatable and reliable. All of which is great for you, but how do you convince your boss? You talk in business language instead.

The business advantages of test data management

The disadvantages you face when working with inaccurate or incomplete data don’t just slow down development, lower the quality of code and lead to failed deployments, all of which make your job harder. They have ramifications right across the business. Ones that increase costs, introduce uncertainty, affect customer relationships and make the company you work for less competitive.

It’s about saving money

First up, there are the direct costs involved. Working with our customers, we’ve seen the introduction of a dedicated TDM solution typically save a minimum of 15% of developer time by providing them with dedicated development environments, and by streamlining testing with improved test data. DBAs also save 10% or more of their time with processes that automatically classify and mask sensitive data, and provision those development and test environments.

It doesn’t sound a lot until you do the math. Surgical Information Systems used Redgate technology to create virtualized database copies that are a fraction of the size of the original, and can be created, updated and refreshed in minutes. This saved a minimum of 12 hours a day across all of its teams, equating to savings of $268,320 per year.
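The arithmetic behind a figure like that is straightforward to sketch. The working days and blended hourly rate below are assumptions chosen for illustration, not numbers from the case study:

```python
HOURS_SAVED_PER_DAY = 12       # from the case study above
WORKING_DAYS_PER_YEAR = 260    # assumption: 52 weeks x 5 days
BLENDED_HOURLY_RATE = 86       # assumption: illustrative cost per hour, in dollars

annual_hours_saved = HOURS_SAVED_PER_DAY * WORKING_DAYS_PER_YEAR
annual_savings = annual_hours_saved * BLENDED_HOURLY_RATE
print(f"${annual_savings:,} per year")  # prints $268,320 per year
```

The point is less the exact rate than the shape of the calculation: even a modest daily saving, multiplied across a year and a whole team, compounds into a six-figure sum.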

It’s about increasing productivity

Next, there’s a direct correlation between productivity – the speed at which you write and release quality code and avoid rework – and TDM. With TDM, developers can be provisioned with accurate, truly representative database copies that they can reliably test their changes against. They are also able, with an advanced solution in place, to self-serve those copies and refresh them in seconds whenever they need to.

As a direct result, development and testing is faster, errors are caught much earlier in the development pipeline when they are far easier and far less costly to correct, and the chances of breaking changes reaching production and causing deployments to fail are minimized.

This change failure rate is one of the key metrics used in the annual DORA State of DevOps Report to measure the speed and stability of software delivery. Elite performers in the latest report have a change failure rate of 5% compared to 10% for high performers, 15% for medium performers and 64% for low performers. Even by making a marginal improvement, the time spent recovering from failed deployments, reworking code and correcting errors can be reduced and productivity can be increased.

It’s about avoiding unexpected costs

Thirdly, there’s the direct result of those failed deployments. If a breaking code change hits the production server where high availability is often a business-critical issue, businesses suffer. ITIC’s 2023 Global Server Reliability Survey revealed that a single hour of server downtime can result in potential losses of $300,000 or more for 93% of mid-sized enterprises (SMEs) and large enterprises. Among that 93% majority, over half said hourly losses would range from $1m to over $5m.

Even if the potential losses are lower, it’s the ongoing possibility that it may happen that’s important. I like the way James Phillips, CIO of software company Rev.io, puts it in his article, Why test data management is becoming increasingly important to the C-suite:

“The value test data management brings to your organization, whether we’re talking from a compliance or customer confidence standpoint, is huge. It’s hard to put a dollar value on, but when you look at the dollar value of not doing it, and being faced with the consequences of what can happen, it’s much worse.”

It’s about delivering value to customers faster

Finally, there’s the more attractive advantage of releasing changes, updates and features faster. More importantly, that speed doesn’t come at the cost of consistency, reliability or code quality because TDM introduces a structured process which can work across developers, across teams and across databases. It works from day one and then it continues to work by removing the typical – often accepted – bottleneck of the unavailability of test data.

As Ryan Burg, DevOps Manager of Surgical Information Systems (which introduced Redgate TDM technology), comments: “We want to get new features in our customers’ hands. Now we can do twice the amount of testing in the same amount of time, we have better testing, we’ve lowered that cost, but we are also lowering the time to market.”

Next steps

Making the case for introducing a TDM solution can seem daunting but talking about saving money, increasing productivity, avoiding unexpected costs, and delivering value to customers faster shows how it offers real business advantages.

You might also like to share the following resources, which discuss it in more detail.

Test data management is one of the key ways enterprises can deploy changes on-demand while also doing so reliably and safely, reducing their change failure rate – and gaining a demonstrable ROI. Read the blog post, Where’s the money? The ROI of test data management.

The leading independent analyst and research house, Bloor, thinks enterprise-level organizations are increasingly and acutely aware of the benefits of a TDM solution. Download Bloor’s Test Data Management 2024 Market Update.

Enabling DevOps Test Data Management can improve release quality, reduce business risk, and deliver value to customers sooner. Find out how.

Redgate Test Data Manager

Improve your release quality and reduce your risk, with the flexibility to fit your workflow.

Learn more about Redgate Test Data Manager
