Testing before coding: shifting farther left

Times have changed so much; when I wrote my first book on database design, I didn’t mention testing. In some respects, I thought it was obvious that everyone would gather requirements and test to make sure what they were building did what they expected. Let’s say I was younger then and considerably more naïve.

The Story

As a data architect, I found that most of my problems came less from coding bugs and more from designs that didn’t match what the customer actually wanted. One of my first significant design mistakes came when designing a QA system for a chemical plant. Our analysts gathered requirements, handed them to the development team, and we implemented them. We did system testing and quality testing with the customer. There were poor time estimates and cost overruns, but the software worked, and our team believed it worked quite well.

Then we went live. There were plenty of typical minor bugs, but everything was working great, until the day the system would not let the users do something out of the ordinary. One of the key requirements we were given was not to allow shipments if the material didn’t meet a certain quality level; eradicating previous quality issues was the central goal of the new system. What was never discussed was what happens when their customer wants to override that quality check and take the shipment anyway. The team had spent considerable time making sure the system could not ship substandard product, because that is what the requirements said.

The problem came down to the fact that we didn’t understand what the customer wanted, and that misunderstanding cost them money. Shipments were delayed and manual processing was required because the system wouldn’t let them do something that was apparently commonplace.

Test Very Early, Test Very Often

To avoid repeating this, I have always tried to test early, earlier, and earliest: start before coding begins, with the requirements. Find a way, in the customer’s language, to verify that the requirements you have captured are accurate. Challenge requirements that are too specific. “We only see patients 18 years and younger” often means “We rarely see patients over 18 unless we decide to.” When that is the case, there is an exception process that needs to be part of the system, or the system will be a horrible failure (though at least you will have a story to write about someday).
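To make that concrete, here is a minimal T-SQL sketch (the table, column, and constraint names are hypothetical, invented only for this example) contrasting the requirement taken literally with a version that models the exception process the users actually described:

-- The "too specific" requirement taken literally: a hard rule nobody can override.
CREATE TABLE dbo.Patient
(
    PatientId int IDENTITY CONSTRAINT PKPatient PRIMARY KEY,
    BirthDate date NOT NULL,
    -- Rough age check, good enough for a sketch
    CONSTRAINT CHKPatient_Age
        CHECK (DATEDIFF(YEAR, BirthDate, SYSDATETIME()) <= 18)
);

-- After challenging the requirement: the rule still holds by default,
-- but the documented exception process is part of the model.
CREATE TABLE dbo.PatientV2
(
    PatientId         int IDENTITY CONSTRAINT PKPatientV2 PRIMARY KEY,
    BirthDate         date NOT NULL,
    OverAgeApprovedBy int NULL,  -- who signed off on the exception
    CONSTRAINT CHKPatientV2_Age
        CHECK (DATEDIFF(YEAR, BirthDate, SYSDATETIME()) <= 18
               OR OverAgeApprovedBy IS NOT NULL)
);

The difference looks trivial while it is still a script; discovered after go-live, it is the difference between hitting backspace in a document and the kind of manual workaround my chemical plant customer had to live with.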

Testing early also means getting representatives from the user community (not just upper-level management) to interact with the requirements that have been agreed upon and make sure they make sense. Challenge them to give you the odd (and very odd) cases along with the typical ones; some of the most significant scenarios only happen once a year.

The farther left you are in the timeline, the easier it is to make changes. If the requirement is still just a document, you can hit backspace and type the new one. If the business analyst had identified the override, or our team had asked, “Really, you can’t sell substandard materials to anyone?”, money would have been saved.

The Process

Once you get to the “really sure” level of confidence that the requirements are correct, you can design with more assurance. For a data professional, data models have a few natural checkpoints built into the process that can be very helpful. Start with a conceptual model that is just table names and projected relationships (the tables are concepts, hence the name conceptual model). Then test to make sure the model can meet the requirements.
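As a rough illustration, here is what such a checkpoint might look like if rendered as T-SQL so it can actually be exercised (the entity names are placeholders for a generic ordering system, not taken from any real set of requirements):

-- Conceptual model: just the concepts and how they relate, no attributes yet
CREATE TABLE dbo.Customer
(
    CustomerId int IDENTITY CONSTRAINT PKCustomer PRIMARY KEY
);

CREATE TABLE dbo.Product
(
    ProductId int IDENTITY CONSTRAINT PKProduct PRIMARY KEY
);

CREATE TABLE dbo.[Order]
(
    OrderId    int IDENTITY CONSTRAINT PKOrder PRIMARY KEY,
    CustomerId int NOT NULL
        CONSTRAINT FKOrder_Customer REFERENCES dbo.Customer (CustomerId)
);

CREATE TABLE dbo.OrderLine
(
    OrderLineId int IDENTITY CONSTRAINT PKOrderLine PRIMARY KEY,
    OrderId     int NOT NULL
        CONSTRAINT FKOrderLine_Order REFERENCES dbo.[Order] (OrderId),
    ProductId   int NOT NULL
        CONSTRAINT FKOrderLine_Product REFERENCES dbo.Product (ProductId)
);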

Take each requirement and just ask, “Can I?” Can I store customer information? Can I represent an order? Can I represent an order of 10 products? 100 products? 10 orders in a minute? Test not just the normal data you know about, but as many possibilities as could happen. Once you reach that level of comfort with your model, you are getting close to a design that will only need to be decorated with attributes, not torn apart and restarted over and over.
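Continuing the hypothetical ordering skeleton from the previous sketch, a “can I?” pass does not need to be elaborate; a throwaway script that forces the model to hold the odd cases is enough:

-- "Can I represent an order of 100 products?" exercised against the skeleton
SET NOCOUNT ON;

DECLARE @CustomerId int, @OrderId int, @i int = 1;

INSERT INTO dbo.Customer DEFAULT VALUES;
SET @CustomerId = SCOPE_IDENTITY();

INSERT INTO dbo.[Order] (CustomerId) VALUES (@CustomerId);
SET @OrderId = SCOPE_IDENTITY();

WHILE @i <= 100
BEGIN
    INSERT INTO dbo.Product DEFAULT VALUES;
    INSERT INTO dbo.OrderLine (OrderId, ProductId)
        VALUES (@OrderId, SCOPE_IDENTITY());
    SET @i += 1;
END;

-- 100 rows back means the concept holds; anything less means rework now,
-- while it is still cheap
SELECT COUNT(*) AS ProductsOnOrder
FROM   dbo.OrderLine
WHERE  OrderId = @OrderId;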

As you flesh out attributes and then start creating tables, constraints, and so on, unit testing, integration testing, and then user acceptance testing go hand in hand with the entire process. Every step of the way, you are verifying that what you have built meets the requirements you tested for veracity with the user community at the beginning.
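For example, a requirement-level check can travel with the code as a small test script. This sketch reuses the hypothetical PatientV2 table from earlier; in a real project, a unit testing framework such as tSQLt would typically manage tests like this:

-- Requirement: patients over 18 are only accepted with an approved exception
BEGIN TRY
    -- Over-age patient, no approval: the insert should be rejected
    INSERT INTO dbo.PatientV2 (BirthDate, OverAgeApprovedBy)
    VALUES ('1990-01-01', NULL);

    THROW 50000, 'Test failed: over-age patient accepted without approval.', 1;
END TRY
BEGIN CATCH
    IF ERROR_NUMBER() = 547  -- CHECK constraint violation, the expected outcome
        PRINT 'Pass: over-age patient rejected without approval.';
    ELSE
        THROW;  -- re-raise the test failure (or any unexpected error)
END CATCH;

-- Over-age patient with a documented approval: the insert should succeed
INSERT INTO dbo.PatientV2 (BirthDate, OverAgeApprovedBy)
VALUES ('1990-01-01', 1);
PRINT 'Pass: approved exception accepted.';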

Whether you are working in short sprints or a long waterfall project, the main difference is the amount of work you take on at one time. Continue this process of verification all the way through until you finish and start work on the next thing.

Nothing Is Ever as Easy as It Sounds

The entire development process can be pretty easy except for the first step. Gathering requirements that capture what the customer wants a system to do is one of the most challenging jobs in existence that doesn’t require a Ph.D. in mathematics. Why? Because most of the time, users simply don’t know how to tell you what they want, and the worst offenders think they already know what they want without any discussion.

Once you have the requirements, designing and implementing software to meet well-written requirements is not necessarily trivial, but it is relatively straightforward. Luckily, there is plenty of software out there to help you with those tasks.

 

About the author

Louis Davidson

Simple Talk Editor

Louis is the editor of this Simple-Talk website. Prior to that, he was a corporate database developer and data architect for a non-profit organization for 25 years! Louis has been a Microsoft MVP since 2004 and is the author of a series of SQL Server database design books, most recently Pro SQL Server Relational Database Design and Implementation.