Getting There: A Roadmap To Improving Product Stability Through Continuous Improvement
If you set the Wayback Machine and visited a Quality Assurance team 15-20 years ago, you would return with a greater appreciation for how testing has evolved. Back then, we read requirements, prepared test cases, and waited for what seemed like forever for code delivery. When developers completed the code, it was "thrown over the wall" to QA and we began executing tests. Issues were found; some were valid, while others sparked conversations about what a requirement "said" versus what it "meant." Bugs were fixed and retested until, finally, someone decided the code was "good enough" to release.
So how did we escape those days? More importantly, if this describes how you work now, what can you do to move forward? Let me share a couple of examples from my own experience that show some paths forward. I believe continuous improvement fixes your most critical issues first, and each improvement builds on the last over time.
Trapped under the waterfall: When I began one job as QA director, the company was in its 18th month of developing a new point-of-sale system using a waterfall methodology. The QA team had been hired but had not yet actively worked on the project. The team scrambled to review all of the requirements and build the test cases. The first release arrived with many defects and missed requirements. After many months, the product was released. The team got faster at interpreting the requirements; they asked more questions and became better prepared for each subsequent release.
Fix #1: The company moved QA from the Thanksgiving kiddie table to the grown-up table by making quality a priority. Changing mindsets made everyone consider quality part of their jobs. The results? Issues were fixed earlier and earlier. We scrutinized the requirements more closely, understanding that time spent now to identify a missed requirement prevented project delays later. We also identified dependencies up front to prevent future blocks in the schedule.
Agile for the win: I've been through a couple of agile adoptions at different companies. It's easy to start an agile shop; it's far harder to create a smoothly working one. In the beginning, the pitch makes sense to executives: retool their teams to develop smaller packets of code and release them more often. Getting across the concept of "give and take," however, is much harder. Leaders quickly start to wonder whether they made the right choice when their "must-have" item is scheduled behind three other, higher-priority "must-haves." Bring in a coach to train the executives and make sure they are on board. There will be some rough spots, but you will become a more productive team that delivers code more quickly.
Fix #2: The company went agile, and we refactored our old requirements into features, stories, and tasks. QA was involved in the planning stage earlier than it had ever been. Testers reviewed stories, gained an understanding of the product owners' expectations, and provided estimates for testing. An equal seat at the table meant that all sides had an equal voice: a development fix that takes only 10 minutes may have 25 different testing scenarios and take half a day to verify. Estimates became better than ever and the team got faster, but more challenges loomed ahead.
Regression takes how long?!? Automation tools let QA take repetitive, manual tasks and create scripts that execute them much faster and more accurately than testers can. These scripts come with overhead: you need people with the skill to write and maintain them, you need a prioritized list of the scenarios you want to automate, and you have to explain why not everything should, or even can, be automated. Lastly, it is going to take time to complete.
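One lightweight way to build that prioritized list is to score each scenario by how often it runs, how long it takes manually, and how stable the feature is. The formula, weights, and sample scenarios below are illustrative assumptions for this sketch, not part of the original project:

```python
# Illustrative sketch: rank regression scenarios as automation candidates.
# Scoring formula and sample data are assumptions; tune to your own team.

def automation_score(runs_per_release, minutes_manual, stability, automatable):
    """Higher score = better automation candidate.

    runs_per_release: how often the scenario is executed per release
    minutes_manual:   manual execution time in minutes
    stability:        0.0-1.0, how rarely the feature's flow changes
    automatable:      False for checks needing human judgment
    """
    if not automatable:
        return 0.0
    return runs_per_release * minutes_manual * stability

scenarios = [
    ("login with valid credentials",      6,  5, 0.9, True),
    ("checkout with saved card",          6, 12, 0.8, True),
    ("receipt layout looks right",        2,  3, 0.5, False),  # needs a human eye
]

# Sort so the highest-value candidates come first.
ranked = sorted(scenarios, key=lambda s: automation_score(*s[1:]), reverse=True)
for name, *args in ranked:
    print(f"{automation_score(*args):6.1f}  {name}")
```

A score of zero drops non-automatable checks to the bottom of the list, which makes the "why not everything" conversation with stakeholders concrete instead of abstract.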
Fix #3: In the beginning, manual regression took the entire 14-person team two weeks to execute. Automation was the next logical step in the journey, and we set out with a goal of reducing manual regression time by 75%. We hired an automation lead, who built a regression framework in Selenium. While that went on, the rest of the team documented all the scenarios and identified what could and should be automated. In about a year, we had created scripts covering about 80% of the regression scenarios. We could now execute in a few hours what used to take weeks: manual regression dropped from 1,120 hours (14 people x 2 weeks x 40 hours) to 80 hours, or just a couple of days for the team.
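The framework from that project isn't reproduced here, but the general shape of such a regression harness can be sketched: a registry of scenario functions that one runner executes and reports on. In a real suite each scenario would drive a browser through Selenium's WebDriver API; in this self-contained sketch the scenarios are plain Python checks, and all names and sample scenarios are hypothetical:

```python
# Illustrative regression-harness sketch. In a real Selenium suite each
# scenario function would use selenium.webdriver to drive a browser;
# here the scenarios are plain Python so the example runs anywhere.

import time

SCENARIOS = {}

def scenario(name):
    """Decorator that registers a function as a named regression scenario."""
    def register(fn):
        SCENARIOS[name] = fn
        return fn
    return register

@scenario("order total adds up")
def check_totals():
    items = [("coffee", 3.50), ("muffin", 2.25)]
    assert round(sum(price for _, price in items), 2) == 5.75

@scenario("empty cart total is zero")
def check_empty_cart():
    assert sum(()) == 0

def run_regression():
    """Run every registered scenario; return (passed, failed) name lists."""
    passed, failed = [], []
    for name, fn in SCENARIOS.items():
        try:
            fn()
            passed.append(name)
        except AssertionError:
            failed.append(name)
    return passed, failed

if __name__ == "__main__":
    start = time.time()
    ok, bad = run_regression()
    print(f"{len(ok)} passed, {len(bad)} failed in {time.time() - start:.2f}s")
```

The payoff described above comes from exactly this structure: once a scenario is registered, rerunning the whole suite costs minutes of machine time instead of hours of tester time.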
What's Next? With the advent of new methodologies and practices such as DevOps, AI, and BDD, the next phase centers on further shortening time to market by shifting testing even further left, into the development space. It's shaping up to be another wild ride, but that's why we love it.