Today was a good day for quality. Your development group rolled out some of its best code yet and your ops team deployed it without a hitch. Your boss is happy too, because you’ve built a high-performing QA process that reliably intercepts critical risks – the kind that might compromise security, customer experience or stock price.
But what about tomorrow? Quality is a journey, and your team strives for better speed and efficiency with every year and every project. How can you maintain the quality standards you’ve achieved, even as the development cadence accelerates and processes, people and resources change? How can you “future proof” your QA effort?
Four Threats to QA Stability
Before you can answer that question, you need to answer two others: First, what do you need from your future QA team? Second, what changes or threats might break your QA team vision and turn today’s smooth quality into unreliability and delay?
Regarding future expectations, where does your QA team and process excel today? Testing speed? Risk assessment? Labor or cost efficiency? Where does it struggle? Where could it achieve a transformative improvement? What resources and leadership would that require?
For the second question, what might jeopardize your quality objectives? Much like quality itself, QA processes can be relatively fragile. Disruption typically comes from one of four areas: people, environment, speed and money.
Quality starts with people – skilled people who’ve mastered your solutions, QA tools and best practices. However, skilled QA resources are a coveted asset and being able to attract, develop and retain them is a hidden vulnerability for companies of all sizes.
There’s a limit to how hard or fast people can work before encountering escalating risks of exhaustion and mistakes. Eventually, even maintaining progress requires additional staff that may be unavailable. As QA processes and tools proliferate, it’s also common to see essential operational expertise and access concentrated around key specialists. This makes these individuals more valuable, but it can create bottlenecks that impede results and team utilization. Even worse, it may create devastating “single point of failure” vulnerabilities if they can’t meet critical objectives.
In a competitive IT job market, turnover is not a matter of “if” but rather “when.” How would losing a key person potentially impact your team? Would essential knowledge and expertise walk out the door with them? Are their skills and abilities redundant, or could an untimely resignation jeopardize QA results you’ve worked hard to develop? Recruiting and training a replacement may be impractical in an aggressive release cycle. A truly “future proof” QA team will have built-in resilience and efficiency, in the form of documentation, thorough cross-training or both, that lets the team perform effectively, even when shorthanded.
Environmental changes are another disruptor, especially when related to established processes. QA that worked for traditional waterfall development may be ineffective or cumbersome as organizations shift to iterative agile or DevOps methods. Bimodal development approaches with conflicting QA requirements may coexist. There may also be increases in developer autonomy and the use of microservices in place of monolithic applications and development environments. These additional development complexities can lead to an unmanageable patchwork of incompatible testing tools and processes. As a result, quality can suffer.
Changing end-user dynamics are another variable. For instance, reasonable quality standards for a sleepy back-office server application might be grossly inadequate when applied to a customer-facing, cross-platform mobile application. As IT quality outcomes become a core driver of factors such as customer experience, market share and security, QA requirements change in parallel.
While some environmental transitions are easy, others are not. As the dinosaurs discovered, rapid environmental change can impose “adapt or die” pressures on a QA model that previously performed well. Truly “future proof” QA can thrive across the current development spectrum – waterfall, agile, DevOps – and whatever comes next.
There’s a reason why one rarely sees billboards for “The fastest dentist in town” or “3-minute haircuts!”, and why highway speed limit signs are everywhere. Doing a job too fast is an easy way to ruin quality. Unfortunately, market factors push IT to get increasingly complex software out the door faster than ever before. As serial waterfall-style test cycles get lapped by iterative sprints and continuous delivery, QA testing needs to get faster and smarter.
Speed problems are compounded if the engine lacks the required horsepower or if there’s incomplete visibility of the surrounding road. Like a turbocharger and good headlights, a “future proof” QA process gives the QA team both the power and insight they need to accelerate, avoid hazards and stay in control.
A building without a solid foundation is a sand castle. Quality assurance is the solid foundation high-performance software is built upon, and on average, it comprises about 26 percent of total IT spending. Unfortunately, QA budgets wax and wane, and even short-term QA spending cuts can have devastating long-term impacts on the overall success of a product.
In the face of conflicting priorities for additional development staffing, research and retraining, it’s easy for QA funding requests to get cut. This may mean critical QA initiatives are denied or diluted with incomplete patches. Good intentions aside, an incompatible and unsustainable patchwork of single-purpose test apps undermines and handicaps the QA team. QA needs a sound and scalable platform that can thrive in any funding environment.
With these challenges in mind, what makes for a truly “future proof” QA team?
Four Attributes of a Future-proof QA Team
QA needs versatile players and a versatile testing platform that’s easy to use, consistent and integrated with core development systems and workflows. A versatile system breaks down functional silos and minimizes training and system integration friction. It empowers teams to collaborate and share work, and it minimizes labor waste due to hand-offs and skill gaps.
Distributing QA functions across development and testing teams builds and reinforces quality at every step. It allows product owners, developers and testers to drive quality results together. It transforms QA from a discrete, post-development “carwash” into an integrated function across the development lifecycle. This approach pre-empts serious issues, improves documentation and minimizes the delays and risks of conventional serial testing.
Scalable QA keeps teams on track as development speed and complexity increase. Scalability leverages a versatile, distributed model to deliver progressively better results, faster. It’s fueled by customizable and reusable QA tools that align with any development process. Automation accelerates and extends team expertise and resources. It also uses a flexible platform and proven processes to deliver results, without the shortcomings of one-off solutions and obsolete tools.
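To make the “customizable and reusable” idea concrete, here is a minimal, framework-agnostic sketch of a data-driven automated check. The `validate_password` function and its rules are hypothetical stand-ins for any unit under test; the point is that one reusable test body covers many cases, so extending coverage is a one-line edit to the data table rather than a new hand-written test.

```python
import re

def validate_password(password: str) -> bool:
    """Hypothetical rule set: 8+ characters, at least one digit and one letter."""
    return (
        len(password) >= 8
        and re.search(r"\d", password) is not None
        and re.search(r"[A-Za-z]", password) is not None
    )

# Data-driven cases: adding coverage means appending a tuple, not writing code.
CASES = [
    ("s3cretpw", True),        # meets all rules
    ("short1", False),         # too short
    ("nodigitshere", False),   # missing a digit
    ("12345678", False),       # missing a letter
]

def run_checks() -> None:
    """One reusable assertion loop exercises every case in the table."""
    for password, expected in CASES:
        result = validate_password(password)
        assert result is expected, f"{password!r}: expected {expected}, got {result}"

if __name__ == "__main__":
    run_checks()
    print("all checks passed")
```

The same table-driven pattern maps directly onto common test frameworks (for example, pytest’s parametrized tests), which is what lets automation scale with the team instead of multiplying one-off scripts.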
Future-proof QA also includes deep visibility into QA metrics and results and the ability to share compelling insights across the team and with executive audiences. This builds on actionable analytics that feed real-time optimization and ongoing learning into software quality. It also supports effective budgeting, risk management and payback validation for quality investments.
While change is inevitable, thoughtful leadership can build a resilient QA team that’s ready to face potential threats and seize opportunities.