Working as the Project Manager (and also the Technical Lead) of the qTrace team has been a satisfying and eye-opening experience for me. One of the things I've learned is how testers work in an agile team, and this post is my first attempt to share that experience. I'm not going to make generalizations about what agile testing is and how it ought to be done; instead, I will tell you how we actually do it in our team and why it works for us.
First, I should elaborate on why I think the qTrace team is an agile team, for the word "agile" has been used in so many contexts that one can hardly know which agile is which anymore. (At some point we probably need to distinguish between good agile and bad agile, but that might be another post.) Of course, it would be simpler to claim that since our team practices the values and principles of the Agile Manifesto, we must be an agile team. Yet I think some highlights of our actual practices establish a good context for what is discussed next. So here you go.
We are a small team in which developers, testers and managers work closely with each other: same room, open space, no cubicles
Our iterations are one week long, with one internal release every week and one public release roughly every month
We hold a daily meeting, although we regularly poke at each other for quick conversations throughout the day
We write minimal user stories based on which developers code and testers test
Whatever documentation we write is of immediate use, or we don't write it (yet)
We embrace change, allowing it to happen at any time (although occasionally we do have to defer a change until the next iteration)
We take code quality, good design (SOLID) and refactoring seriously (I used to be the code police, smelling every line of code; now that this job belongs to the new Tech Lead, at times I still can't resist the temptation of sneaking into the code and providing feedback to the team)
While we don't achieve 100% unit test coverage, we do write comprehensive automated unit tests for the complicated parts of the system
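To give a flavor of what "comprehensive unit tests for the complicated parts" means, here is a small illustrative sketch. It is in Python purely for readability (qTrace itself isn't written in Python), and the helper function and its behavior are entirely hypothetical; the point is how tests pin down the edge cases of a tricky routine, not the routine itself:

```python
import unittest

def parse_step_description(raw):
    """Hypothetical helper: split a recorded step like 'Click | OK button'
    into an (action, target) pair, tolerating a missing target."""
    # Split only on the first '|' so targets may themselves contain '|'.
    parts = [p.strip() for p in raw.split("|", 1)]
    action = parts[0]
    target = parts[1] if len(parts) > 1 else ""
    return action, target

class ParseStepDescriptionTests(unittest.TestCase):
    def test_action_and_target(self):
        self.assertEqual(parse_step_description("Click | OK button"),
                         ("Click", "OK button"))

    def test_missing_target(self):
        # A step with no '|' still yields a well-formed pair.
        self.assertEqual(parse_step_description("Scroll"), ("Scroll", ""))

    def test_extra_separator_stays_in_target(self):
        # Only the first '|' splits; the rest belongs to the target.
        self.assertEqual(parse_step_description("Type | user | name"),
                         ("Type", "user | name"))

if __name__ == "__main__":
    unittest.main()
```

The tests cover the happy path plus the two boundary cases most likely to bite later, which is roughly the bar we aim for on code that is complicated enough to deserve tests at all.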
So, that's the team. The rest of this post is devoted to the practices of our test team. Hopefully it will be informative and educational for testers who are (or will be) working in agile teams.
Our testers focus on delivering value to the customers.
Instead of trying to test the software in a rigid manner and report every single deviation from the user stories, our testers try to put themselves in the shoes of customers and ask "is this non-conformance necessarily a bug?" and "is this conformance necessarily NOT a bug?" That way, they don't just test the software against its specification; they also provide feedback on the fly to improve the specification itself. They understand that there's no point in complying with the spec if the end result doesn't benefit the customers. Some might ask: isn't it the job of product management to decide what features the system must have? True, but everybody can offer insight to help product management make a better decision. Testers are probably in the best position to provide that kind of insight, for they are essentially users of the software, very regular users indeed.
Whatever task they have to do, be it writing a document, preparing a test environment, or calling a meeting, our testers ask "is this going to add value to the team?" If not, they don't do it, for they already have plenty of important things to do (trust me, there's no shortage of testing to perform when you release software every week, including a public release every month).
Our testers don't write test cases! That might alarm some people, but it works for us. After all, what good is writing test cases days or weeks in advance if requirements or priorities could change any minute? Instead, our testers perform exploratory testing, in which they design and execute tests on the fly. Bad for repeatability? Probably, but we decided that this is a luxury we can't afford and, luckily, don't need, since our testers so far have no problem repeating their own testing. What if there are new testers; will test cases be helpful then? Again, probably, but we try to hire the best testers we can, and we think such testers generally don't want to test based on test cases designed by someone else. At some point, when our team becomes much larger and our software much more complicated, a certain amount of test cases might be useful, but that point isn't now, and we generally don't worry much about things that aren't happening now. (Developer audiences should quickly recognize the spirit of YAGNI, here in the context of speculative test design.)
Quick feedback loop
The conventional testing workflow goes like this: testers test and submit bugs with as much detail as possible; developers read the bug reports and try to reproduce and fix those bugs, with little communication in between. While our testers do try to submit self-explanatory bug reports, they are also quick to engage in direct conversation with developers. After all, that's why we employ an open-space, no-cubicle project setting. No amount of detail in a bug report can beat a 30-second conversation to clarify and agree on things.
Also, instead of employing a separate, sophisticated bug tracking system, our testers submit bugs to the same system that keeps track of user stories. That way, nobody has to access multiple systems to stay on top of everything that needs implementing, fixing, and testing, and everybody is aware of what everybody else is doing and can provide support as needed. (For me, it's pretty nice to have a single dashboard showing all the completed and remaining tasks in an iteration.) Besides, we treat each bug like a user story, so bugs are estimated, prioritized, and planned for a release just like user stories. That simplifies our process significantly.
Unlike many teams in which testers are treated like consultants providing a "testing service" to the team, testers are a true part of our team. Their insights and value are simply too great not to be utilized all the time. In our team, testers are involved in activities that they wouldn't be in a different team.
For example, our testers are present in refactoring sessions! Why? Because they know the quality of our software better than anyone else. They know which bugs have been closed and reopened several times (this usually points directly to a badly written piece of code). They know which bugs might cause deep frustration to the customers (this helps prioritize refactoring). They know which problems happened in one previous build but not another (this gives developers clues for finding root causes). (In fact, even if they just sit there saying nothing, their presence alone ensures that no developer can claim his code doesn't need any improvement.) All these insights certainly help our developers do a better job of improving their code.
We engage testers in the release planning and estimation process, not only because they need to understand the features well, but also because we need their input to improve the stories, prioritize them, and estimate them. We engage testers in the daily meeting, for they can quickly raise concerns about any critical bugs or risks without having to wait a week or go through some kind of formal reporting channel. If testers know something important, everybody else should know it as soon as possible.
Alright, that's pretty much everything I wanted to convey in this post. I'm sure the list above isn't exhaustive, and it's hard to capture every single practice our test team employs in one post, but it should be a good start. I'm also aware that this isn't meant to be some sort of definitive guide to agile testing, just a bunch of things our team embraces that have worked well for us. Like any other agile team, continuous learning is built into our DNA (of course, working in an "agile team" doesn't suddenly make you able to learn continuously; more likely it's because we had the right people in the first place that we could become an agile team), and we surely need to keep learning, adapting, and improving. If you have feedback or ideas, please share them in the comments.