Get Out of Excel Test Management Hell!


Test Management in Excel is Hell!

It’s time to get real: using Excel for test management, documentation and reporting on testing activity can be an extremely cumbersome, inefficient, time-consuming and at times frustrating process. It can be Hell!


Perhaps this sounds familiar…

It seems like such a good idea at first. You want to understand what is happening in testing, to get your arms around it. So you create a spreadsheet. After all, Excel is quick and easy to set up – oh, and everyone has it… plus it’s “free”. Between this and getting Google Drive set up for collaborating on, storing and tracking testing artifacts, we’ll be good to go.

But new features bring new spreadsheets. Each new release involves copying all the spreadsheets, putting them in a folder to mark the current release, then going into each spreadsheet and checking them off. Tracking who is doing what, of course, takes another spreadsheet, plus possibly something to track bugs.

Soon you are drowning in spreadsheets. The very thing that was supposed to create order has created the haystack syndrome, where everything you need to know is in a spreadsheet … if you can only find it.

Before you know it you are in Excel Hell: the intersection of testing documentation burden and time pressure. At this point, testers are spending more time working around the documentation and spreadsheets than they are testing software.

But wait, most testing teams know this.


Teams are not in the dark about the inefficiencies of Excel for test management, yet they often stick with it instead of investing in a test case management tool or new processes.

Why is this? Let’s take a good look at one of the most popular reasons why testing teams use Excel:

Excel is quick and easy to set up, and if it becomes an issue we’ll figure out how to manage it all later.


Management says, “Let’s just get the ball rolling with Excel. We have work to do and we’ll sort out our options later.”

This is great and certainly a quick fix, but you can’t just start creating spreadsheets; there are lots of things to think through:

  • Creating templates and deciding formats for Requirements, Test Scenarios, Test Cases, Defects, Traceability Matrix, Reports, etc.
  • Identifying the acceptable values for fields such as Test Execution Status (Passed vs. Fail, Unexecuted vs. No Run, etc.) and Importance (High, Medium, Low) – a small sketch of codifying these conventions follows this list.
  • Deciding on naming conventions across testing artifacts.
  • Figuring out what format screenshots should be in (JPEG, embedded in Word, etc.) and how often teams should capture their screens – with every test case, or only when they encounter defects?
  • Zeroing in on a folder structure and setting up a shared/cloud drive – most of the time on Google Drive or SharePoint.
  • Putting processes in place and creating documentation around them. (Which documents should be updated, by whom, and how often? When should a task start? How is progress tracked, and which report template is used? etc.)
  • Communicating your defect reporting and tracking templates to your development team and getting their ‘go-ahead’ via email or chat.
  • Creating report generation templates after gathering input from stakeholders on which charts and information they want to receive periodically.
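To make that second bullet concrete, here is a minimal, purely illustrative sketch (in Python; none of these names come from any particular tool) of what “deciding acceptable values” looks like once it is actually written down. In Excel, these conventions usually live only in people’s heads or on a README tab, and nothing stops a tester from typing “pass”, “done” or “ok” into the status column.

```python
# Minimal sketch (illustrative only): codifying the allowed field values that an
# Excel-based process relies on people remembering. All names here are assumptions.
from enum import Enum


class ExecutionStatus(Enum):
    PASSED = "Passed"
    FAILED = "Failed"
    NO_RUN = "No Run"      # the team has to pick one label: "No Run" vs. "Unexecuted"
    BLOCKED = "Blocked"


class Importance(Enum):
    HIGH = "High"
    MEDIUM = "Medium"
    LOW = "Low"


def parse_status(raw: str) -> ExecutionStatus:
    """Reject the ad-hoc spellings ('pass', ' PASSED ', 'done') that creep into spreadsheets."""
    normalized = raw.strip().title()
    for status in ExecutionStatus:
        if status.value == normalized:
            return status
    raise ValueError(f"{raw!r} is not an allowed execution status")


print(parse_status("  passed "))   # ExecutionStatus.PASSED
print(parse_status("no run"))      # ExecutionStatus.NO_RUN
```

A test case management tool enforces this kind of rule for you through dropdowns and required fields; in Excel, data validation has to be set up and maintained by hand in every copy of every sheet.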

Most of the tasks listed above, such as deciding on naming conventions and folder structures, communicating regularly with development teams, and establishing and documenting processes, are essential to managing testing for any Release or Sprint. However, all of the spreadsheet planning, building, tracking and storing… then the ongoing maintenance and management of those spreadsheets… it’s time-consuming, inefficient and slows down the testing process.

Soon, testers are spending more time managing testing documentation than testing.

Let’s take a look at how the “quick and easy to use” mindset gets testing teams into documentation hell.

Documentation Hell


Testing teams are no strangers to the monotonous, time-intensive daily grind of using Excel to manage testing documentation. But if we take a closer look, we can see how using Excel creates problems and project-wide inefficiencies. These inefficiencies slow down the testing process, create bottlenecks and end up becoming a huge time suck – it can become Hell!

Here are some places where testing documentation feels the Excel pain:

Test design: QA teams normally use at least one Excel document or sheet each to track requirements, test cases, test runs, a traceability matrix and defects. Even for the smallest project, this means five different Excel sheets per release. When you split these sheets per module or per tester, or as releases keep growing in number, you are faced with an overwhelming number of documents.

Test case maintenance: If a particular requirement changes over a few releases, it is hard to accurately trace which test cases have to be modified, because traceability across multiple releases is often missing – it is simply too difficult to maintain by hand. As a result, your test cases might not be up to date at all times.

Questionable integrity of documents: Since documents are changed by multiple people, multiple times a day, there is no guarantee that a document contains the correct information at any given moment.

Prone to human errors: Unintentional changes or disruptions to formatting and computational logic happen all too often, rendering a document inaccurate or even useless. These spreadsheets are highly fragile documents, and a lot can go wrong due to:

  • Reliance on folder tree structures for organization and archival.
  • Coloring, grouping, merging etc. used as primary ways to categorize and track status.
  • Filtering and templates with macros/pivot tables used for report generation.

When human error occurs, time is wasted on troubleshooting and debugging, and while that work is happening the artifact is unavailable for use, causing a loss of productivity. Excel offers little control over folders, fields or the scope of a user’s data entry, so there is nothing to mitigate these errors. Process adherence becomes merely a guideline: there is no way to enforce the rules, because Excel lacks real permissions and user-based roles.

Test suites for execution: Since multi-release traceability is extremely complicated to achieve across multiple cross-referenced documents, regression testing is never complete. Not knowing which test cases or modules are impacted by a requirement revision creates gaps in the testing effort and prevents you from ever reaching 100% test coverage.
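As a rough, hypothetical illustration of the question a traceability matrix exists to answer, the sketch below models the requirement-to-test-case mapping as a plain dictionary and asks which test cases are impacted when a requirement changes. The requirement and test case IDs are made up; the point is that spreadsheets hold exactly this mapping, but scattered across documents and releases where nobody can query it quickly or trust it to be complete.

```python
# Hypothetical data: what a traceability matrix captures, as a simple mapping.
# In Excel this lives in a separate sheet per release and goes stale quickly.
traceability = {
    "REQ-10": ["TC-101", "TC-102"],
    "REQ-11": ["TC-102", "TC-110"],
    "REQ-12": ["TC-110", "TC-111", "TC-112"],
}


def impacted_test_cases(changed_requirements):
    """Which test cases must be re-run because a requirement was revised?"""
    impacted = set()
    for req in changed_requirements:
        impacted.update(traceability.get(req, []))
    return sorted(impacted)


def orphan_test_cases(all_test_cases):
    """Test cases that trace back to no requirement at all: a coverage red flag."""
    covered = {tc for tcs in traceability.values() for tc in tcs}
    return sorted(set(all_test_cases) - covered)


print(impacted_test_cases(["REQ-12"]))  # ['TC-110', 'TC-111', 'TC-112']
print(orphan_test_cases(["TC-101", "TC-102", "TC-110", "TC-111", "TC-112", "TC-199"]))  # ['TC-199']
```

When this mapping is accurate and in one queryable place, planning a regression suite is a lookup. When it is spread across per-release spreadsheets, answering “what do I re-test for REQ-12?” means opening every one of them.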

Lack of centralization: Too many folders, too many documents and too many sheets all over the place. It takes a couple of minutes every day just to find what to do next and which document is the right one to use. And to point to a particular area within a document, references such as cell 5RX2C are used, which make no intuitive sense.

While testers are busy managing documentation via spreadsheets and Google Drive, managers carry a different burden and end up in reporting Hell.

Reporting Hell

Managers don’t have to use Excel to perform individual pieces of work. Instead, they want to use the spreadsheets to track the progress of the test project, or of the testers, likely against a plan.

A single tester on a single product can provide a status report on what’s known and what’s left to cover – but that won’t work for a larger team, for multiple projects, or even for a single tester where regression testing takes more than two or three days.

That higher-level view, the pulse of the project, takes a combination of reports emailed in, checking the bug reporting system, and plumbing the depths of Excel files stored in different places for different teams. Some managers create status sheets so team members can mark off what they have ‘done’ – and in many cases these special status spreadsheets differ from the actual marks in the folder for the release, which in turn differ from what the tester actually performed.

Let’s look at one of the most basic, standard reports for managing and assessing testing projects – the test status report.

While status reporting is foundational throughout the testing process, it becomes even more important during the test execution stage of a project. Producing even a basic report to track and assess test execution status and progress in Excel requires test managers and leads to spend a substantial amount of time digging through test case execution documents, parsing data, building pivot tables, writing macros and applying filters.
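To give a sense of how mechanical that consolidation work is, here is a rough sketch of what it might look like if scripted with pandas. The folder name, file naming convention, sheet name and column names are all assumptions made for illustration; most teams do this step by hand with copy-paste, pivot tables and filters instead.

```python
# Rough sketch: consolidating per-tester execution workbooks into one summary.
# Assumes each workbook has a "Test Runs" sheet with 'Module' and 'Status'
# columns and follows an assumed naming convention -- purely illustrative.
import glob

import pandas as pd

frames = []
for path in glob.glob("release-4.2/*_test_runs.xlsx"):
    df = pd.read_excel(path, sheet_name="Test Runs")
    df["Source"] = path          # remember which tester's workbook each row came from
    frames.append(df)

all_runs = pd.concat(frames, ignore_index=True)

# Executions per module and status: the raw material for a status report.
summary = all_runs.groupby(["Module", "Status"]).size().unstack(fill_value=0)
print(summary)
```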

Assuming there were no human errors or breaks in communication about updated document versions along the way, all of this raw, individual test case execution data, once consolidated, looks something like the image below.

[Image: consolidated test case execution data in Excel]

ALL that time and effort just to collect the data and get it organized into a format from which you can finally build the report. Usually this data is entered manually into a template to generate the report, and the resulting reports could look like this:

[Image: sample test status report built from the consolidated data]

The above report looks great, and it is, but it falls short because these charts and tables are the ONLY analytics available. If you want to tailor a report like this to show status or defect density per requirement, platform or module, you will have to build it from scratch.

Taking it a bit deeper, let’s say you wanted to see a report for a release tested a few months back. Chances are you don’t have, or can’t find, that information – at least not easily and quickly.

The point is that all of this data is time-intensive and cumbersome to manage, prone to human error, and hard to collect in the first place. The team as a whole wastes time and resources on coordination and management efforts that can amount to 15% of the entire test timeline.

This problem tends to overrun projects and delay releases because of the additional micromanaging, follow-up, data entry, spreadsheet troubleshooting and debugging, and cleanup activities that are not accounted for in the effort estimates.

Ok, we get it.

So you’re saying that we need to get a test management tool? Well, that will certainly help, and here’s a great list of options from softwaretestinghelp.com. But, as Grady Booch put it, “A fool with a tool is still a fool.”


As the popular saying goes, tools don’t solve problems by themselves; people do.

Realistic expectations, combined with the small process and team operational changes listed below, can really set you up for success.

  • Tools do NOT eliminate processes or create them: Tools cannot tell you how to do your work, but once you get to it, they can help you manage it better. How you want to review, which reports to generate and how often, who defects should be assigned to, how severity is set on a defect, and so on are all things that have to be defined by your QA process.
  • Activity tracking: Teams should also not expect zero activity tracking; it is still part of how we do our work, only it becomes easier, less burdensome and centralized.
  • Rules of tool usage have to be defined: Take the time to decide a few operational details, such as who needs access, whether teams should have delete permissions, and whether assets should be imported or created directly, taking into account who is going to use the tool, for what and how. The tool cannot know your needs and preset these for you and your team.

Once again, it’s time to get real. Managing test cases, documentation and testing activity data in Excel is a very cumbersome, inefficient, time-consuming and at times frustrating process; it’s Hell!

Choosing a test management tool, updating test management processes and thinking differently will help testing teams start marching out of Excel Hell.

Learn more about Excel Hell.

This post has contributions from Swati Seela and Justin Rohrman.

Need help selling the value of software testing to your executive team? Download our free guide – Executive Value Guide: Making the Case for Modern Software Testing Tools.
