Agile Testing

Over the last year, I’ve been pursuing the question of “where do testers add the most value on agile projects?”  I presented some of what I learned at PNSQC 2004 (see my paper, “Agility for Testers”).  I figure it’s time to write a bit more about what I’ve learned.

The more time I spend with Agile teams, the more I like Brian Marick’s division of testing activities into business-facing and technology-facing.  In particular, I find when working with an Agile team that I’m doing one of two things at any given time:

  1. Augmenting the programmer test effort (technology-facing)
  2. Augmenting the customer acceptance test effort (business-facing)

Note that neither of these activities involves independently assessing the software myself.  That’s an important difference between testing on a traditional project and testing on an agile project.

What’s Different?

On traditional projects, folks with Quality somewhere in their title (Quality Assurance, Quality Engineers, et al.) perform Independent Verification and Validation (IV&V) activities to assess the quality of the system.  Often these teams also review design artifacts.  Sometimes they also have a hand in defining and/or enforcing the process by which the software is made.

Agile project teams generally reject the notion that they need an independent group to assess their work products or enforce their process.  They value the information that testing provides and they value testing activities highly.  Indeed, Extreme Programming (XP) teams value testing so much, they practice Test-Driven Development (TDD), writing and executing test code before writing the production code to pass the tests.  However, even though agile teams value testing, they don’t always value testers.  And they’re particularly allergic to the auditing or policing aspects of heavyweight, formal QA. 

So how can testers make themselves useful on a team that does not see much use in traditional, formal QA methodologies?  Here’s what I’ve been doing.

Augmenting Programmer Testing

When augmenting programmer testing, I support the programmers in creating the software. I sit in the bullpen with them. Sometimes I work solo; sometimes I pair with a programmer.

If the programmers are practicing XP, it’s a given that they have an extensive set of unit tests.  My role is not to duplicate their unit test effort.  Nor is my role to do the programmers’ unit testing for them.  Instead, most of my work involves manual exploratory testing to discover important information about the software that the unit tests failed to reveal.

In order to do this, I have to:

  • Get and build the latest source code.
  • Run all the unit tests to verify I’m starting from a “known good” place.
  • Run the application (usually locally from the IDE).
  • If needed, write setup code to make it possible to test conditions that I cannot set up through end-user-accessible interfaces (see the sketch after this list).  So while I say that I’m doing “manual” exploratory testing, it’s usually programming-assisted manual testing.
  • Occasionally add to the automated unit or acceptance test suites.
  • Pair with programmers on testing or test-related programming tasks.
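
To make that setup-code point concrete, here’s a minimal sketch of the kind of throwaway harness I’m talking about.  Everything in it (the Order class, its fields, the awkward values) is made up purely for illustration; on a real project I’d drive the application’s own domain objects, service layer, or database instead.

    // Illustrative only: a throwaway harness for seeding hard-to-reach test data.
    // "Order" and its fields are hypothetical stand-ins for the application's
    // own domain objects.
    public class ExploratoryTestSetup {

        static class Order {
            String customerName;
            int quantity;
            Order(String customerName, int quantity) {
                this.customerName = customerName;
                this.quantity = quantity;
            }
        }

        public static void main(String[] args) {
            // Conditions the end-user interface won't normally let me create:
            Order[] awkwardCases = {
                new Order("", 1),                    // empty customer name
                new Order("O'Brien & Söhne", 0),     // zero quantity, tricky characters
                new Order(longName(256), -1)         // oversized name, negative quantity
            };

            // In a real session I'd push these into the application's persistence
            // layer or service API here, then explore the UI against them by hand.
            for (int i = 0; i < awkwardCases.length; i++) {
                System.out.println("Seeded: '" + awkwardCases[i].customerName
                        + "', quantity " + awkwardCases[i].quantity);
            }
        }

        private static String longName(int length) {
            StringBuffer name = new StringBuffer();
            for (int i = 0; i < length; i++) {
                name.append('x');
            }
            return name.toString();
        }
    }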

Unlike unit tests that use mocks and stubs to isolate the code under test, my exploratory testing is as end-to-end as possible.  As a result, I’m usually able to find issues and risks that the unit tests don’t reveal.  And I’m finding them sooner than we would if we waited for the customer to try things end-to-end.  (Early feedback is good.)  In every instance where I’ve done this, I’ve found serious bugs that the unit tests didn’t catch.  In doing so, I not only help the programmers improve the software, I also help them improve their unit tests.
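
If you haven’t seen the kind of isolation I’m contrasting against, here’s a minimal sketch of a unit test that stubs out a collaborator.  The TaxService and Invoice types are invented for illustration; the point is that a test like this deliberately never touches the real service, the database, or the UI, which is exactly the gap end-to-end exploration covers.

    // Illustrative only: "isolating the code under test" with a hand-rolled stub.
    // TaxService and Invoice are hypothetical.
    import junit.framework.TestCase;

    public class InvoiceTotalTest extends TestCase {

        interface TaxService {
            double taxFor(double amount);
        }

        static class Invoice {
            private final TaxService taxService;
            Invoice(TaxService taxService) { this.taxService = taxService; }
            double totalFor(double subtotal) { return subtotal + taxService.taxFor(subtotal); }
        }

        public void testTotalAddsTaxFromTheTaxService() {
            // A canned answer stands in for the real tax service.
            TaxService stubTax = new TaxService() {
                public double taxFor(double amount) { return 1.00; }
            };
            assertEquals(11.00, new Invoice(stubTax).totalFor(10.00), 0.001);
        }
    }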

Augmenting Customer Testing

Here I’m using the word “Customer” in the XP sense of the term: the set of people who represent the business-facing stakeholders on a project.  I’m not referring to the customers who pay for the software.

When augmenting customer testing, I support the business-facing decision makers by helping them define and execute acceptance tests. Specifically, I:

  • Walk through the existing software with them, using an interview-style conversation to surface assumptions and expectations.
  • Use the information from walkthroughs and other sources to design and articulate acceptance tests.
  • Use a variety of analysis techniques to discover risks and implications of decisions.
  • Help the customers and programmers define what “good enough” means for their context.
  • Execute acceptance tests manually where needed.
  • Automate acceptance tests, either by automating them myself or by working with a programmer or an existing automation engineer who has the domain skills (see the sketch after this list).
  • Provide metrics or other high level data as needed to help the XP Customer satisfy his management’s need for numbers.
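
As an example of the automation piece, here’s roughly what one automated acceptance test might look like once a customer expectation has been pinned down.  The free-shipping rule and the ShippingCalculator class are hypothetical, and on a real project the test would exercise the team’s own code (and might live in a table-driven framework rather than JUnit).  The shape is what matters: it states the customer’s expectation in the customer’s terms.

    // Illustrative only: one customer expectation captured as an automated check.
    // The business rule ("orders of $50 or more ship free") and ShippingCalculator
    // are hypothetical.
    import junit.framework.TestCase;

    public class FreeShippingAcceptanceTest extends TestCase {

        // Stand-in for the real production class the test would normally exercise.
        static class ShippingCalculator {
            double shippingChargeFor(double orderTotal) {
                return orderTotal >= 50.00 ? 0.00 : 4.95;
            }
        }

        public void testOrdersOfFiftyDollarsOrMoreShipFree() {
            assertEquals(0.00, new ShippingCalculator().shippingChargeFor(50.00), 0.001);
        }

        public void testOrdersUnderFiftyDollarsPayStandardShipping() {
            assertEquals(4.95, new ShippingCalculator().shippingChargeFor(49.99), 0.001);
        }
    }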

In this second role, I spend as much of my time facilitating communication and clarifying expectations as I do designing and implementing tests.

What Does This Mean for the Professional Tester?

I’m assuming that anyone who considers themselves a professional tester already possesses solid testing skills and domain knowledge for their particular area.  If you don’t, you ought to consider working on those skills no matter what methodology your development team is following.

Now, if you’re already successful as a professional tester and you want to improve your ability to work with an agile team, consider the two dimensions Brian Marick identified, technology-facing and business-facing.

Technology-facing: To augment programmer testing, you’re going to have to be comfortable mucking about in source code. 

If you haven’t coded in a long time (or ever), that means more than just learning Java or whatever language your organization uses.  You’ll also have to be comfortable:

  • Working in the development environment the developers are using, whether that’s Visual Studio, Eclipse, IntelliJ, or something else. 
  • Fetching and building the latest code from the source control system.
  • Using the test frameworks the developers are using, such as JUnit or NUnit (see the sketch after this list).
  • Configuring your own system and setting up your own data, which may mean learning more about operating systems, networks, and databases.
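
If you’ve never used one of these frameworks, the barrier to entry is lower than it looks.  Here’s roughly the smallest useful test, written in JUnit 3.x style (NUnit in C# is closely analogous); the behavior it checks is a trivial made-up example, but the shape is what you’ll see everywhere.

    // Illustrative only: the smallest useful JUnit (3.x-style) test case.
    import junit.framework.TestCase;

    public class FirstTesterWrittenTest extends TestCase {

        public void testTrimRemovesSurroundingWhitespace() {
            // Any small behavior will do for a first test; the goal is getting
            // comfortable writing, running, and reading tests in the same
            // framework the programmers use.
            assertEquals("agile", "  agile  ".trim());
        }
    }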

In short, technology-facing testers become members of the development team.  You’ll need to grow your technical skills accordingly.

Business-facing: To help business stakeholders articulate their needs, you’re going to end up doing a whole lot of requirements elicitation.  Business-facing testing is as much about distilling requirements and testing assumptions as it is about testing software. 

There’s an entire body of knowledge on requirements analysis, design analysis, and modeling that can help.  Consider brushing up on UML.  Read Gause and Weinberg’s Exploring Requirements: Quality Before Design.  Learn how to be an effective interviewer and facilitator.  In short, grow both your analysis skills and soft skills.

Finally, no matter whether focusing on technology-facing or business-facing testing, never, ever forget that testing is a means to an end, not an end in and of itself.  Testers are successful only when we provide useful, accurate information that our stakeholders can use to improve the outcome of a project.