Thinking Outside the Explicitly Defined Acceptance Criteria Box

When interviewing candidates for testing positions, I often describe a program, then ask the candidate to tell me how they would test it.  I recall one candidate who came up with a couple test cases, then sat back in his chair and said, with a relaxed smile on his face, “and that’s that.”

“That’s that?” I asked.

“Yup,” he replied.  “That’s all I’d need to test.”

He didn’t get the job.  Complacency is dangerous in a tester.

I expect testers to think about the various ways the software might be used or misused.  And I look for that knack when I interview.

A couple test cases just can’t cover all the potential variations in data, sequences, timing, configurations, and operating conditions that might affect the behavior of the software.  A tester who brushes his hands together and declares, “That’s enough” after two test cases isn’t even aware there could be more.  If he’d made an agonized face and said, “There should be more to test, but I can’t think of what it is right now,” that would have been better than a self-satisfied “that’s that.”
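To make the point about variation concrete, here’s a minimal, hypothetical sketch in Python.  The function parse_quantity, its 1–99 limit, and the test names are all invented for illustration; the point is only the shape of the suite.  The first test is the “that’s that” check that satisfies the obvious acceptance criterion, and the rest probe the input variations and careless typing that real use invites.

    # Hypothetical example: everything here (parse_quantity, its 1-99 limit,
    # the test names) is invented for illustration, not taken from the post.
    import pytest

    def parse_quantity(text: str) -> int:
        """Parse a user-entered quantity, tolerating surrounding whitespace."""
        value = int(text.strip())  # raises ValueError for "three", "3.5", ""
        if not 1 <= value <= 99:
            raise ValueError(f"quantity out of range: {value}")
        return value

    # The "that's that" test: the one happy path the acceptance criterion names.
    def test_accepts_a_plain_number():
        assert parse_quantity("3") == 3

    # The tests a complacent tester never writes: data variations and misuse.
    @pytest.mark.parametrize("text", ["0", "100", "-1", "3.5", "three", "", "  "])
    def test_rejects_input_outside_the_contract(text):
        with pytest.raises(ValueError):
            parse_quantity(text)

    def test_tolerates_whitespace_the_way_real_users_type():
        assert parse_quantity("  7 \n") == 7

There’s nothing exotic about the parametrized cases; they simply ask, for each input a real user might plausibly type, “and what happens then?”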

Out in the real world, real users will do things even the most diabolical tester would be hard-pressed to imagine.  They’ll try to “undo” actions by hitting back or refresh in the browser; they’ll move message windows all the way off the screen “to get them out of the way” and then leave them there; and they’ll power down the machine to silence an “annoying” (but critical) alert.  A tester who can’t think beyond the most mundane tests has no hope of discovering how the software might behave when subjected to such treatment.

At best, a “that’s that” tester will provide extremely limited information about how the software will behave under optimal conditions.  Unfortunately, it’s all too likely that such superficial testing will give the team a false sense of confidence.  In the worst-case scenario, the team will ship seriously flawed software with absolutely no idea how badly it will fail in the field.  It’s a disaster waiting to happen.

Unfortunately, there are some people who think that the entire role of a tester is to verify that the implementation matches the written requirements and/or specifications.  Whether or not they realize it, such people are advocating “that’s that” testing as general policy.

I wish I could say that all such people I’ve encountered were old-school traditionalists who also think Big Design Up Front is super cool.  Unfortunately, there are plenty of Agilists who think testers should stick to executing overly simplistic Acceptance Tests.

“If it isn’t part of the Story Acceptance Criteria,” they say, “we shouldn’t be testing for it.  It’s out of scope.”

When they’re feeling generous, they add, “Make a new Story for it.”

It seems to me that such an approach to Agile development isn’t terribly Agile.

Agilists value working software over comprehensive documentation.  Tests tell us how well the software works under various conditions.  So it’s not surprising that all Agilists I’ve spoken to claim to value testing.  And I believe that most actually do.  Unfortunately, some only value those tests that demonstrate the software “works.”

So to those who think tests should only show that the software “works,” I ask: If you limit the scope of testing to demonstrating the software “works,” how are you going to discover the circumstances under which it doesn’t?

3 Responses to Thinking Outside the Explicitly Defined Acceptance Criteria Box

  1. Jared January 2, 2007 at 11:24 pm

    My comment to developers has been that if they see their job as simply implementing what’s on the card, their job might as well go to the cheapest offshore supplier 🙂

  2. José Alejandro Betancur January 4, 2007 at 7:35 am

    “Complacency is dangerous in a tester.”

    I agree with you. I hate it when a tester thinks he has created all the possible test cases or scenarios for a use case, and then you read them and find that every line is just a vague idea of what you need to test.

  3. Michael Bolton February 27, 2007 at 8:32 pm

    Short and sweet: amen.

    —Michael B.