New Public Classes Announced

Shameless plug alert!

Dale Emery and I are offering the 3-day Agile Testing Series of classes in Pleasanton, CA on April 22 – 24, 2009, and in Portland, OR on April 28 – 30, 2009.

We’ve structured the three days as a la carte offerings. You can take one, two, or all three days.

We have discounted pricing right now: early bird pricing is just $499/day instead of the usual $599. In addition, if you register 2 or more people for the same class simultaneously, you can take advantage of our 20% group discount. And we’re offering a “Buy 3 get 1 free” Guest Pass offer where if you buy the whole series, you’ll get a Guest Pass to give to a friend or colleague so they can join in the fun (some restrictions apply, see terms and conditions for details).

And if that’s not enough, I’ve created a limited coupon offer for my blog readers. The first 5 people who register for each class using the coupon code “testobsessed” will receive an extra 10% discount on the class fee.

The #notagile Twitter Stream

It had already been a frustrating morning. Too little sleep, and too little coffee. Squabbling kids. Traffic. Parents who didn’t know how to use the carpool lane at school and just stopped in the middle of the road, holding up traffic for 5 minutes while Junior fetched his lunchbox from the depths of the giant SUV. By the time I got to work, I had my cranky pants on.

And so when I read through a series of messages in some online forum or another in which some poor newbie was asking questions about Agile that indicated that they Just Didn’t Get It, I had to bite my tongue. I was not going to take out my frustrations with my morning commute on some poor unsuspecting person asking a perfectly legitimate, if naive, question.

But I couldn’t just ignore it. The wrongness of the assumptions in the message nagged at me, distracted me, kept me from focusing on the stack of work I intended to do.

So I wrote the following on Twitter:

If the org has 1-1 map btwn old vocab & new (e.g. WBS==sprint backlog, PM==Scrum Mastr, status meeting==Daily Scrum) they’re doin’ it wrong.

Turns out that was just the beginning. It seems there’s no end of ideas about how NOT to do Agile. I soon found myself writing about misinterpretations of TDD:

If the org thinks TDD means the testers write test documents before coding starts, they’re doin’ it wrong.

If the devs think TDD means writing *all* the unit tests in advance, and only then writing the production code, they’re doin’ it wrong.

@qualityfrog picked up on that with his contribution:

If those without title ‘developer’ aren’t included as team members, they’re doing it wrong.

Then @woodybrood suggested I should tag them. Thus, the #notagile tag was born. My most recent #notagile entries:

If an Agile team’s typical cycle time for critically important PO-requested fixes is measured in months, they’re doin’ it wrong. #notagile

If the Agile process has a reqs sprint and a design sprint and a code sprint and a test sprint, they’re doin’ it wrong. #notagile

I’m finding that having a place to put How Not to do Agile ideas is helping me. It’s cathartic. I no longer find myself distracted by the urge to write long diatribes on my blog. And I’m pretty sure it’s lowering my blood pressure.

Follow along or feel free to contribute your own to the Twitter stream (tag your tweet with #notagile).

Apparently TriMet Thinks I Can Walk on Water

I’m planning all the little logistical details for my upcoming trip to Portland, Oregon to talk with the XPDX group about ATDD. Details like how I’ll get from point A to point B.

The TriMet site for the fabulous light rail (which I’m hoping will make it possible for me to cancel that car reservation after all) provides an interactive trip planner. There’s just one small problem:

TriMet telling me to walk across the water

Apparently TriMet thinks I can walk on water. Alas, that would be incorrect. Methinks I’d better plan on taking the bridge.

OSTATLI Recap

On February 5, a group of us gathered in my office on First Street in Pleasanton for the first ever “Open Source Test Automation Tool Love In” (OSTATLI). Joining in were: Dale Emery, Jeffrey Fredrick, Kevin Lawrence, Dave Liebreich, Ken Pier, and Chris Sims.

Several folks have asked me to post the results of the meeting, and others have asked me how to host one in their area.

I’ve been pleasantly surprised by the strong response to OSTATLI. At its core, the gathering was just an excuse for like-minded folks to spend a day geeking out together. Perhaps the interest was because of the name?

In any case, I must confess that I find it a little hard to post “the” results for the meeting. It’s simply not the type of meeting that leads to a single set of results.

However, here’s my best attempt to provide a glimpse into the day from my perspective.

After our initial chitchat, and a brief moment in which I panicked while Chris convinced the wifi that it wanted to allow us access to the Internet, we set some intentions for the day.

OSTATLI intentions

Some were about exploring tools, like:

  • Play with Cucumber
  • Get feedback on FIT v. NUnit
  • Understand Selenium
  • A taste of RSpec
  • The cool tool that I’ve never heard of but should

Others were specific tasks, such as:

  • Get SafariWatir testing my local WordPress instance “the right way”
  • Use Robot Framework to create an initial set of Acceptance Tests for “VDL”

And we captured a list of the tools represented in the room, either tools about which someone was curious or tools around which someone had expertise:

List of Tools

We spent the rest of the morning on projects related to our stated intentions, working in small groups. After lunch, we gathered around the projector for a demo by Ken of a custom test harness that SocialText has written around Selenium. Then Kevin did an impromptu demo of Cubic Test, and we all oohed and aahed over the graphical test representation.

A few folks had to leave early, and the rest of us talked for a while in the office, until we remembered that there’s a pub around the corner that would allow us to talk about geek stuff with a beer in our hands.

Overall, I think it was a fabulous success, tons of fun, and I am most grateful to all the participants and to everyone who has expressed interest.

For another perspective on OSTATLI, see Dave Liebreich’s account.

Also, I’m delighted that the idea is catching on. Al Snow is already planning a similar kind of meeting in the Atlanta area.

Community Help on a Marketing Message?

I’m in the process of writing marketing copy for a series of public classes that Dale Emery and I will be offering on April 22 – 24 in Pleasanton, CA and April 28 – 30 in Portland, OR.

(I am also still working on getting the registration system to bend to my will, so you can’t register just yet. I’ll post something here when the registration system finally goes live…soon, soon…)


I was wondering if I could impose on the collective wisdom of the general community to help me hone my message?

First, some background…

The offering is a 3-day series of Agile Testing-related classes.

  • Day 1 is “Adapting to Agile,” also known as The WordCount Simulation.
  • Day 2 is “Acceptance Test Driven Development (ATDD) in Practice.”
  • Day 3 is “Exploratory Testing in an Agile Context.”

The classes can be taken individually, or in combination.

From a marketing perspective, I want to include a little Question and Answer style blurb about what days someone should plan to take. For example:

I’m in a QA/QE/Testing group, and my organization is adopting Agile. This is all new to me. Where should I start?

We strongly recommend that you take all three classes in this series.

The first day, “Adapting to Agile,” will give you an opportunity to experience an Agile transformation and see how the whole team (not just testers) adapts testing-related activities as the context changes. Along the way, you’ll learn how test activities can support Agility by increasing visibility and feedback. And you’ll learn how to spot waste and focus on customer value in testing.

The second day, “Acceptance Test Driven Development (ATDD) in Practice,” will give you an understanding of how test specialists can contribute throughout the development cycle. It also addresses concerns you might have about how you’ll derive tests without requirements documents.

The third day, “Exploratory Testing in an Agile Context,” will teach you how to apply Session-Based Exploratory Testing as part of a sprint or iteration. It will also address concerns you might have about how an Agile team tests for the risks and vulnerabilities that are not covered by the automated tests.

And as long as I’m doing that, I wanted to extend the idea to include questions and answers aimed at convincing team members that Agile Testing classes aren’t just for testers. To that end, I wrote the following:

I’m a Developer. Are these classes just for testers?

Nope! We want you to participate! Please join us!

As a Developer on an Agile team, you contribute a great deal to the testing-related activities. These classes will help you learn how to collaborate with testers and business stakeholders on various testing-related activities to ensure that the whole team is getting the feedback they need to keep the project moving forward. These classes also might help you convince other people in your organization that testing activities are a shared responsibility on an Agile team.

I’m a Business Analyst/Product Owner/XP Customer. Should I come to these classes? If so, which one?

If you are responsible for defining what the software should do on an Agile project, then you are also ultimately responsible for accepting the software. And yet you don’t have time to test it thoroughly by yourself. You need the help and support of the technical team to be sure when you accept software that it meets your expectations. The practice of Acceptance Test Driven Development is particularly important for that. So if you can come to only one day of these classes, we recommend you come for Day 2, the ATDD class.

If you can come to two days, we recommend that you also take the “Adapting to Agile” class because it will allow you to explore the connection between stories and acceptance tests in a microcosm.

Of course, if you can come for all three days we think you’ll find it very worthwhile. The third day on Exploratory Testing will give you ideas for ways to explore the emerging system to ensure that it really does meet your needs.

Really? I should sign up? But I’m not in QA, and I’m not a Tester. I’m a …

Please excuse us for interrupting.

We hear this a lot: “Great! You have an Agile Testing class! I’ll send my QA department!” There is an unfortunate implicit assumption that the only people who have to worry about testing are the designated Testers or QA people.

In an Agile context, everyone tests.

So no matter what role you play, if you have a role on an Agile team, you have some testing-related responsibilities. Whether you are an Architect, Developer, Database Designer, UI Designer, Technical Writer, Product Owner, Scrum Master, XP Customer, Tester, QA Manager, Quality Engineer, SDET, Build Meister, Configuration Manager, Team Lead, Test Lead, Development Manager, etc., if you’re working on an Agile team that delivers software, you need to know how testing activities in Agile help move the project forward. And you need to be prepared to play your part in ensuring that the software is adequately tested before calling it “Done.”

And by the way, if you are thinking, “I’m not a tester, so I don’t need these classes,” then you REALLY need these classes to find out what you’re missing.

Before I publish all this prose as part of the marketing materials, I would really like some community feedback. What do you think? Is it worth publishing this kind of thing? Is that last little bit too harsh? Too annoying? Is it convincing? What would make it more convincing?

Thanks for any feedback you care to give me.

How Much to Automate? Agile Changes the Equation

The subject of how much to automate, and the related topic of how to calculate the ROI for test automation, comes up on a regular basis. In fact, it popped up recently on a couple of the mailing lists I read.

Usually there’s at least one person arguing that test automation is expensive and that there are situations in which it just doesn’t make sense, so we should automate selectively. We should pick and choose, they say. We should automate wisely. We should automate only those tests where the investment is justified. The rest will stay manual, and that’s only sensible, they say.

I understand their concern.

In some contexts, particularly where there is a legacy code base that was created without automated tests, the cost to create and maintain each automated test is extraordinarily high.

Further, the value of those tests is often less than it could be.

The value in any test is in the information that it provides. But when many of the test failures are because the tests, and not the code, are wrong, the information provided by the whole suite of tests is deemed unreliable and untrustworthy. Information only has value to the extent that we can trust it.

Thus, the automated tests in that kind of context are both insanely expensive and low in value. Some years ago this was the norm. In many organizations, sadly, this is still the norm.

But it doesn’t have to be that way.

Successful Agile teams typically follow at least a subset of the XP development practices like TDD and Continuous Integration.

Oh, sure, you can be Agile without doing TDD. But by and large, effective Agile teams do at least some of the XP practices.

And teams that do practice TDD and ATDD wind up with large suites of automated tests as a side effect.

(Yes, it’s a side effect. Contrary to what some people think, TDD is a design technique, not a testing technique. We do TDD because it leads to clean, well-factored, malleable, testable code. The automated tests are just a nice fringe benefit. Similarly ATDD is more about requirements than about testing. ATDD drives out ambiguity early, helps us understand the full scope of a given story, and ensures that we have a shared understanding of what “Done” looks like. But I digress.)
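(To make the test-first rhythm concrete, here’s a deliberately tiny Python sketch. The `price_for` function and its group-discount rule are invented purely for illustration, though the 20% discount happens to match the one in our class pricing.)

```python
# TDD in miniature: write a failing test first, then write just
# enough production code to make it pass. price_for and its
# discount rule are invented here purely for illustration.

def test_group_discount_applies_at_two_or_more():
    assert price_for(2, unit_price=100) == 160  # 20% off the 200 total

def test_no_discount_for_a_single_registration():
    assert price_for(1, unit_price=100) == 100

# The simplest production code that satisfies the tests above.
def price_for(count, unit_price):
    total = count * unit_price
    return total * 0.8 if count >= 2 else total

test_group_discount_applies_at_two_or_more()
test_no_discount_for_a_single_registration()
```

Run constantly, tests like these double as the regression suite the rest of this post is talking about.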

Moreover, the resulting set of automated tests is so much more valuable because the test results are typically reliable and trustworthy. When the tests pass, we know it means that the code still meets our expectations. And when the tests fail, we’re genuinely surprised. We know it means there’s something that broke recently and we need to stop and fix it.

And we get that information frequently. Developers writing code execute the unit tests every few minutes, and are thus able to tell almost immediately when they’ve broken something that used to work. Similarly, the Continuous Integration system executes the unit and acceptance regression tests with every code check-in. All those automated tests result in incredibly fast feedback.

Anyone who has taken an economics or business class is probably aware of the time-value of money: money in your hand today is worth more than that same amount of money in your hand tomorrow, which is why Net Present Value calculations discount future cash flows.

The same is true of information. Information sooner is worth more than the same information later. Automated tests executed through a continuous integration system give us a lot of information fast, and that has enormous value.
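For anyone who wants the arithmetic behind that analogy, here’s a quick Python sketch of simple one-rate discounting (the function name, rate, and dollar amounts are my own illustration):

```python
def present_value(future_amount, rate, periods):
    """Discount a future amount back to today's value at a per-period rate."""
    return future_amount / (1 + rate) ** periods

# At a 10% discount rate, $100 a year from now is worth about $90.91 today.
today = present_value(100, rate=0.10, periods=1)
print(round(today, 2))
```

Swap “dollars” for “bug reports” and “discount rate” for “how stale the information is by the time you see it,” and you have the case for fast feedback.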

So Agile teams that have solid engineering practices in place typically find that each incremental test costs very little to create and maintain, and that the value of those tests is huge because the information is reliable and delivered so quickly. In such a context it no longer makes sense to debate the ROI of any single test. In the time it takes to have the debate, we could just write the test.

People who are accustomed to living in the first context, where automation is hard to create, painful and costly to maintain, and doesn’t offer all that much value, often find it hard to imagine such a context. From their perspective, ROI is so very uncertain because the price is high and the value is low.

But instead of refining our ROI arguments, I suggest changing the equation. Adopt development practices that lower the cost of automation and increase the value so much that we just don’t ever have to argue about whether or not a given test is worth automating.

Virtual Training Update

So it’s the start of a new year, and as part of my strategic plan for 2009 I’m picking up where I left off with my experiments in virtual training.

I’m not ready to offer virtual training commercially just yet. But I am getting closer. I’m still figuring out packaging and pricing. And I’m still thinking about additional value-added materials and exercises that would make my virtual offerings much richer.

In the meantime I need more feedback to be sure I’m headed in the right direction. And since I’m Test Obsessed, that means more testing.

That’s where you come in.

Interested in getting free virtual training in exchange for brutally honest feedback and (if earned) a testimonial? I’d like to talk to you. In particular, I am looking for early feedback from people who:

  • Are individual contributors, leads, or managers actively working on software projects.
  • Have attended at least one instructor-led software development or testing related class in the last 5 years.
  • Work in an organization that supports staff members participating in training and/or conferences.
  • Are in a position to judge whether or not your organization would consider paying for this kind of training. (Note that I am not going to try to sell you anything; this is not a thinly veiled sales tactic. However, I need honest feedback about the commercial viability of what I have in mind from people who have real insight into current spending decision criteria around training and classes within their organizations.)
  • Are interested in one of the following topics: Acceptance Test Driven Development (ATDD); Agile Testing; and/or Test Design/Analysis Techniques (such as all-pairs, state models, etc.).
  • Are willing to invest time and effort to help me figure out what I need to do to make my virtual training totally rock.

To participate in my virtual training you will need access to a relatively recent PC or Mac with high-speed Internet access, a microphone plus headphones (or a headset), and a web cam. (Linux *might* work, and I am happy to experiment with folks who have patience.)

Interested? Please send me an email. I’ll respond with a set of questions designed to get to know more about you, including questions about times and topics that would fit for you. I plan to hold these test sessions over the next couple of weeks.