Back-of-a-Napkin Agile Assessment

I am often asked: “How do I know if my team is really Agile? They claim they’re Agile, but I think they’re cheating.”

In response I usually ask a barrage of questions aimed at discovering how well the team is doing at delivering valuable and potentially shippable increments frequently (at least once a month), consistently (month after month after month), all while adapting to changing priorities and business needs.

To me, that’s the essence of Agile. It’s not about whether or not a team is doing TDD or CI or pairing or automated regression testing, although I do strongly believe that those are all good practices, and I evangelize them wherever I go.

Ultimately, being Agile means delivering business value frequently and consistently while adapting to changing business needs. No matter what practices we’re following, if we aren’t doing that, we’re not Agile.

So as I work on a next-generation revision of the materials for the upcoming Agile Testing class that Dale Emery and I are co-leading, I decided it would be nice to include an Agility self-assessment in the materials, so that people can answer for themselves whether their team is really Agile. And since I'm writing it down anyway, I wanted to share it here and get feedback on it.

And so without further ado, here's my new back-of-a-napkin Agile assessment checklist:

  1. The team knows, for sure, that at any given time they are working on deliverables that have the greatest value for the business.
  2. When the implementation team claims to be Done with something, the business stakeholder usually agrees that it is, in fact, done and Accepts it.
  3. When something is Accepted, it is sufficiently well-built and well-tested that it would be safe to deploy or ship it immediately.
  4. The team delivers Accepted product increments at least monthly.
  5. When the product increments are shipped or deployed, the users and customers are generally satisfied.
  6. If the business stakeholder changes the priorities or the requirements, the implementation team can adapt easily, switching gears to deliver according to the updated business needs within the next iteration.
  7. The business stakeholders express confidence that they will get the capabilities they need in a timely manner.
  8. The business can recognize real value from the deliverables: each product increment ultimately has a positive impact on the bottom line.
  9. The team has been working at the same pace, delivering roughly the same amount every iteration, for a while.
  10. The people on the implementation team agree that they could keep working at the current pace indefinitely.
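If you want to keep score, the ten statements above can be tallied with a trivial sketch. This is purely illustrative: the shortened statement texts and the example answers below are placeholders, not part of the original checklist wording.

```python
# Minimal sketch: score the 10-point checklist by counting how many
# statements characterize your team. Answers are hypothetical examples.

CHECKLIST = [
    "Working on the highest-value deliverables",
    "Done claims are usually Accepted",
    "Accepted work is safe to ship immediately",
    "Accepted increments delivered at least monthly",
    "Users and customers are generally satisfied",
    "Priority changes absorbed within the next iteration",
    "Stakeholders confident in timely delivery",
    "Increments have a positive bottom-line impact",
    "Pace has been steady for a while",
    "Team could sustain the current pace indefinitely",
]

def score(answers):
    """Count how many statements hold true; answers is one bool per item."""
    if len(answers) != len(CHECKLIST):
        raise ValueError("expected one answer per checklist item")
    return sum(answers)

# Example: a hypothetical team that meets 7 of the 10 criteria.
example = [True, True, True, True, True, False, True, True, False, False]
print(f"{score(example)}/10 statements characterize this team")
```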

So how Agile is your team? How many of the statements above would you say characterize your team?

And while I’m asking questions, is there anything that you think should be added, removed, or modified?

7 thoughts on “Back-of-a-Napkin Agile Assessment”

  1. I’d add:
    The team is allowed to do their best work, and continually strives to improve through retrospectives and experimentation
    To me, that’s the essence of Agile.

  2. Interesting!

    I recently listed the minimum practices I think you should be using to be successful with Agile ( ). Your list is different because of its self-assessment nature. Putting the two side by side, I see that they are complementary.

    My list is about things you should do, but doesn’t say why or how you know you need one of them. Your list says what you should be achieving but not what to do if you’re failing on one of the points.

    But if I could give a team one list or the other I think yours is a better starting point.

  3. I’m troubled by “knows, for sure” in question 1. That might require either ESP or more visibility into opaque parts of the organization than many teams have. Is a team not being Agile because, several levels up, there’s an argument going on over company direction, an argument that decides what’s valuable? Or is it sufficient that the team believes the Product Owner’s current narrative?

  4. Similar to Lisa’s comment, there seems to be something missing in terms of continuous improvement, especially the way item 9 is worded.

  5. How about something to do with collaboration and team members working in sync? Many of the teams I’ve worked with or talked to really do all these things, but are doing mini waterfalls, with the testers still testing after the fact.

  6. Same comment as Jason and Lisa. For me, a key premise behind succeeding on item 10 is that while the code base grows in size, the team continues to improve and find more productive ways to do things. If you find that not to be the case, you need more attention to retrospectives or a higher emphasis on innovation. Perhaps more slack time built into your card estimates, or perhaps some “sharpen-the-saw” cards if you have an enlightened customer.

  7. I do testing on a software project where the team uses Scrum. The developers’ productivity has increased a lot. There is no slacking off, and they are getting more work done. I believe the morning meetings and short deliverables make all the difference. The team produces a deliverable every 2 weeks no matter what is left undone. For testing, I have a different schedule, but I run parallel to the developers, testing things they have completed.

Comments are closed.