Jeremy Jarrell

How to Get More Out of Your Iteration Review

Productivity

Is your iteration review just a dog-and-pony show for your stakeholders? If so, then you may be missing out on one of your most important opportunities to ensure that you’re on track to deliver a great product.

It’s not a demo—it’s a review

The root of the problem often lies in the habit of referring to the iteration review as a “demo.” Demos are one-sided meetings where your team walks through the highlights of your product in a carefully scripted manner for a passive audience of observers.

Reviews, on the other hand, are hands-on conversations between your team and your stakeholders to discuss the work the team has done so far and how it matches up to your stakeholders’ vision. A good review lets your stakeholders test-drive the work your team has done in as close to a real-world environment as possible. Only then will the team get the feedback it needs to adapt its plan moving forward and deliver a great product.

So, how do you ensure that you’re making the most of this opportunity? Here are a few tricks to get you started.


Get the right feedback

Believe it or not, many stakeholders are just as uncomfortable in their first review as the team is. Some may worry about hurting the team’s feelings after a strong effort, while others may worry about looking silly when they offer their opinion to the team they’ve hired to be the “experts.” As the facilitator of the review, it’s your job to put them at ease about offering their feedback and to help them understand what types of feedback would be most helpful to you and your team.

First, let the stakeholders know that the entire purpose of this session is to get their feedback, so they should not hesitate to give it. After all, without their feedback you won’t know when you’ve misinterpreted their needs and need to adjust course.

Second, coach them on the specific types of feedback that would be most helpful. This type of feedback may change depending on the product that you’re building, but these are some common examples:

Flow: Does the path the user takes through the app make sense? Is there a better path?

Messaging: Is the language the team is using in the app appropriate to the product? This can be particularly important in products that target a specific business domain, such as health care, to ensure that users understand the features as quickly as possible.

Coherence: Is everything that the team is showing useful and providing value? Is there anything that could be removed?

Giving stakeholders examples of the specific types of feedback that you’re looking for often makes them more likely to provide useful feedback during the review. And although stakeholders tend to start with the specific types you prompt them with, they’ll rarely constrain themselves to those for long.

Finally, rather than simply asking stakeholders yes/no questions, try to elicit this feedback in an open-ended manner. For example, rather than asking if the messaging that you’re using in the app is appropriate, ask if there’s more appropriate messaging that could be used instead. Or, if you are seeking more general feedback for a portion of the app, do not simply accept, “Yes, this looks good.” Continue digging until you understand why that portion of the app looks good and why it’s solving the problem at hand. This will allow you to replicate the same experience elsewhere in the app so the entire app provides the same level of value.

Plan ahead

Although we want iteration reviews to be two-way conversations rather than scripted demos, that doesn’t mean we can’t start with a little bit of planning. Make a list of the key topics you want to get feedback on before the review. These may be new features that have been added to the product, significant enhancements, or particularly nasty bugs that were fixed. But remember that your stakeholders’ time is just as important as yours, so be sure to prioritize the list to tackle the most important topics first.

Once the list is complete, send it to all of your potential attendees before the review to encourage the right people to attend. Sharing the topics ahead of time increases the chances of getting the right people in the room, and the right people are the ones who can give you the feedback you need to deliver a great product.

But while a little upfront preparation can be helpful, be careful that your list of topics doesn’t evolve into a script. Carefully scripted iteration reviews yield little value because they don’t accurately represent the product in use. Instead, interact with your product casually during the review, just as you would in a day-to-day scenario. This will better convey its actual working state to your stakeholders. A more casual approach also signals that feedback is invited at any time, whereas a carefully scripted review feels more formal and encourages attendees to hold their feedback until the end of the presentation rather than offering it in the context of the work it pertains to.

Above all, keep the review casual and informal. Your users won’t be using the product from a script, so why should you?


Use real data

Just as we want to interact with the product in a meaningful way, we’ll also want to do so using real data. All too often, we prime products with carefully scrubbed test data that’s been painstakingly curated to contain only the elements we’re certain the product can handle. Unless your users will be scrubbing their data just as carefully before they feed it into your product, this gives your stakeholders an inaccurate picture of the state of your product.

Strive to conduct every iteration review with test data that is as authentic as possible, and use each review to reconfirm with your stakeholders that the data you’re showing is representative of the actual data they will be working with. Unless you explicitly raise the issue, many stakeholders will assume that you already know your data isn’t indicative of what the product will encounter in the real world and will say nothing. Encourage them to point out anything unrealistic so you can get the test data you need.
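
If hand-curating authentic data for every review is impractical, a small script that generates varied, realistic-looking records can get you most of the way there. The sketch below is a minimal example rather than a prescribed approach: it assumes a Python environment with the third-party Faker library installed, and the final print call is a hypothetical stand-in for however your own product loads data.

# Minimal sketch: seed a review environment with varied, realistic-looking
# records instead of a few hand-scrubbed "safe" ones.
# Assumes the third-party Faker library (pip install faker); how the records
# are loaded into your product is up to you.
from faker import Faker

fake = Faker()

def build_review_customers(count=500):
    """Generate customer records with some of the messiness of real-world data."""
    customers = []
    for _ in range(count):
        customers.append({
            "name": fake.name(),        # suffixes, hyphenated and long names
            "email": fake.email(),
            "company": fake.company(),  # punctuation, ampersands, etc.
            "notes": fake.paragraph(),  # free-form text of varying length
        })
    return customers

if __name__ == "__main__":
    for customer in build_review_customers():
        print(customer)  # replace with your product's own data-loading step

Even a generator like this is only a starting point; the point of the review remains confirming with your stakeholders that what they see matches what they will actually work with.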

Coming attractions

A common question for new teams embarking on their first review is whether or not they should review unfinished work. Although not everything the team creates is worth reviewing along the way, if it’s an important piece of functionality, then you should take the opportunity to put it in front of your stakeholders as soon as possible. This “first look” is your chance to confirm that you haven’t completely misunderstood the overarching goal of the feature before you get too far down the wrong path.

However, reviewing unfinished work can be a bit tricky, especially with stakeholders who are new to an agile process. If you do decide to review unfinished work, make it clear that the work is still in progress and that you’re most interested in feedback on whether the team has correctly understood the problem and whether the solution is on the right track. As with other parts of the review, coaching stakeholders on the type of feedback that is most useful here is critical.
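
One common way to put unfinished work in front of stakeholders without destabilizing the rest of the product is to hide it behind a simple feature toggle that is only switched on in the review environment. The article doesn’t prescribe any particular mechanism, so treat the sketch below as one illustrative option; the flag name and the REVIEW_FLAGS environment variable are hypothetical.

# Minimal sketch of a feature toggle for showing unfinished work in a review.
# The "new-checkout-flow" flag and the REVIEW_FLAGS environment variable are
# illustrative, not part of any particular product or framework.
import os

def is_enabled(flag_name):
    """Return True if the named flag is switched on for this environment."""
    enabled = os.environ.get("REVIEW_FLAGS", "").split(",")
    return flag_name in enabled

if is_enabled("new-checkout-flow"):
    # Route stakeholders through the in-progress feature during the review...
    print("Showing the unfinished checkout flow")
else:
    # ...while everyone else continues to see the current behavior.
    print("Showing the existing checkout flow")

However you choose to gate it, the goal is the same: the stakeholders get their “first look” early, and the rest of the product keeps behaving as it did before.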

Inspecting and adapting

Iteration reviews are one of the most important pieces of an agile process because they give teams the critical feedback they need to deliver a great product. Without genuine feedback, the team has nothing to adapt to and runs the risk of continuing down the wrong path without an opportunity for correction. It’s your team’s responsibility to make the review as authentic and productive as possible so you can get the right feedback from your stakeholders and deliver the right product to market.


Jeremy Jarrell is an agile coach who helps teams get better at doing what they love. When he’s not mentoring Scrum Masters or Product Owners, Jeremy loves to write on all things agile. You can read more of his thoughts at www.jeremyjarrell.com, see his videos at Pluralsight, or follow him on Twitter @jeremyjarrell.
