Our product owner participates in planning meetings before each iteration. Nevertheless, after the iteration has started and we discuss more details about the stories and how to test them, he often brings up an idea that didn’t come up during planning, such as, “Well, it would really be nice if the selection on this report could include X, Y, and Z and be sorted on A as well.” An innocent-sounding request can add a lot of complexity to a story. I often bring in one of the programmers to talk about whether the addition can be handled within the scope of the story we had planned. If not, we ask the product owner to write a card for the next iteration.
—Lisa
Agile testers stay focused on the big picture. We can deliver the most critical functionality in this iteration and add to it later. If we let new features creep in, we risk delivering nothing on time. If we get too caught up with edge cases and miss core functionality on the happy path, we won’t provide the value the business needs.
Lisa’s Story
To ensure that we deliver some value in each iteration, our team looks at each story to identify the “critical path” or “thin slice” of necessary functionality. We complete those tasks first and then go back and flesh out the rest of the features. The worst-case scenario is that only the core functionality gets released. That’s better than delivering nothing or something that works only halfway.
—Lisa
Agile testers take the same approach described in Lisa’s story. While one of our skills is identifying test cases beyond the “happy path,” we still need to start by making sure the happy path works. We can automate tests for the happy path and add negative and boundary tests later. Always consider what adds the most value to the customer, and understand your context. If an application is safety-critical, adding negative tests is absolutely required. Testing time needs to be considered during estimation to make sure that enough time is allotted in the iteration to deliver a “safe” feature.
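To make the idea concrete, here is a minimal sketch (not from the book) of “happy path first” in an automated test suite. It uses pytest and a hypothetical select_report_rows() function; the function, its rules, and the data are invented purely for illustration. The happy-path test is the one automated with the thin slice; the negative and boundary tests are added in a later pass.

```python
import pytest


def select_report_rows(rows, status):
    """Hypothetical report selection: rows matching a status, sorted by date."""
    if status is None:
        raise ValueError("status is required")
    return sorted(
        (r for r in rows if r["status"] == status),
        key=lambda r: r["date"],
    )


ROWS = [
    {"id": 1, "status": "open", "date": "2009-02-01"},
    {"id": 2, "status": "closed", "date": "2009-01-15"},
    {"id": 3, "status": "open", "date": "2009-01-20"},
]


# Happy-path test: automated first, as part of the core "thin slice".
def test_open_rows_are_selected_and_sorted_by_date():
    result = select_report_rows(ROWS, "open")
    assert [r["id"] for r in result] == [3, 1]


# Negative and boundary tests: added later, once the core behavior works.
def test_missing_status_is_rejected():
    with pytest.raises(ValueError):
        select_report_rows(ROWS, None)


def test_empty_report_returns_empty_list():
    assert select_report_rows([], "open") == []
```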
Enable Face-to-Face Communication
No team works well without good communication. Today, when so many teams are distributed across multiple geographic locations, communication is even more vital and more of a challenge. The agile tester should look for unique ways to facilitate communication; it is a critical aspect of doing her job well.
Janet’s Story
When I was working with one team, we had a real problem with programmers talking to the product owner and leaving the testers out of the discussion. The testers often found out about changes after the fact. Part of the problem was logistical: the developers were not sitting with the testers. Another part was history: the test team was new, and the product owner was used to going straight to the programmers.
I took the problem to the team, and we created a rule. We found great success with the “Power of Three.” This meant that all discussions about a feature needed a programmer, a tester, and the product owner. It was each person’s responsibility to make sure there was always a representative from each group. If someone saw two people talking, they had the right to butt into the conversation. It didn’t take very long before it was just routine and no one would consider leaving the tester out of a discussion. This worked for us because the team bought into the solution.
—Janet
Any time there is a question about how a feature should work or what an interface should look like, the tester can pull in a programmer and a business expert to talk about it. Testers should never get in the way of any direct customer-developer communication, but they can often help to make sure that communication happens.
Agile testers see each story or theme from the customer’s point of view but also understand the technical aspects and limitations of implementing features. Business people and software people often speak different languages, and they have to find common ground in order to work together successfully. Testers can help them develop a shared language: a project dialect, or team jargon.
Brian Marick (2004) recommends using examples to develop this language. When Lisa’s team digresses into a philosophical discussion during a sprint planning meeting, Lisa asks the product owner for an example or usage scenario. Testers can encourage whiteboard discussions to work through more examples. These examples help the customers envision their requirements more clearly, and they help the developers produce well-designed code to meet those requirements.
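One way a team might keep those whiteboard examples alive is to capture them as executable checks. The sketch below is an illustration, not something prescribed by the book: the build_report() function, the report data, and the expected orderings are all invented. Each concrete example the product owner gives becomes one row in a parametrized test, so the shared language stays visible in the test suite.

```python
import pytest

# Each row is one concrete example from the whiteboard discussion:
# (selected statuses, sort field, expected order of report row ids).
EXAMPLES = [
    (["open"], "date", [3, 1]),
    (["open", "closed"], "date", [2, 3, 1]),
    ([], "date", []),  # a boundary example the discussion surfaced
]

ROWS = [
    {"id": 1, "status": "open", "date": "2009-02-01"},
    {"id": 2, "status": "closed", "date": "2009-01-15"},
    {"id": 3, "status": "open", "date": "2009-01-20"},
]


def build_report(rows, statuses, sort_field):
    """Hypothetical report builder matching the examples above."""
    return sorted(
        (r for r in rows if r["status"] in statuses),
        key=lambda r: r[sort_field],
    )


@pytest.mark.parametrize("statuses, sort_field, expected_ids", EXAMPLES)
def test_report_matches_customer_example(statuses, sort_field, expected_ids):
    report = build_report(ROWS, statuses, sort_field)
    assert [r["id"] for r in report] == expected_ids
```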