Gerard Meszaros, a Certified ScrumMaster (Practicing) and Agile Coach, shared his story about Wizard of Oz testing on agile projects. It is a good example of how the artifacts we generate to elicit requirements can help communicate meaning in an unambiguous form.
We thought we were ready to release our software. We had been building it one iteration at a time under the guidance of an on-site customer, who had prioritized the functionality based on what he needed to enter integration testing with his business partners. We consciously deferred the master data maintenance and reporting functionality to later iterations to ensure that the functionality needed for integration testing was ready. The integration testing went fine, with just a few defects logged (all related to missing or misunderstood functionality). In the meantime, we implemented the master data maintenance in parallel with integration testing in the last few iterations. When we went into acceptance testing with the business users, we got a rude shock: They hated the maintenance and reporting functionality! They logged so many defects and “must-have improvements” that we had to delay the release by a month. So much for coming up with a plan that would allow us to deliver early!
While we were reimplementing the master data maintenance, I attended the Agile 2005 conference and took a tutorial by Jeff Patton. One of the exercises was building paper prototypes of the UI for a sample application. Then we “tested” the paper prototypes with members of the other groups as our users and found out how badly flawed our UI designs were. Déjà vu! The tutorial resembled my reality.
On my return to the project back home, I took the project manager I was mentoring in agile development aside and suggested that paper prototyping and “Wizard of Oz” testing (the Wizard of Oz reference is to a human being acting as the computer—sort of the “man behind the curtain”) might have avoided our one-month setback. After a very short discussion, we decided to give it a try on our release 2 functionality. We stayed late a couple of evenings and designed the UI using screenshots of the R1 functionality overlaid with hand-drawn R2 functionality. It had been a long time since either of us had used scissors and glue sticks, and it was fun!
For the Wizard of Oz testing with users, we asked our on-site customers to find some real users with whom to do the testing. They also came up with some realistic sample tasks for the users to try to execute. We put the sample data into Excel spreadsheets and printed out various combinations of data grids to use in the testing. Some future users came to town for a conference. We hijacked pairs of them for an hour each and did our testing.
I acted as the “wizard,” playing the part of the computer (“it’s a 286 processor, so don’t expect the response times to be very good”). The on-site customer introduced the problem, and programmers acted as observers, recording the missteps the users made as “possible defects.” After just a few hours, we had huge amounts of valuable data about which parts of our UI design worked well and which parts needed rethinking. And there was little argument about which was which! We repeated the usability testing with other users when we had alpha versions of the application available and gained further valuable insights. Our business customer found the exercise so valuable that on a subsequent project the business team set about doing the paper prototyping and Wizard of Oz testing with no prompting from the development team. This might have been influenced somewhat by the first e-mail we got from a real user 30 minutes after going live: “I love this application!!!”
Developing user interfaces test-first can seem like an intimidating effort, but the Wizard of Oz technique can be applied before writing a single line of code. The team can test user interaction with the system and gather plenty of information about the desired system behavior. It’s a great way to facilitate communication between the customer and development teams.
Close, constant collaboration between the customer team and the developer team is key to obtaining examples on which to base customer tests that drive coding. Communication is a core agile value, and we talk about it more in the next section.