We started a project to replace our company’s interactive voice response (IVR) system, which allows retirement plan participants to obtain account information and manage accounts by phone. We contracted with another company to write the system in Java, with the intention that our team would maintain it after a certain time period.
We spent some time brainstorming what testing would be needed and how to do it. Presumably, the contractor would test things like the text-to-speech functionality, but we had to supply stored procedures to retrieve appropriate data from the database.
Our first step was to negotiate with the contractor to deliver small chunks of features iteration by iteration, so they could be tested as the project progressed and the work would be spread out evenly over the life of the contract. We decided to test the stored procedures using FitNesse fixtures and, after exploring the options, settled on PL/SQL to access them. A programmer was tasked with getting up to speed on PL/SQL to tackle the test automation.
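To make the approach concrete, here is a minimal sketch of what a FitNesse-style fixture for exercising a stored procedure might look like. The class, procedure, and connection details are all assumptions for illustration, not the actual project's code; a real FitNesse fixture would extend fit.ColumnFixture (or use Slim), and the stored-procedure call uses standard JDBC escape syntax.

```java
import java.sql.*;

// Hypothetical fixture class (all names are assumptions). In a real
// FitNesse column fixture, public fields are set from the test table's
// input columns and public methods supply the expected-output columns.
public class AccountBalanceFixture {
    public String participantId;  // input column from the FitNesse table

    // Build the JDBC escape syntax for calling a stored procedure,
    // e.g. "{call get_participant_balance(?, ?)}".
    static String callSql(String procName, int argCount) {
        StringBuilder sb = new StringBuilder("{call ").append(procName).append("(");
        for (int i = 0; i < argCount; i++) {
            sb.append(i == 0 ? "?" : ", ?");
        }
        return sb.append(")}").toString();
    }

    // Output column: invoke the stored procedure and return its OUT parameter.
    // Connection URL and credentials are placeholders.
    public double balance() throws SQLException {
        try (Connection c = DriverManager.getConnection(
                 "jdbc:oracle:thin:@//dbhost:1521/XE", "testuser", "testpw");
             CallableStatement cs = c.prepareCall(callSql("get_participant_balance", 2))) {
            cs.setString(1, participantId);                 // IN: participant ID
            cs.registerOutParameter(2, Types.NUMERIC);      // OUT: balance
            cs.execute();
            return cs.getDouble(2);
        }
    }
}
```

A FitNesse test table would then list participant IDs alongside expected balances, letting testers and customers read and extend the checks without touching the JDBC plumbing.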
The team aimed for a step-by-step approach. By allocating plenty of time for tasks at the start, we allowed for the steep learning curves involved.
Interestingly, the contractor delivered an initial build for the first iteration but was not able to deliver the increments of code for the next few iterations. We ended up canceling the contract and postponing the project until we could find a better solution. By requiring the contractor to work in increments, we discovered right away that they couldn’t deliver. What if we had let them take six months to write the whole application? It probably wouldn’t have ended well. We put what we learned to good use in researching a better approach.
—Lisa
When you’re embarking on something new to the team, such as a new templating framework or reporting library, remember to include it as a risk in your test plan. Hopefully, your team considered testability before choosing a new framework or tool, and selected one that enhances your ability to test. Be generous with your testing task estimates for everything new, including new domains, because there are lots of unknowns. New domain knowledge or new technology often means a steep learning curve.
Collaborate with Customers
Working closely with customers, or customer proxies such as functional analysts, is one of our most important activities as agile testers. As you kick off the iteration, your customer collaboration will also kick into high gear. This is the time to do all those good activities described in Chapter 8, “Business-Facing Tests that Support the Team.” Ask the customers for examples, ask open-ended questions about each story’s functionality and behavior, have discussions around the whiteboard, and then turn those examples into tests to drive coding.
Even if your product owner and/or other customers explained the stories before and during iteration planning, it’s sometimes helpful to go over them briefly one more time as the iteration starts. Not everyone may have heard them before, and the customer may have more information.
Lisa’s Story
We start writing high-level acceptance tests the first day of the iteration. Because we go over all stories with the product owner the day before the iteration and write user acceptance tests as a team for the more complex stories, we have a pretty good idea of what’s needed. However, the act of writing more test cases often brings up new questions. We go over the high-level tests and any questions we have with the product owner, who has also been thinking more about the stories.
One example of this was a story that involved a file of monetary distributions to plan participants who withdraw money from their retirement accounts. This file is sent to a partner who uses the information to cut checks to the participants. The amounts in some of the records were not reconciling correctly in the partner’s system, and the partner asked for a new column containing an amount that would let them do the reconciliation.
After the iteration planning meeting, our product owner became concerned that the new column wasn’t the right solution and brought up his misgivings in the story review meeting. He and a tester studied the problem further and found that instead of adding a new amount, a calculation needed to be changed. This was actually a bigger story, but it addressed a core issue with the distributions. The team discussed the larger story and wrote new task cards. It was worth taking a little time to discuss the story further, because the initial understanding turned out to be wrong.
—Lisa