Another challenge is verifying the data contained in the report. Rerunning the same query that the report uses doesn’t prove anything, and I sometimes struggle to write my own SQL queries to compare the actual data with what shows up on a report. We budget extra time to test reports, even the simple-looking ones.
Because reports are so subjective, we find that different stakeholders have different preferences for how the data is presented. The plan administrator who has to explain a report to a user on the phone has a different idea of what’s easy to understand than the company lawyer who decides what data needs to be on the report. Our product owner helps us get consensus from all areas of the business.
The contents and formatting of a report are important, of course, but for online reports, the speed at which they come up is critical too. Our plan administrators wanted complete freedom to specify any date range for some transaction history reports. Our DBA, who coded the reports, warned that for a large company’s retirement plan, more than a few months’ worth of transactions could take several minutes to render. Over time, companies grew and accumulated more and more transactions, and eventually the user interface started timing out before it could deliver the report. When testing, try out worst-case scenarios; today’s worst case can become tomorrow’s most common scenario.
—Lisa
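The kind of independent check Lisa describes doesn’t have to be elaborate. As a minimal sketch, assuming the report’s source data lives in a relational store with a hypothetical transactions table (the account_id, amount, and posted_on columns and the file name below are illustrative, not the schema from Lisa’s project), a tester could recompute a report total with a query written separately from the report’s own:

import sqlite3


def independent_total(conn, account_id, start_date, end_date):
    """Recompute an account's net change straight from the raw rows,
    deliberately ignoring whatever query the report itself uses."""
    cur = conn.execute(
        "SELECT COALESCE(SUM(amount), 0) FROM transactions "
        "WHERE account_id = ? AND posted_on BETWEEN ? AND ?",
        (account_id, start_date, end_date),
    )
    return cur.fetchone()[0]


def check_report_total(conn, account_id, start_date, end_date, reported_total):
    """Flag any gap between the value printed on the report and the
    independently computed total; a mismatch is worth investigating."""
    expected = independent_total(conn, account_id, start_date, end_date)
    assert abs(expected - reported_total) < 0.005, (
        f"report shows {reported_total}, raw data sums to {expected}"
    )


if __name__ == "__main__":
    # Hypothetical local copy of the plan data; any DB-API connection works.
    conn = sqlite3.connect("plan_data.db")
    check_report_total(conn, account_id=42, start_date="2008-01-01",
                       end_date="2008-03-31", reported_total=1234.56)

The value is not in this particular query but in writing the check independently of the report’s implementation, so a defect in the report’s own query still has a chance of being caught.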
If you’re tackling a project that involves lots of reports, don’t give in to the temptation to leave them to the end. Include some reports in each iteration if you can. One report could be a single story, or it might even be broken up into a couple of stories. Use mock-ups to help the customers decide on report contents and formatting. Find the “thin slice” or “critical path” in the report, code that first, and show it to your customer before you add the next slice. Incremental development works as well with reports as it does with other software.
See Chapter 8, “Business-Facing Tests that Support the Team,” for information about using thin slices.
Sometimes your customers themselves aren’t sure how a report should look or how to approach it incrementally. And sometimes nobody on the team anticipates how hard the testing effort will prove to be.
Lisa’s Story
Like other financial accounts, retirement plans need to provide periodic statements to account holders that detail all of the money going into and out of the account. These statements show the change in value between the beginning and ending balances and other pertinent information, such as the names of account beneficiaries. Our company wanted to improve the account statements, both as a marketing tool and to reduce the number of calls from account holders who didn’t understand their statements.
We didn’t have access to our direct competitors’ account statements, so the product owner asked for volunteers to bring in account statements from banks and other financial institutions in order to get ideas. Months of discussions and experimentation with mock-ups produced a new statement format, which included data that hadn’t appeared on the old statements, such as performance results for each mutual fund.
Stories for developing the new account statement were spread across two quarters’ worth of iterations. During the first quarter, we completed the stories to collect the new data. Testing proved much harder than we expected. We used FitNesse tests to verify that each new data element was captured, which lulled us into a false sense of security: it was hard to cover all of the variations, and we missed some with the automated tests. We also didn’t anticipate that the changes to collect new data could have an adverse effect on the data already displayed on the existing statements.
As a result, we didn’t do adequate manual testing of the account statements. Subtle errors slipped past us. When the job to produce quarterly statements ran, calls started coming in from customers. We had a mad scramble to diagnose and fix the errors in both code and data. The whole project was delayed by a quarter while we figured out better ways to test and added internal checks and better logging to the code.
—Lisa
Short iterations mean that it can be hard to make time for adequate exploratory testing and other Quadrant 3 activities. Let’s look at tools that might help speed up this testing and make time for vital manual and visual tests.
Tools to Assist with Exploratory Testing
Exploratory testing is manual testing. Some of the best testing happens because a person is paying attention to details that are often missed when following a script. Intuition is something we can’t teach a machine. However, there are many tools that can assist us in our quest for excellence.