One of the components of the system that is often overlooked during testing is documentation. As agile developers, we may value working software over documentation, but we still value documentation! User manuals and online help need validation just as much as the software does. Your team may employ specialists such as technical writers who create and verify documentation. As with all other components of the product, your whole team is responsible for the quality of the documentation, in both its hard-copy and electronic forms.
User Documentation
Your team might write Quadrant 2 tests to support its work as it produces documentation; in fact, we encourage it. Lisa’s team writes code that produces documents whose contents are specified by government regulations, and programmers can write much of that code test-first. However, it’s difficult for automated tests to judge whether a document is formatted correctly or uses a readable font. Nor can they evaluate whether the contents of documents such as user manuals are accurate or useful. Because documentation has so many subjective components, validating it is more of a critiquing activity.
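To make this concrete, here’s a minimal sketch of the kind of test-first, Quadrant 2 check programmers can automate for document contents. The Python code and names (`generate_statement`, the account fields) are hypothetical stand-ins, not Lisa’s team’s actual code. Notice that it verifies the required data appears, not whether the document looks right:

```python
import unittest

def generate_statement(account):
    """Stand-in for real document-generation code; returns plain text."""
    return (
        f"Account: {account['id']}\n"
        f"Plan year: {account['plan_year']}\n"
        f"Vested balance: ${account['vested_balance']:.2f}\n"
    )

class StatementContentTest(unittest.TestCase):
    def test_required_fields_appear(self):
        # Automated checks like this catch missing or wrong data; they
        # can't judge fonts, layout, or readability -- humans do that.
        account = {"id": "A-1001", "plan_year": 2009, "vested_balance": 1234.5}
        text = generate_statement(account)
        self.assertIn("Account: A-1001", text)
        self.assertIn("Plan year: 2009", text)
        self.assertIn("Vested balance: $1234.50", text)

if __name__ == "__main__":
    unittest.main()
```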
Janet’s Story
Technical writers and testers can work very closely together. Stephanie, a technical writer I worked with on one project, talked with the programmers to understand how the application worked. She would also work through the application herself to make sure that what she wrote matched its actual behavior. This seemed to duplicate the testing effort, so Stephanie and I sat down and figured out a better approach.
We decided to work together on the stories as they were developed. For some stories Stephanie was the lead “tester,” and for others I took that role. When I was the lead, I’d create my test conditions and examples, and Stephanie would use those as the basis for her documentation. When Stephanie was the lead, she would write her documentation first, and I would use it to determine the test cases.
Doing it this way enabled the documentation to be tested and the tests to be challenged before they were ever executed. Working hand in hand like this proved to be a very successful experiment. The resulting documentation matched the software’s behavior and was much more useful to the end users.
—Janet
Don’t forget to check the help text, too. Are the links to help text easily identifiable? Are they consistent throughout the user interface? Is the help text presented clearly? If it opens in a pop-up, and users block pop-ups in their browsers, what’s the impact? Does the help cover all of the topics users need? On Lisa’s projects, help text tends to be a low priority, so it often doesn’t get done at all. That’s a business decision, but if you feel an area of the application needs extra help text or documentation, raise the issue with your team and your customers.
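Some of those consistency questions lend themselves to a quick mechanical check. Here’s a rough sketch, using only the Python standard library and hypothetical page markup, that flags pages whose help link is missing or styled inconsistently:

```python
from html.parser import HTMLParser

class HelpLinkFinder(HTMLParser):
    """Collects the CSS class of each anchor that points at a help topic."""
    def __init__(self):
        super().__init__()
        self.help_link_classes = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "/help/" in attrs.get("href", ""):
            self.help_link_classes.append(attrs.get("class", ""))

# Hypothetical page sources; in practice you'd fetch the rendered pages.
pages = {
    "orders":   '<a class="help-link" href="/help/orders">?</a>',
    "invoices": '<a class="helpLink" href="/help/invoices">?</a>',
    "reports":  '<p>No help link here yet.</p>',
}

for name, source in pages.items():
    finder = HelpLinkFinder()
    finder.feed(source)
    if not finder.help_link_classes:
        print(f"{name}: no help link found")
    elif any(cls != "help-link" for cls in finder.help_link_classes):
        print(f"{name}: help link styled inconsistently")
```

A script like this only answers the mechanical questions; whether the help text itself is clear and complete still takes a human reader.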
Reports
Another system component that’s often overlooked from a testing perspective is reports. Reports are critical to many users for making decisions, yet they’re often left until the very end and either don’t get done or are done poorly. Reports might be custom-built to meet specific customer needs, or produced with one of the many third-party report-generation tools. They may be part of the application itself or be generated through a separate reporting system for end users.
We discuss testing reports along with the other Quadrant 3 test activities in order to critique the product, but we recommend that you also write Quadrant 2 report tests to guide the coding and help the team understand the customer’s needs as it produces reports. Report tests can certainly be written test-first. As with documents, though, someone has to look at a report to judge whether it’s easy to read and presents information in an understandable way.
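As a sketch of what such a test might look like, here’s a small Python example with a hypothetical `build_summary_report` function standing in for real reporting code. It pins down the part a machine can verify: that the report’s numbers agree with the source data.

```python
from collections import defaultdict

def build_summary_report(transactions):
    """Stand-in for real reporting code: totals amounts per department."""
    totals = defaultdict(float)
    for t in transactions:
        totals[t["dept"]] += t["amount"]
    return [{"dept": d, "total": round(v, 2)} for d, v in sorted(totals.items())]

def test_report_totals_match_source_data():
    # Written before the reporting code, this expresses the customer's
    # expectation: each department's total reflects its transactions.
    transactions = [
        {"dept": "claims", "amount": 100.25},
        {"dept": "claims", "amount": 49.75},
        {"dept": "underwriting", "amount": 300.00},
    ]
    report = build_summary_report(transactions)
    assert report == [
        {"dept": "claims", "total": 150.0},
        {"dept": "underwriting", "total": 300.0},
    ]

test_report_totals_match_source_data()
```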
One of the biggest challenges in testing reports is not the formatting but getting the right data. When you try to create test data for reports, it can be difficult to build a good cross section of realistic data. It’s also usually the edge cases that make reports fail, so it’s important to incorporate that extra data. In most cases, it’s best to use production data (or data copied from the production system into a test environment) to test the different reporting variations.
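One way to make that production copy pay off is to mine it for the edge cases most likely to break a report: the longest names, the largest and smallest amounts, zero and negative values. Here is a rough sketch, with inline rows standing in for data copied from production:

```python
# Hypothetical rows copied from production into a test environment.
rows = [
    {"name": "Al", "amount": 0.00, "posted": "2009-02-28"},
    {"name": "A Very Long Company Name That Might Truncate",
     "amount": -12.50, "posted": "2008-12-31"},
    {"name": "Smith & Sons", "amount": 9999999.99, "posted": "2009-01-01"},
]

# Pull out the extremes -- the values most likely to break a report.
edge_cases = {
    "longest name":    max(rows, key=lambda r: len(r["name"])),
    "largest amount":  max(rows, key=lambda r: r["amount"]),
    "negative amount": next((r for r in rows if r["amount"] < 0), None),
    "zero amount":     next((r for r in rows if r["amount"] == 0), None),
}

for label, row in edge_cases.items():
    print(f"{label}: {row}")  # candidates to pin down in report tests
```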
Lisa’s Story
Our application includes a number of reports, many of which help companies meet governmental compliance requirements. While we have automated smoke tests for each report, any change to a report, or even an upgrade of the tool we use to generate reports, requires extensive manual, visual testing. We have to watch like hawks: Has a number been truncated by one character? Did a piece of text run over to the next page? Is the right data included? Wrong or missing data can mean trouble with the regulatory agency.
—Lisa