Figure 12-2 Remote data monitoring system architecture
The software application was a huge legacy system that had few unit tests. The team was slowly rebuilding the application with new technology.
The Team and the Process
The team consisted of four software programmers, two firmware programmers, three to four testers, a product engineer, and an off-site manager. The “real” customer was in another country. The development team used XP practices, including pair programming and TDD. The customer team used the defect-tracking system for the backlog, but most of the visibility of the stories came through index cards. Story cards were used during iteration planning meetings, and the task board tracked progress.
Scrum was used as the outside reporting mechanism to the organization and the customers. The team had two-week iterations and released the product about every four months; this varied depending on the functionality being developed. Retrospectives were held as part of every iteration planning session, and action was taken on the top three priority items discussed.
Continuous integration through CruiseControl provided constant builds for the testers and for the demonstrations held at the end of every iteration. Each tester had a local environment for testing the web application, but three shared test environments were also available. The first was used to test new stories and was updated with the latest build as needed. The second was for testing client-reported issues, because it ran the last version released to the clients. The third was a full stand-alone test environment used for testing complete deploys, communication links, and the firmware and hardware; it was on this environment that we ran our load and reliability tests.
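To make the multi-environment setup concrete, here is a minimal sketch of how an automated test run might select one of the three environments. This is not the project's actual configuration; the environment names, URLs, and the TEST_ENV variable are invented for illustration.

```python
# Minimal sketch: selecting one of three test environments by name.
# Environment names and URLs are hypothetical, for illustration only.
import os

TEST_ENVIRONMENTS = {
    "new-stories": "http://test-env1.example.com",    # latest build, new-story testing
    "client-issues": "http://test-env2.example.com",  # last version released to clients
    "standalone": "http://test-env3.example.com",     # full deploys, comms, firmware/hardware
}

def base_url() -> str:
    """Return the base URL of the environment named in TEST_ENV (default: new-stories)."""
    env_name = os.environ.get("TEST_ENV", "new-stories")
    try:
        return TEST_ENVIRONMENTS[env_name]
    except KeyError:
        raise ValueError(f"Unknown test environment: {env_name!r}") from None

if __name__ == "__main__":
    print(f"Running tests against {base_url()}")
```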
Tests Driving Development
The tests driving development included unit tests and acceptance tests.
Unit Tests
Unit tests are technology-facing tests that support programming. Those that are developed as part of test-driven development not only help the programmer get the story right but also help to design the system.
Chapter 7, “Technology-Facing Tests that Support the Team,” explains more about unit testing and TDD.
The programmers on the Remote Data Monitoring project bought into test-driven development (TDD) and pair programming wholeheartedly. All new functionality was developed and tested using pair programming. All stories delivered to the testers were supported by unit tests, and very few bugs were found after coding was complete. The bugs that were found were generally integration-related.
However, when the team first started, the legacy system had few unit tests to support refactoring. As process changes were implemented, the developers decided to start fixing the problem. Every time they touched a piece of code in the legacy system, they added unit tests and refactored the code as necessary. Gradually, the legacy system became more stable and was able to withstand major refactoring when it was needed. We experienced the power of unit tests!
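As a minimal sketch of that practice, assuming a hypothetical legacy routine rather than the project's actual code, a programmer touching legacy code might first pin down its existing behavior with a test like this before refactoring:

```python
# Sketch of a characterization test written before refactoring a hypothetical
# legacy routine. The function and values are invented for illustration.
import unittest

def summarize_readings(readings):
    """Legacy-style routine: average the valid (non-negative) sensor readings."""
    valid = [r for r in readings if r >= 0]
    return sum(valid) / len(valid) if valid else 0.0

class SummarizeReadingsTest(unittest.TestCase):
    def test_ignores_negative_readings(self):
        # Capture the current behavior so refactoring can't silently change it.
        self.assertAlmostEqual(summarize_readings([10.0, -1.0, 20.0]), 15.0)

    def test_empty_input_returns_zero(self):
        self.assertEqual(summarize_readings([]), 0.0)

if __name__ == "__main__":
    unittest.main()
```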
Acceptance Tests
The product engineer (the customer proxy) took ownership of creating the acceptance tests. These tests varied in format depending on the actual story. Although he struggled at first, the product engineer got pretty good at giving the tests to the programmers before they started coding. The team created a test template that met both the programmers’ and the testers’ needs, and it evolved over time.
The tests were sometimes informally written, but they included data, required setup if it wasn’t immediately obvious, different variations that were critical to the story, and some examples. The team found that examples helped clarify the expectations for many of the stories.
The test team automated the acceptance tests as soon as possible, usually at the same time as the stories were being developed. Of course, the product engineer was available to answer any questions that came up during development.
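As a minimal sketch of what one of these automated acceptance tests might look like, assume a hypothetical “alert on a high reading” story with invented example values; the project's real tests and tools are not shown here.

```python
# Sketch of an automated acceptance test derived from a customer-written example.
# The story (alert when a reading crosses a threshold) and all names are hypothetical.
import unittest

def should_raise_alert(reading, threshold):
    """Return True when a monitored reading exceeds its configured threshold."""
    return reading > threshold

class HighReadingAlertAcceptanceTest(unittest.TestCase):
    # Examples taken directly from the (hypothetical) acceptance test:
    # reading, threshold, alert expected?
    examples = [
        (75.0, 70.0, True),   # above threshold -> alert
        (70.0, 70.0, False),  # exactly at threshold -> no alert
        (65.0, 70.0, False),  # below threshold -> no alert
    ]

    def test_examples_from_story(self):
        for reading, threshold, expected in self.examples:
            with self.subTest(reading=reading, threshold=threshold):
                self.assertEqual(should_raise_alert(reading, threshold), expected)

if __name__ == "__main__":
    unittest.main()
```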
These acceptance tests served three purposes. First, they were business-facing tests that supported development, because they were given to the team before coding started. Second, the test team used them as the basis for the automation that fed the regression suite and provided ideas for future exploratory testing. Third, they confirmed that the implementation met the needs of the customer; the product engineer performed this solution verification.
See Chapter 8, “Business-Facing Tests that Support the Team,” for more about driving development with acceptance tests.
Automation
Automation involved the functional test structure, web services, and embedded testing.
The Automated Functional Test Structure