Of course, the type of testing you're automating is key. Security testing probably needs highly specialized tools. Performance testing, by contrast, has many existing open source and vendor tools, so the job of selecting one isn't overwhelming. As you master one challenge, you'll be better prepared for the next. It took Lisa's team a couple of years to develop robust regression test suites at the unit, integration, and functional levels. Performance testing was their next area of pain. Lessons learned from the earlier automation efforts helped them do a better job of identifying requirements for a test tool, such as ease of reporting results, compatibility with existing frameworks, and choice of scripting language.
Write a checklist that captures all your tool requirements. Some of them might conflict with each other: "The tool needs to be easy enough for customers to specify tests" may pull against "The tests should be easy to automate." Write them down so you can find the right balance. Then start your research.
One Tool at a Time
You’re going to need different tools to serve different purposes. Implementing new tools and learning the best way to use them can get overwhelming pretty quickly. Try one tool at a time, addressing your greatest area of pain. Give it enough time for a fair trial and evaluate the results. If it’s working for you, master that tool before you go on to the next area of pain and the next tool. Multitasking might work for some situations, but new technology demands full attention.
When you’ve settled on a tool to address a particular need, take a step back and see what else you need. What’s the next automation challenge facing your team? Will the tool you just selected for another purpose work for that need, too, or do you need to start a new selection process?
If you’ve decided to look outside your own organization for tools, the first step is to find time to try some out. Start with some basic research: Internet searches, articles and other publications about tools, and mailing lists are good places to get ideas. Compile a list of tools to consider. If your team uses a wiki or online forum tool, post information about tools and start a discussion about pros and cons.
The bibliography contains websites that help with tool searches and evaluation.
Budget time for evaluating tools. Some teams hold an "engineering sprint" or "refactoring iteration" every few months in which, rather than delivering stories prioritized by the business, they work on reducing technical debt, upgrading tool versions, and trying out new tools. If your team doesn't have these yet, make a case to your management for them. Reducing your technical debt and establishing a good testing infrastructure will improve your velocity in the future and free up time for exploratory testing. If you never make time to make code easier to maintain or to upgrade tools, technical debt will drag down your velocity until it grinds to a halt.
When you have a list of tools that may meet your requirements, narrow the possibilities down to one or two, learn how to use each one well enough to try it, and do a spike: Try a simple but representative scenario that you can throw away. Evaluate the results against the requirements. Use retrospectives to consider pros and cons.
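A spike really can be this small. As one illustration, suppose Selenium WebDriver (version 4 or later) were one of your candidates and your riskiest scenario were logging in; the sketch below is throwaway code, and the URL and element IDs are invented for the example, not taken from any real application:

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;

    // Throwaway spike: can this candidate tool drive our login page at all?
    public class LoginSpike {
        public static void main(String[] args) {
            WebDriver driver = new ChromeDriver();
            try {
                driver.get("https://staging.example.com/login"); // hypothetical test URL
                driver.findElement(By.id("username")).sendKeys("testuser");
                driver.findElement(By.id("password")).sendKeys("testpassword");
                driver.findElement(By.id("login-button")).click();
                // A crude check is fine for a spike; we only want a quick answer.
                System.out.println("Page title after login: " + driver.getTitle());
            } finally {
                driver.quit();
            }
        }
    }

The point isn't polished code; it's finding out quickly whether the tool can drive your application at all.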
What resources do you need to implement and use the tool? What impact will the tool have on the team’s productivity and velocity? What risks does it pose? What will it allow you to do in the long term that you can’t do now?
Pick your top candidate and commit to trying it for a set period of time, long enough to gain some competency with it. Make sure you try all your mission-critical functionality. For example, if your application uses a lot of Ajax, verify that you can automate tests for it with the tool. In retrospectives, look at what worked and what didn't. Be open to the possibility that the tool isn't right and that you'll have to throw it out and start over. Don't feel you have to stick with a tool just because you've already invested so much in it.
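The Ajax check is worth spelling out, because it's where many tools fail a trial. Continuing the Selenium WebDriver illustration from before, you'd want to prove early on that the tool can wait for asynchronous updates instead of breaking on timing; the page URL and element IDs here are again hypothetical:

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;
    import org.openqa.selenium.support.ui.ExpectedConditions;
    import org.openqa.selenium.support.ui.WebDriverWait;
    import java.time.Duration;

    // Trial scenario: the search results load via an Ajax call, with no page reload.
    public class AjaxSpike {
        public static void main(String[] args) {
            WebDriver driver = new ChromeDriver();
            try {
                driver.get("https://staging.example.com/search"); // hypothetical URL
                driver.findElement(By.id("query")).sendKeys("agile testing");
                driver.findElement(By.id("search-button")).click();
                // Wait for the asynchronously loaded element rather than
                // assuming the page reloads; if the tool can't express this,
                // that's a finding for the retrospective.
                new WebDriverWait(driver, Duration.ofSeconds(10))
                        .until(ExpectedConditions.visibilityOfElementLocated(By.id("results")));
                System.out.println("Ajax results rendered; this scenario is automatable.");
            } finally {
                driver.quit();
            }
        }
    }

If a candidate tool can't express a wait like this cleanly, better to learn that during the trial than after you've built a suite on it.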
We all know there's no "silver bullet" that solves every automation problem. Keep your expectations realistic and your mind open. Creative solutions rely on art as much as science.
Lisa’s Story
When we conducted our performance test tool search, we turned to an agile testing mailing list for suggestions. Many people offered their experiences, and some even offered to help us learn and implement a tool. We searched for a tool that used Java for scripting, had a minimal learning curve, and presented results in a useful graphical format. We listed tools and their pros and cons on the team wiki. We budgeted time for trial runs. My coworker Mike Busse tried the top two candidates and showed highlights to the rest of the team. A tool was chosen by team consensus and has proven to be a good fit.
—Lisa
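To make a requirement like "uses Java for scripting" concrete: the team wanted load scenarios they could express in plain Java rather than in a proprietary language. Here is a minimal sketch of that idea using only the JDK; the URL, thread counts, and timing approach are illustrative assumptions, not Lisa's actual tool or scripts:

    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.ArrayList;
    import java.util.List;

    // Minimal load-test sketch: several threads each hit a URL and report latency.
    public class SimpleLoadTest {
        public static void main(String[] args) throws Exception {
            final String target = "https://staging.example.com/quotes"; // hypothetical
            final int threads = 10;
            final int requestsPerThread = 20;
            List<Thread> workers = new ArrayList<>();
            for (int t = 0; t < threads; t++) {
                Thread worker = new Thread(() -> {
                    long totalMillis = 0;
                    for (int i = 0; i < requestsPerThread; i++) {
                        try {
                            long start = System.nanoTime();
                            HttpURLConnection conn =
                                    (HttpURLConnection) new URL(target).openConnection();
                            conn.getResponseCode(); // forces the request to complete
                            conn.disconnect();
                            totalMillis += (System.nanoTime() - start) / 1_000_000;
                        } catch (Exception e) {
                            System.err.println("Request failed: " + e.getMessage());
                        }
                    }
                    System.out.println(Thread.currentThread().getName()
                            + " average latency: " + (totalMillis / requestsPerThread) + " ms");
                });
                workers.add(worker);
                worker.start();
            }
            for (Thread worker : workers) {
                worker.join(); // wait for all threads to finish
            }
        }
    }

A real performance tool adds ramp-up control, think times, and graphical reporting, which is exactly why ease of reporting results sat on the requirements list alongside the scripting language.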