Successful Tool Implementation
What have our test tools done for our web application development? Just a few months after adopting JUnit, Canoo WebTest and FitNesse, we had a suite of regression tests that gave us a useful safety net, and our defect rate was going down dramatically. Today, three years later, we have increased our test numbers by a factor of ten or more. Our regression suites, running many times per day in two continuous builds, catch bugs at least a few times a week. Our rate of defects introduced during development is down by half. We have time and tools to do robust exploratory testing and load testing. Most importantly, but harder to measure, using our tests to drive development has resulted in features that delighted our customers.
My team put plenty of time into the research and tool adoption I’ve described here. Test automation is a big investment, but done carefully, it returns many times what you put into it. You need enough resources to first define what you need, then investigate your options, and finally try out candidate tools. Our team takes two weeks, twice a year, to devote to tasks such as researching new tools and refactoring tests and code. This may seem like a luxury, but our management knows it helps us keep our technical debt to a minimum, so that we can improve our future productivity.
Depending on which skill sets people have, it may pay to pair people up to research or try out a test tool. A programmer and a tester could team up to try out a scripting language. A system administrator might help determine whether a tool can be integrated into the team’s build process. Hold brown bag sessions to brainstorm ideas, or start a book club to gather ideas from publications.
Remember the ‘whole team’ approach. The team should come to a consensus on which tools to build, try, or adopt. If a tool isn’t producing the expected benefits, the team should decide whether to try a new approach or a different tool. More experienced team members can coach their less experienced coworkers so that everyone gets up to speed on the new tool. You may need to involve experts from outside your team to help you succeed with it. People outside your team who need to use the tool, for example to specify tests themselves, will need your help.
What If Your Team Isn’t Interested?
“This sounds very nice,” you say, “but my team is so overloaded and busy, nobody has time to think about tools, and they don’t think it’s important to automate tests.” Not everyone has like-minded teammates. Or maybe you’re a tester on an isolated QA team that isn’t getting much support from programmers or other groups.
Don’t despair, just get creative. I once worked in a chaotic company developing a retail internet application. Despite encouragement from the development manager, the programmers automated few unit tests. The company owned licenses for a vendor GUI test tool. We hired a tester with expertise in that tool, who could automate some regression tests and teach others how to use the tool. But the tool couldn’t address all our automation needs.
The web application was written in Tcl, so there were several Tcl developers on staff. On various mailing lists, I ran across several people using Tcl effectively for test automation. I decided to teach myself enough Tcl to create test scripts for areas we couldn’t cover with the vendor tool. While the developers weren’t interested in test automation, they were happy to help me with my Tcl coding problems. This shows how it pays to leverage the expertise around you. Although far from an ideal situation, we automated enough regression tests to free up our time for useful exploratory testing.
People, not tools, make projects successful. It doesn’t matter which tools you use or develop, as long as they help you toward your goals. Collaborating to choose and use the right test tools for your team’s situation helps all team members do their best work. That’s the bottom line for test automation.
Originally published in the Summer 2007 issue of Methods & Tools