Selecting a test automation tool has always been a daunting task. Let’s face it, just the thought of automating tests can be intimidating! The selection of tools available today, especially open source tools, is positively dazzling. In the past several years, “test-infected” developers, not finding what they needed among the vendor offerings, have created their own tools. Fortunately for the rest of us, many are generous enough to share them as open source. Between open source tools and commercial tools, we have an amazing variety from which to choose.
To avoid that deer-in-the-headlights feeling, consider taking an ‘agile’ approach to selecting web testing tools. Plan an automation strategy before you consider the possible tool solutions. Start simple, and make changes based on your evolving situation. Here are some ideas based on experiences I’ve had with different agile (and not so agile!) development teams. Even if your team doesn’t use agile development practices, you’ll get some useful tips.
Author: Lisa Crispin, http://lisa.crispin.home.att.net/
An Agile Test Automation Strategy
First of all, your team should consider its testing approach. When I say ‘team’, I’m thinking of everyone involved in developing and delivering the software, which in your case might be a virtual team. When do you write tests? Who writes them? How should the test results be delivered? Who needs to be able to look at the test results, and what should they be able to learn from them? What kind of tests need to be automated, and when? Do you have other tedious tasks, such as populating test data or looking through version control system output, that you’d love to automate?
Back in 2003, my current team had no test automation at all, and a buggy legacy web-based J2EE application. We desperately needed to automate our regression tests, since the manual regression tests took the whole team a couple of days to complete, and we were delivering new code to production every two weeks. We had decided to start rewriting the system, developing new features in a new architecture, while maintaining the old code, but this would be impossible without a safety net of tests.
We committed to using test-driven development for a number of reasons, one being that automated unit tests have the highest return on investment of any automated test. We went a step further and decided to also use ‘customer-facing’ tests and examples to help drive development. We’ve found that one example is worth pages of narrative requirements! We wanted to be able to write high-level, big-picture test cases before development starts, and then write detailed executable test cases concurrently with development, so that when coding is finished, all the tests are passing.
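To make the test-first cycle concrete, here is a minimal sketch. It is in Python rather than our Java/J2EE stack, and the discount rule, function name and test names are invented purely for illustration: in test-driven development the tests below would be written (and fail) before the function exists, and then drive its implementation.

```python
import unittest

# Hypothetical example, not from our actual application: a loyalty
# discount expressed as a whole-number percent, 2 per year of tenure,
# capped at 10.
def loyalty_discount(years_as_customer: int) -> int:
    if years_as_customer < 0:
        raise ValueError("years cannot be negative")
    return min(years_as_customer * 2, 10)

class LoyaltyDiscountTest(unittest.TestCase):
    # In test-driven development, these tests come first and fail
    # until loyalty_discount is implemented.
    def test_new_customer_gets_no_discount(self):
        self.assertEqual(loyalty_discount(0), 0)

    def test_discount_grows_with_tenure(self):
        self.assertEqual(loyalty_discount(3), 6)

    def test_discount_is_capped(self):
        self.assertEqual(loyalty_discount(20), 10)

    def test_negative_tenure_is_rejected(self):
        with self.assertRaises(ValueError):
            loyalty_discount(-1)

if __name__ == "__main__":
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(LoyaltyDiscountTest)
    unittest.TextTestRunner(verbosity=2).run(suite)
```

In the customer-facing variant, the customer would supply the concrete examples (0 years gives 0%, 3 years gives 6%, and so on), and those examples would become the executable tests.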
Meanwhile, we required some kind of ‘smoke test’ regression suite for the legacy application, to make sure that critical parts kept working. Given the old code’s architecture, we decided these tests would have to run through the GUI. We wanted all of our tests to run during our continuous build process, which was automated using CruiseControl, so we’d have quick feedback on any regression failures.
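What a smoke suite boils down to can be sketched as follows. The page URLs, the expected text, and the `fetch` callable here are all invented for illustration; a real suite would drive the GUI with a browser- or HTTP-level testing tool rather than a hand-rolled checker like this.

```python
from typing import Callable, Dict, List, Tuple

# Illustrative list of critical pages: (url, text that must appear).
# These URLs and strings are made up, not from our application.
CRITICAL_PAGES: List[Tuple[str, str]] = [
    ("/login", "Sign In"),
    ("/accounts", "Account Summary"),
]

def run_smoke_tests(fetch: Callable[[str], Tuple[int, str]]) -> Dict[str, bool]:
    """Run the smoke checks; fetch(url) must return (status_code, body).

    Returns a pass/fail result per page, so the build can report
    exactly which critical page broke.
    """
    results: Dict[str, bool] = {}
    for url, expected_text in CRITICAL_PAGES:
        try:
            status, body = fetch(url)
            results[url] = status == 200 and expected_text in body
        except Exception:
            # Any error reaching the page counts as a failed smoke test.
            results[url] = False
    return results
```

Hooked into a continuous build such as CruiseControl, any `False` in the results would fail the build and trigger the notification email.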
Quick and easy-to-read notification of whether tests passed or failed was important to us. Ideally, our build would include these results in an email. In the event of a failure, we wanted to be able to quickly drill down to see the cause.
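The kind of quick-read summary we wanted in the build email can be sketched like this. The `TestResult` shape and field names are assumptions for illustration, not the output format of any particular tool: a one-line pass/fail count up front, with failure details underneath for drilling down.

```python
from typing import List, NamedTuple

class TestResult(NamedTuple):
    # Invented result record: a name, a pass/fail flag, and the
    # failure message used for drilling down to the cause.
    name: str
    passed: bool
    detail: str = ""

def summarize(results: List[TestResult]) -> str:
    """Build an email-ready summary: counts first, then failure causes."""
    failures = [r for r in results if not r.passed]
    lines = [f"{len(results) - len(failures)} passed, {len(failures)} failed"]
    for f in failures:
        lines.append(f"FAIL {f.name}: {f.detail}")
    return "\n".join(lines)
```

The build would mail this string after every run; a nonzero failure count means the first line alone tells the reader something is wrong.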
Platform is an obvious consideration. Our build runs on Linux, and our application was running on Linux, Solaris and Windows at the time. Any test tools that, for example, only ran on Windows did not have much appeal.
Based on all these needs, we started searching for tools. Our whole team takes responsibility for quality and testing, so we all needed to agree on our automation approach and tools. Having programmers, testers, database specialists and system administrators collaborate on test automation brings a variety of skills to bear on finding the best solutions. I highly recommend taking a ‘whole team’ approach to deciding on a test automation strategy and to choosing and implementing tools.