Page « 1 2 (3) 4 5 »
An Agile Tool Selection Strategy for Web Testing Tools
on 2007/10/24 5:50:00

Vendor Tools

We desperately needed to get some automated smoke tests going, and thought maybe we could buy a tool that we could run with. As a tester, I have extensive experience with the capture/playback/scripting type of automation tools. Vendor tools are often a safe choice. They’re generally designed for non-technical users, who can get started fairly easily. They come with user manuals and installation instructions. Training and technical support are available for a fee.

Commercial tools are often integrated with a suite of complementary tools, which can be an advantage. Many vendors offer functional, performance and load test tools, with impressively robust features. However, they tend to be targeted towards test organizations, and aren’t very ‘programmer-friendly’. They can be difficult to integrate into a continuous build process. They often have proprietary scripting languages, or are limited to one scripting language such as JavaScript.

I had previously used Mercury test tools, so QuickTest Pro was one option to consider among many commercial tools. Other vendor tools we could have considered, or could consider if we were looking today, are TestPartner, Rational Functional Tester, SilkTest, BadBoy, TestMaker and (one I have always wanted to try) LISA. We also tried out Seapine’s QA Wizard, since we used their defect tracking tool TestTrack. At that time, both of those capture/playback tools used proprietary scripting languages. We didn’t want to be limited to only capture/playback, as those scripts can be more work to maintain. The programmers on my team didn’t want to have to learn a new tool or new scripting language. That ruled out all the vendor tools we looked at.

Open Source Tools

We turned to open source tools. Since these were generally written by programmers to satisfy their own requirements, they’re likely to be programmer-friendly. But there are issues with open source tools as well. Support, for example. If you have a question about an open source tool, whom do you ask? Most of the open source tools we considered had mailing lists where users and developers shared information and helped each other. I checked each tool’s mailing list to see if there was lots of activity on a daily or weekly basis, hoping to see a large and active user base. I also tried to find out if users’ issues were addressed by the tool’s developer community, and whether new versions were released frequently with enhancements and fixes. Of course, with open source tools you’re free to add your own fixes and enhancements, but we knew we wouldn’t have the bandwidth to do this at first.

Open source tools have a wide range of learning curves. Some assume programming proficiency. This is fine if everyone using the tool is able to achieve that level of competence. Others are geared to less technical users, such as testers and analysts. Some have user documentation on a par with (or even better than) that of good vendor tools, and others leave the learning more up to the user. Some even have bug tracking systems, and the developers actually fix bugs! Think about the level of support and documentation you will need, and find a tool that provides it. We were looking for a tool that came with a lot of help.

Our GUI Tool Search

One example of a tool we researched, but didn’t try out, was JWebUnit. Since this tool lets you create scripts with Java, it appealed to the programmers on the team. At the time (2003), it didn’t seem to have as many users or as much mailing list activity as other open source test tools. We considered other Java-based tools, such as HtmlUnit, which is widely used. I had used a similar tool, HttpUnit, before, and had liked it well enough. However, all these Java-based tools were a problem for my severely limited Java coding skills. While we anticipated that the programmers would do a large percentage of automating the customer-facing tests, I needed to write most of the GUI test scripts. I wanted to get a smoke test suite to cover the critical functionality of the legacy system, while the programmers got traction on the unit test side. We needed a programmer-friendly tool, but also a Lisa-friendly tool.

We considered scripting tools such as Watir and scripting languages such as Ruby. I’d used Tcl to write test scripts on a prior team, and I like the flexibility of scripting languages. We didn’t have any Ruby experts on the team, and although I was eager to learn it, the time it would take was an obstacle.

We looked for something that required less OO programming proficiency. I’d heard good things about Selenium, but at the time it had some limiting factors, such as being difficult to integrate into our build process. Another tool that would have been a strong contender, but either it wasn’t available yet or I just didn’t know about it, is Jameleon.

After much research, we decided to try Canoo WebTest. This tool, based on HtmlUnit, uses XML to specify tests, and the scripts run via Ant. Since we use CruiseControl for our builds, it was simple to integrate the WebTest scripts with our continuous build. Being used to the features found in commercial tools, WebTest at first seemed a bit simplistic to me. At the time, it didn’t support conditional (‘if’) logic, except by embedding scripts written in Groovy or another scripting language. It didn’t look easy to do data-driven tests with WebTest. However, we liked the idea that the tests would be so simple and straightforward, we wouldn’t have to test our test scripts. WebTest seemed a good choice for creating a smoke test suite.
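To give a flavor of how declarative these tests are, here is a minimal sketch of a WebTest script: an Ant build file whose target contains a webtest task with a sequence of steps. The host, basepath, page names and form field names are all hypothetical, and the exact step-element names (lowercase vs. camelCase) vary between WebTest versions, so treat this as illustrative rather than copy-and-paste:

```xml
<project name="smoke" default="test">
  <!-- the webtest task comes from the Canoo WebTest distribution -->
  <target name="test">
    <webtest name="loginSmokeTest">
      <!-- host, basepath and field names below are made up for this example -->
      <config host="localhost" port="8080" protocol="http" basepath="myapp"/>
      <steps>
        <invoke description="open the login page" url="login"/>
        <verifytitle description="check we landed on the login page" text="Login"/>
        <setinputfield description="enter user name" name="username" value="smoketester"/>
        <setinputfield description="enter password" name="password" value="secret"/>
        <clickbutton description="submit the login form" label="Log In"/>
        <verifytext description="confirm login succeeded" text="Welcome"/>
      </steps>
    </webtest>
  </target>
</project>
```

Because each step carries a description and the whole test is just a list of invoke/verify steps, there is very little logic in the script itself to go wrong.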

The programmers were comfortable with WebTest, since they were all familiar with XML. If a test failed, they’d be able to understand the script well enough to debug the problem. They could easily update or write new tests. We decided to try it for a few iterations. Since the programmers were busy with learning how to automate unit tests, it was helpful to have a GUI test tool that was easy for me to learn. I implemented it with some help from our system administrator. We soon had two build processes, one running all the unit tests, and the other running the slower WebTest scripts. It took about eight months to complete enough scripts to cover the major functionality of the application. These scripts have caught many regression bugs, and continue to catch them today. The return on our investment has been awesome.
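Since the WebTest scripts are driven by Ant, hooking them into CruiseControl amounts to pointing a project at the build file. A rough sketch of what the relevant piece of a CruiseControl config.xml might look like follows; the project name, paths and interval are invented for illustration:

```xml
<cruisecontrol>
  <project name="webtest-build">
    <!-- poll version control for changes; working-copy path is illustrative -->
    <modificationset quietperiod="60">
      <cvs localworkingcopy="checkout/myapp"/>
    </modificationset>
    <!-- run the slower GUI tests on their own schedule, separate from the unit-test build -->
    <schedule interval="300">
      <ant buildfile="checkout/myapp/webtest/build.xml" target="test"/>
    </schedule>
    <log dir="logs/webtest-build"/>
  </project>
</cruisecontrol>
```

Running the WebTest project separately from the unit-test project is what let us keep fast feedback from the unit tests while the slower GUI suite churned in the background.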
