An Agile Tool Selection Strategy for Web Testing Tools
By: Lisa Crispin
Selecting a test automation tool has always been a daunting task. Let’s face it, just the thought of automating tests can be daunting! The selection of tools available today, especially open source tools, is positively dazzling. In the past several years, "test-infected" developers, not finding what they need in the vendor tool selections, have created their own tools. Fortunately for the rest of us, many are generous enough to share them as open source. Between open source tools and commercial tools, we have an amazing variety from which to choose.
To avoid that deer-in-the-headlights feeling, consider taking an ‘agile’ approach to selecting web testing tools. Plan an automation strategy before you consider possible tool solutions. Start simple, and make changes based on your evolving situation. Here are some ideas based on my experiences with different agile (and not-so-agile!) development teams. Even if your team doesn’t use agile development practices, you’ll find some useful tips.
An Agile Test Automation Strategy
First of all, your team should consider its testing approach. When I say ‘team’, I’m thinking of everyone involved in developing and delivering the software, which in your case might be a virtual team. When do you write tests? Who writes them? How should the test results be delivered? Who needs to be able to look at the test results, and what should they be able to learn from them? What kinds of tests need to be automated, and when? Do you have other tedious tasks, such as populating test data or looking through version control system output, that you’d love to automate?
Back in 2003, my current team had no test automation at all, and a buggy legacy web-based J2EE application. We desperately needed to automate our regression tests, since the manual regression tests took the whole team a couple of days to complete, and we were delivering new code to production every two weeks. We had decided to start rewriting the system, developing new features in a new architecture, while maintaining the old code, but this would be impossible without a safety net of tests.
We committed to test-driven development for a number of reasons, one being that automated unit tests have the highest return on investment of any automated test. We went a step further and decided to also use ‘customer-facing’ tests and examples to help drive development. We’ve found that one example is worth pages of narrative requirements! We wanted to be able to write high-level, big-picture test cases before development starts, then write detailed executable test cases concurrently with development, so that when coding is finished, all the tests are passing.
Meanwhile, we required some kind of ‘smoke test’ regression suite for the legacy application, to make sure that critical parts kept working. Due to the old code’s architecture, we decided these would have to be done through the GUI. We wanted all of our tests to run during our continuous build process, which was automated using CruiseControl, so we’d have quick feedback on any regression failures.
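To make the idea concrete, here is a minimal sketch of this kind of GUI-level smoke check: fetch each critical page and confirm some expected text appears. The page paths, expected strings, and base URL are hypothetical examples, not from the article; the fetch function is injected so the check can run against a stub as well as a live server.

```python
import urllib.request

# Hypothetical critical pages and the text each should contain.
CRITICAL_PAGES = {
    "/login": "Sign In",
    "/account/summary": "Account Summary",
}

def smoke_check(fetch, pages=CRITICAL_PAGES):
    """Fetch each critical page and report pass/fail per page.

    `fetch` takes a path and returns the page body as a string.
    Any exception (server down, timeout) counts as a failure.
    """
    results = {}
    for path, expected_text in pages.items():
        try:
            body = fetch(path)
            results[path] = expected_text in body
        except Exception:
            results[path] = False
    return results

def http_fetch(path, base="http://localhost:8080"):
    """Fetch a page from the running application (assumed base URL)."""
    with urllib.request.urlopen(base + path, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")
```

A real suite would drive the browser (clicking links, submitting forms) rather than just fetching pages, but even a crude check like this catches the "critical page is down" class of regression on every build.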
Quick and easy-to-read notification of whether tests passed or failed was important to us. Ideally, our build would include these results in an email. In the event of a failure, we wanted to be able to quickly drill down to see the cause.
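The "quick to read, with drill-down on failure" idea can be sketched as a small summarizer: collapse the test results into a one-line subject for the build email, and include per-test detail only when something failed. The result format and wording here are illustrative assumptions, not the team's actual reports.

```python
def summarize(results):
    """Turn {test_name: passed} into an email (subject, body) pair.

    A passing build gets a short all-clear subject and an empty body;
    a failing build lists each failed test so readers can drill down.
    """
    failures = [name for name, passed in sorted(results.items()) if not passed]
    if not failures:
        return "BUILD OK: %d tests passed" % len(results), ""
    subject = "BUILD FAILED: %d of %d tests failed" % (len(failures), len(results))
    body = "\n".join("FAIL: " + name for name in failures)
    return subject, body
```

Keeping the summary to one line means anyone can tell the build status from the inbox, without opening the message.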
Platform is an obvious consideration. Our build runs on Linux, and our application was running on Linux, Solaris and Windows at the time. Any test tools that, for example, only ran on Windows did not have much appeal.
Based on all these needs, we started searching for tools. Our whole team takes responsibility for quality and testing, so we all needed to agree on our automation approach and tools. Having programmers, testers, database specialists and system administrators collaborate on test automation leverages a variety of skills to help get the best solutions. I highly recommend taking a ‘whole team’ approach to deciding on a test automation strategy and to choosing and implementing tools.
An Agile Tool Selection Strategy
The whole team approach means asking ourselves, "What skills do we have on our team?" Do any team members have extensive experience with particular test tools or types of test tools? What programming and scripting language competencies exist on the team? How much technical expertise do the testers have? How about the business people who might be reviewing or even helping to write tests? What types of tests are you automating: unit, integration, functional, security, or do you need to do performance or load testing? How robust do your test scripts need to be, and how much can you spend on maintenance? Are you planning to do data-driven or keyword-driven tests, where the tests accept a variety of input parameters and offer a lot of flexibility? Or are you looking for straightforward, low-maintenance tests? Can you test at a layer below the user interface, or do you have an architecture that makes that difficult? These are all considerations when shopping for a test tool.
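The data-driven style mentioned above can be sketched in a few lines: one test routine, many rows of input and expected data, so adding a case means adding a row rather than writing a new script. The discount rule and the data rows here are made-up examples, not from the article.

```python
# A hypothetical function under test: apply a percentage discount.
def apply_discount(order_total, discount_pct):
    if not 0 <= discount_pct <= 100:
        raise ValueError("discount_pct out of range")
    return round(order_total * (100 - discount_pct) / 100, 2)

# The test data table: (order_total, discount_pct, expected_result).
TEST_DATA = [
    (100.00, 0, 100.00),
    (100.00, 10, 90.00),
    (59.99, 25, 44.99),
]

def run_data_driven(test_data=TEST_DATA):
    """Run every data row through the same test logic; return failures."""
    failures = []
    for total, pct, expected in test_data:
        actual = apply_discount(total, pct)
        if actual != expected:
            failures.append((total, pct, expected, actual))
    return failures
```

The trade-off the article hints at is real: a table-driven harness like this is flexible but costs more to build and maintain than a handful of straightforward, hard-coded checks.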
With a variety of test needs, consider that you may need a variety of tools. We tried to keep an open mind on what might solve a particular automation problem, and we were willing to experiment. We’d pick a tool to try for a few iterations and see how we liked it. Getting up to speed on tools to the point where you can effectively evaluate them takes time, so be sure to budget plenty of time in your planning.