
Testing in the Fast Lane: Automating Acceptance Testing in an Extreme Programming Environment

By: Lisa Crispin

Abstract
In eXtreme Programming Explained [1], Kent Beck compares eXtreme Programming (XP) to driving a car: the driver needs to steer and make constant corrections to stay on the road. If the customer is steering the car, the XP tester is navigating. Someone needs to plot the course, establish the landmarks, keep track of the progress, and perhaps even ask for directions. This is all complicated by the need for speed. XP testers have to drive in the fast lane. To be the windshield and not the bug, you need a lightweight automated test design and lightweight test tools.

This paper covers experiences gained with acceptance test automation during a 10-month period at a Java outsourcing company using XP for all software development. A team of up to nine developers and one tester completed a half-dozen projects during this time.

This paper covers:
• Why you should automate acceptance tests
• How we designed automated tests that are low-maintenance and self-verifying
• How we coded and implemented automated tests quickly and fearlessly, illustrating the techniques with an example
• Selecting or developing tools to assist in test automation
• How we applied the values of XP to test automation

The lightweight test design discussed here could be implemented with any automated test tool that permits modularized scripts. It allowed our acceptance testing to keep pace with the rapid iterations of XP.
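The paper's own scripts are not reproduced here, but to make the idea concrete, the following is a minimal sketch in Java of a modularized, self-verifying acceptance test. The AccountModule name is hypothetical, standing in for whatever reusable action module a team's own tool provides; the JUnit test verifies its own expected results so that no one has to inspect the output by hand.

import junit.framework.TestCase;

// Hypothetical reusable module: each acceptance-level action lives in one
// place, so an application change is absorbed here rather than in every script.
class AccountModule {
    private double balance;

    void deposit(double amount) { balance += amount; }

    void withdraw(double amount) {
        if (amount > balance) {
            throw new IllegalArgumentException("insufficient funds");
        }
        balance -= amount;
    }

    double balance() { return balance; }
}

// Self-verifying acceptance test: the expected results are coded into the
// test itself, so each run reports a plain pass or fail.
public class WithdrawalAcceptanceTest extends TestCase {
    public void testWithdrawalReducesBalance() {
        AccountModule account = new AccountModule();
        account.deposit(100.00);
        account.withdraw(40.00);
        assertEquals(60.00, account.balance(), 0.001);
    }

    public void testOverdraftIsRejected() {
        AccountModule account = new AccountModule();
        account.deposit(10.00);
        try {
            account.withdraw(50.00);
            fail("expected the overdraft to be rejected");
        } catch (IllegalArgumentException expected) {
            assertEquals(10.00, account.balance(), 0.001);
        }
    }
}

Because each user-level action is encapsulated in one module, the scripts that reuse it stay low-maintenance when the application changes.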

Introduction
The three XP books give detailed explanations of many aspects of the development side of XP. The test engineer coming from a traditional software development environment may not find enough direction on how to effectively automate acceptance tests while keeping up with the fast pace of an XP project. In an XP team, developers are also likely to find themselves automating acceptance tests – an area where they may have little experience. Automating acceptance testing in an XP project may feel like driving down a 12% grade in a VW bug with a speeding semi in the rear-view mirror. Don’t worry – like all of XP, it requires courage, but it can – and should – be fun, not scary. Test automation requires programming. When developing automated test scripts, we benefited from following XP practices: simple design, refactoring, pairing, coding standards. We’ll explore some test automation designs, principles and practices that helped us complete our XP projects successfully. Our experiences may help you accelerate smoothly and safely into the XP fast lane.

Automating Acceptance Tests

Why Automate Acceptance Tests?
Because you can’t afford not to! Although automation of unit tests is a given in XP, the case for automating acceptance tests has not been made as strongly. The inherent “messiness” of acceptance tests and the difficulty of automating at the user level are probably the main reasons for this.

One of the reasons acceptance testing is messy is that the expected pass rate is not 100% until the very end of the iteration, when all the stories have been implemented. Prior to that, it is necessary to judge your progress based on partial pass rates.

So after running the acceptance tests you have to do additional work to decide whether the partial score is better or worse than the previous one. And if you do determine that there is a problem, the fact that a single acceptance test case often relies on many different objects means that when it fails, it is not obvious which code needs fixing.
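To illustrate that bookkeeping, here is a small, hypothetical Java sketch (the counts are invented, not taken from the paper) that compares a run's partial pass rate against the previous run's score and flags a drop as a likely regression.

// Hypothetical sketch: compare this run's partial pass rate to the last run's.
public class PassRateTracker {
    public static void main(String[] args) {
        int previousPassed = 34, previousTotal = 50;  // last run's partial score
        int currentPassed  = 38, currentTotal  = 52;  // today's run, with new tests added

        double previousRate = (double) previousPassed / previousTotal;
        double currentRate  = (double) currentPassed  / currentTotal;

        System.out.println("previous pass rate: " + Math.round(previousRate * 100) + "%");
        System.out.println("current pass rate:  " + Math.round(currentRate * 100) + "%");

        if (currentRate < previousRate) {
            // A drop in the partial pass rate is the early warning described
            // above: something that used to work has probably regressed.
            System.out.println("WARNING: pass rate dropped -- investigate regressions");
        }
    }
}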

Unfortunately, these problems can cause an XP team to leave acceptance testing to the customer, where it becomes a tedious, expensive, and non-repeatable exercise that is performed only rarely, perhaps just at the very end of an iteration.

But these difficulties are in fact exactly the reason why automation of acceptance testing is so important. If done “right”, the investment in automation buys you the extra time and mind space to deal effectively with these ambiguities and the judgments you must make to resolve them. It provides advance warning, so that problems not disclosed by the unit tests can be investigated and resolved within the 40-hour week instead of in a mad dash at the end of the iteration. It also gives you the time to carefully plan and target tests that cannot practically be automated, in order to minimize the time and maximize the impact of any manual testing. And finally, since automated tests can be run much more frequently, the chances that they will pass when the customer is looking are much higher.


