
High-Performance Testing

By: Scott Barber

Introduction
As an activity, performance testing is widely misunderstood, particularly by executives and managers. This misunderstanding can cause a variety of difficulties, including outright project failure. This article details the topics I find myself teaching executives and managers time and time again. Learning, understanding, and applying this knowledge on your performance testing projects will put you on the fast track to success.

Insist on Experience
Experienced performance testers will speak your language and guide you through the process of meeting your goals, even if you can't yet verbalize those goals. They know how to relate to executives in terms of business risks, short- and long-term costs, and the implications of schedule adjustments, and they know how to explain their trade without jargon and techno-babble. Experienced performance testers are used to explaining the relevance of the latest "performance buzz" to your system. They have spent years learning how to extract performance goals from words such as "fast," "maximum throughput," and "xxxx concurrent users," none of which has meaning in isolation.

Consider this example: An executive dictates that "each page will display in under x seconds, 100 percent of the time." While this is both quantifiable and testable, it has no meaning on its own. It is the job of the performance tester to define the conditions under which the goal applies, in other words, to determine the goal's context. To have meaning, this goal must address such things as client connection speed, the number of people accessing the site, and the types of activities those people are performing, all variables that affect page response time. A more complete goal would take this form:

Using the "peak order" workload model, under a load of 500 hourly users, static pages will display in under x seconds, dynamic pages in under y seconds, and search/order pages in under z seconds, 95 percent of the time, with no errors, when accessed via the corporate LAN.
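
An experienced tester can also turn a goal like this into a simple pass/fail check over measured response times. The following is a minimal sketch in Python of that arithmetic; the threshold values and sample measurements are hypothetical placeholders standing in for the x, y, and z above and for real test-run data:

    import math

    def percentile(samples, pct):
        """Nearest-rank percentile of a list of samples."""
        ordered = sorted(samples)
        rank = max(1, math.ceil(pct / 100.0 * len(ordered)))
        return ordered[rank - 1]

    # Hypothetical 95th-percentile thresholds, in seconds, per page class,
    # standing in for the x, y, and z in the goal above.
    thresholds_s = {"static": 2.0, "dynamic": 4.0, "search_order": 6.0}

    # Hypothetical response times (seconds) measured under the
    # "peak order" workload model at the 500-hourly-user load.
    measured_s = {
        "static": [0.8, 1.1, 1.9, 0.7, 1.6],
        "dynamic": [2.5, 3.9, 4.4, 3.1, 3.8],
        "search_order": [5.0, 5.9, 5.8, 4.8, 5.5],
    }
    error_count = 0  # "with no errors" is part of the goal, not just speed

    for page_class, samples in measured_s.items():
        p95 = percentile(samples, 95)
        limit = thresholds_s[page_class]
        verdict = "PASS" if p95 <= limit and error_count == 0 else "FAIL"
        print(f"{page_class}: 95th percentile = {p95:.1f}s "
              f"(goal <= {limit:.1f}s) -> {verdict}")

Reported this way, the results show at a glance which page classes meet the goal, which miss it, and by how much.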

Experienced performance testers also know how to collect and present data in ways that show whether the system is meeting or missing its goals, in which areas, and by how much, without requiring the viewer of this data to have any special technical knowledge of the system under test.

Notice that I use the term "goal" instead of "requirement" when speaking about performance. I do this because I have never been involved in a performance testing project that delayed or canceled a release because performance test results failed to meet the stated criteria. I also choose the term "goal" because virtually no one expects performance to be as good as desired prior to Release One. What people hope for is "good enough for now." There is an assumption that performance will be improved during testing, that the production environment will resolve the performance issues detected in the test environment, and/or that adoption will be gradual enough to deal with performance problems as they arise in production. An experienced performance tester will be able to help you convert your feelings about performance into goals and project plans. Above all, performance testers want you, the executive, to understand performance testing so you can make sound, informed decisions.

As an executive, you have several important decisions to make about an application during the development lifecycle. Most of these decisions center on three fundamental questions:

* Does it meet the need/specification for which it was developed?
* Will the application function adequately for the users of the system?
* Will users of the system be frustrated by it?

The experienced performance tester knows the importance of these questions and their answers, and will work with you (literally by your side at times) to help you answer them in terms of performance.

First and foremost, make it known that you expect experienced performance testers on your projects, not "fools with tools," as some folks refer to them. Set the expectation early that the performance tester will interact with you directly and that his job is to provide you with the information you need to make sound business decisions about performance issues and risks. Always make a point of personally reviewing performance goals to ensure they contain enough context to be meaningful for executive-level decision making.

Review the performance test plan and deliverables and ask yourself the following questions:
* Will this assist with "go-live" decisions?
* Is it likely that the results from this plan could lead to a better experience for the end-user?
* Is this likely to be representative of the actual production environment?
* Is this likely to be useful to developers if tuning is necessary?
* Will it provide an answer to each specific requirement, goal, or service level agreement?
* Is taking action based on the results part of the plan?

Finally, invite the performance tester to educate you along the way. In helping you to expand your knowledge about performance testing, the tester will gain a wealth of knowledge about what is most important to you in terms of performance. This mutual understanding and open communication are the best things that can happen to the overall performance of a system.

Begin Performance Testing Before the Application Is Fully Functional
There is a common perception that performance testing can't effectively start until the application is stable and mostly functional, meaning that performance test data won't be available until well into a beta or qualification release. This leaves virtually no time to react if, or more realistically when, the results show that the application isn't performing up to expectations.

In actuality, an experienced performance tester can accomplish a large number of tasks and generate a significant amount of useful data even before the first release to the functional testing team. He can create and gain approval of User Community Models and test data, and can gather statistics such as the following (a sketch of one such early measurement appears after the list):
* Network and/or Web server throughput limits
* Individual server resource utilization under various loads
* Search speeds, query optimization, table/row locking, and speed versus capacity measurements for databases
* Load balancing overhead/capacity/effectiveness
* Speed and resource cost of security measures
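
For illustration, here is a minimal sketch of one such early measurement: a throughput probe that drives concurrent requests at a single Web server before any application code sits on top of it. The URL and load parameters are hypothetical placeholders, and a real engagement would use a dedicated load-generation tool rather than ad hoc scripting:

    import time
    from concurrent.futures import ThreadPoolExecutor
    from urllib.request import urlopen

    URL = "http://test-server.example.com/static/index.html"  # hypothetical
    CONCURRENT_USERS = 25     # threads issuing requests in parallel
    REQUESTS_PER_USER = 40    # requests each thread sends

    def one_user(_):
        """Issue a burst of requests and count the successes."""
        ok = 0
        for _ in range(REQUESTS_PER_USER):
            try:
                with urlopen(URL, timeout=10) as resp:
                    if resp.status == 200:
                        ok += 1
            except OSError:
                pass  # count as an error; a real test would log details
        return ok

    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        successes = sum(pool.map(one_user, range(CONCURRENT_USERS)))
    elapsed = time.perf_counter() - start

    total = CONCURRENT_USERS * REQUESTS_PER_USER
    print(f"{successes}/{total} requests succeeded in {elapsed:.1f}s "
          f"-> ~{successes / elapsed:.0f} requests/second")

Repeating the probe at increasing concurrency levels sketches out the server's throughput curve and shows where it flattens.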

Some developers and system architects argue that the majority of these tasks belong to them, but developers rarely have the ability to generate the types of load needed to complete the tasks effectively. Adding the performance tester to these tasks early on will minimize the number of surprises and provide foundational data that will greatly speed up the process of finding root causes and fixing performance issues detected late in the project lifecycle.

This one is pretty obvious. Plan to have a performance tester assigned to the project from kickoff through rollout. Encourage the development team to use the tester's skills and resources as a development tool, not just as a post-development validation tool. It is worth noting that, depending on the project, the performance tester spends between 50 and 100 percent of his time on performance-related activities. The upside is that, because of the skill set noted in the "Skills and Experience" sidebar, this individual can be fully utilized as a supplemental member of virtually every project team. There is one caveat: Make it clear that performance testing is this person's primary responsibility, not an additional duty. This distinction is critical because "crunch time" for performance testing typically coincides with "crunch time" for most of the other teams with which the performance tester may be working.

Don't Confuse Delivery with Done
"Delivery" is an informed decision based on risks and should not be confused with "done." Anyone who has been around testing for a while knows that the system will be deployed when management thinks holding up the release is riskier than releasing it, even if that means releasing it with unresolved or untested performance issues. However, releasing the software is no reason to stop performance testing.

Most applications have a rollout plan or an adoption rate that ensures that the peak supported load won't occur until a significant period of time after the go-live day. That is prime time to continue performance testing: there are fewer distractions, actual live usage data is available in place of predictions or estimates, performance can be observed on actual production hardware, and more resources are often available. If there isn't a maintenance release scheduled soon enough to get post-release fixes into production before usage reaches the performance limit, it is surely more cost-effective to schedule one than to contend with a performance issue when it presents itself in production.
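
As an example of putting live usage data to work, the sketch below derives the peak hourly request volume from a production access log so the workload model can be refreshed with observed, rather than estimated, numbers. The log path and timestamp format are hypothetical and depend on your server's configuration:

    import re
    from collections import Counter

    LOG_PATH = "access.log"  # hypothetical path to a production log
    # Matches the bracketed timestamp in common log format, e.g.
    # [10/Oct/2023:13:55:36 +0000] -> captures "10/Oct/2023:13"
    STAMP = re.compile(r"\[(\d{2}/\w{3}/\d{4}:\d{2})")

    hourly = Counter()
    with open(LOG_PATH) as log:
        for line in log:
            match = STAMP.search(line)
            if match:
                hourly[match.group(1)] += 1

    peak_hour, peak_count = max(hourly.items(), key=lambda kv: kv[1])
    print(f"Peak observed load: {peak_count} requests during {peak_hour}:00")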

Plan to continue performance testing after the initial release. Plan to push maintenance releases with performance enhancements prior to the first expected load peak. Incorporating these plans into the project plan from the beginning allows you to release software when performance is deemed acceptable for early adopters rather than holding up releases until the performance is tuned for a larger future load.

Look for Skills and Experience
Quality performance testers are senior members of the project team in terms of depth and breadth of skills and experience. You probably have read similar statements before, but I assure you that this isn't "the same ol' message in a new suit." When asked, "What skills should I look for in a performance testing candidate?" I reply, "What you want is a mid-level everything." The "Skills and Experience" sidebar lists specific skills to look for in a performance tester.

This really comes down to employee/consultant selection and training, over which you likely have significant influence. In my experience, a top-notch performance testing candidate should be able to field many, if not most, of the questions an interviewer would ask of a mid-level developer, DBA, or systems administrator. Obviously, this is not the expectation for most functional testing candidates. For instance, when interviewing a functional test candidate, you may ask questions such as, "How comfortable are you working directly with code?" When interviewing a performance test candidate, it is completely appropriate to ask questions such as, "What was the most complex custom function you ever had to code to enable your load generation script to accurately represent the expected usage scenario? What made the function complex? Do you still have the code, and/or could you re-create it easily?"
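
To make that contrast concrete, here is a minimal sketch of the kind of custom function such a question probes for: a randomized think-time helper that makes simulated users pause the way real users do instead of at fixed intervals. The distribution and parameters are illustrative assumptions, not taken from any particular tool:

    import random
    import time

    def think_time(median_s=5.0, sigma=0.5, cap_s=30.0):
        """Sleep for a randomized, realistically skewed 'user think time'.

        Draws from a log-normal distribution (median = median_s seconds),
        capped so one unlucky draw can't stall a virtual user forever.
        """
        delay = min(random.lognormvariate(0, sigma) * median_s, cap_s)
        time.sleep(delay)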
