By: Kelvin Kam
Performance testing is widely misunderstood, particularly by executives and managers; these misunderstandings can cause a variety of difficulties, including project failure. Experienced performance testers are a necessity for any testing venture: they will speak your language and guide you through the process of meeting your goals, even if you’re as yet unable to articulate what those goals are. They not only know how to relate to executives in terms of business risks, short- and long-term costs, and implications of schedule adjustments, but they also know how to explain their testing. For example: “Each page will display in under x seconds, 100 percent of the time.” While this is both quantifiable and testable, it has no meaning on its own. It is the job of the performance tester to define the conditions under which the goal applies; i.e., to determine the goal’s context. Therefore, this goal must address such things as client connection speed, the number of people accessing the site, and the types of activities those people are performing. These are all variables that affect page response time. A more complete goal would be:
Under a load of 1,000 hourly users, static pages will display in under x seconds, dynamic pages in under y seconds, and search/order pages in under z seconds, 98 percent of the time, with no errors, when accessed via the corporate LAN.
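A goal stated this way can be checked mechanically. The sketch below is illustrative only: the `meets_goal` function, the `THRESHOLDS` table (standing in for the x/y/z seconds above), and the sample timings are all invented, but the pass/fail logic mirrors the goal's "98 percent of the time with no errors" wording.

```python
from statistics import quantiles

# Hypothetical per-page-type thresholds, standing in for x/y/z seconds.
THRESHOLDS = {"static": 1.0, "dynamic": 2.5, "search": 4.0}

def meets_goal(samples, page_type, percentile=98):
    """samples is a list of (response_time_seconds, succeeded) pairs.
    The goal passes only if no request errored and the chosen
    percentile of response times is under the page-type threshold."""
    times = [t for t, ok in samples if ok]
    if len(times) < len(samples):   # "with no errors": any failure fails the goal
        return False
    # quantiles(..., n=100) yields the p1..p99 cut points; index 97 is p98.
    p98 = quantiles(times, n=100)[percentile - 1]
    return p98 <= THRESHOLDS[page_type]

# 50 simulated static-page timings between 0.20 s and 0.69 s, all successful.
samples = [(0.20 + i * 0.01, True) for i in range(50)]
print(meets_goal(samples, "static"))   # True
```

In a real engagement the samples would come from the load generation tool's results log rather than a list literal, but the acceptance criterion itself stays this simple once the goal is fully specified.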
Experienced performance testers also know how to collect and present data in ways that show whether the system is missing or meeting goals, in what areas, and by how much, without requiring the viewer of this data to have any special technical knowledge of the system under test.
An experienced performance tester will be able to help the executive to convert their feelings about performance into goals and project plans. They will also assist team members in understanding performance testing so they can make sound, informed decisions. There are several important decisions to make about an application during the development lifecycle based on three fundamental questions:
- Does it meet the need/specification for which it was developed?
- Will the application function adequately for the users of the system?
- Will the user of the system be frustrated by using it?
The experienced performance tester knows the importance of these questions – and their answers – and will work with you (literally by your side at times) to help you answer the questions in terms of performance.
First of all, you must set the expectation early that the performance tester will interact with you, and that their job is to provide you with the information you need to make decisions about performance issues and risks. Always make sure their reports and deliverables contain enough context to be meaningful for executive-level decision making.
Review the performance test plan and deliverables and ask yourself the following questions:
- Will this assist with “go-live” decisions?
- Is it likely that the results from this plan could lead to a better experience for the end-user?
- How likely is this to be representative of the actual production environment?
- Is this likely to be useful to developers if tuning is necessary?
- Will it provide an answer to each specific requirement, goal, or agreement?
Finally, invite the performance tester to educate you along the way. In helping you to expand your knowledge about performance testing, the tester will gain a wealth of knowledge about what is most important to you in terms of performance. Mutual understanding and open communication are the best things that can happen to the overall performance of a system.
There is a common perception that performance testing can’t effectively start until the application is stable and mostly functional – meaning that performance test data won’t be available until significantly into a beta or qualification release. This leaves virtually no time to react if (or more realistically, when) the results show that the application isn’t performing up to expectations.
Actually, an experienced performance tester can accomplish a large number of tasks and generate a significant amount of useful data even before the first release to the functional testing team. They can create test data and can gather these kinds of statistics:
- Network and/or Web server throughput limits
- Individual server resource utilization under various loads
- Load balancing overhead / capacity / effectiveness
- Speed and resource cost of security measures
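To illustrate the kind of number a tester can produce this early, consider throughput limits. Little's law relates concurrency, latency, and throughput, so even a crude latency measurement against a single server yields a defensible ceiling estimate. The figures below are invented for illustration:

```python
# Little's law: concurrency = throughput x latency.  Rearranged, a pool of
# N concurrent users with a given average response time can sustain at most
# roughly N / latency requests per second.  All figures below are invented.

concurrent_users = 200      # size of the simulated user pool
avg_latency_s = 0.25        # measured average response time, in seconds

max_throughput = concurrent_users / avg_latency_s
print(f"throughput ceiling: ~{max_throughput:.0f} requests/second")
# prints "throughput ceiling: ~800 requests/second"
```

A back-of-the-envelope figure like this, produced months before feature completion, is exactly the foundational data that speeds up root-cause analysis later.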
Some developers and system architects argue that the majority of these tasks belong to them, but developers rarely have the ability to generate the types of load needed to complete the tasks effectively. Adding the performance tester to these tasks early on will minimize the number of surprises and provide foundational data that will greatly speed up the process of finding root causes and fixing performance issues detected late in the project lifecycle.
Plan and assign a performance tester to the project from kick-off to roll out. Encourage the development team to use the tester’s skills and resources as a development tool, not just as a post-development validation tool. It is worth noting that, because of the skill set, this individual can be fully utilized as a supplemental member of virtually every project team. There is one caveat: Make it clear that performance testing is this person’s primary responsibility, definitely not an additional duty. This distinction is critical because “crunch time” for performance testing typically coincides with “crunch time” for most of the other teams with which the performance tester may be working.
Delivery ≠ Done
“Delivery” is a decision based on risks and should not be confused with “done.” The system will almost certainly be deployed when management thinks holding up the release is riskier than releasing it, even if that means releasing it with unresolved or untested performance issues. However, releasing the software is no reason to stop performance testing.
Most applications have a rollout plan or an adoption rate that ensures that the peak supported load won’t occur for a significant period of time after the go-live day. That is the prime time to continue performance testing. Some of the benefits of this are fewer distractions, the existence of actual live usage data rather than predictions or estimations, the ability to observe performance on actual production hardware, and often the availability of more resources.
It is important that performance testing continue even after the initial release; you should plan to push maintenance releases with performance enhancements prior to the first expected load peak. Incorporating these actions into the project plan from the beginning allows you to release software when performance is deemed acceptable for early adopters rather than holding up releases until the performance is tuned for a larger future load.
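Live usage data also makes it straightforward to locate the actual peak load rather than a predicted one. A minimal sketch (the `peak_hour` helper, the log lines, and their Common-Log-style format are hypothetical) that counts requests per hour from access-log timestamps:

```python
from collections import Counter

# Hypothetical access-log lines; only the hour in the timestamp matters here.
log_lines = [
    '10.0.0.1 - - [12/Mar/2024:09:15:02 +0000] "GET / HTTP/1.1" 200 512',
    '10.0.0.2 - - [12/Mar/2024:09:48:11 +0000] "GET /search HTTP/1.1" 200 2048',
    '10.0.0.3 - - [12/Mar/2024:14:02:55 +0000] "POST /order HTTP/1.1" 200 128',
]

def peak_hour(lines):
    """Count requests per calendar hour and return the busiest (hour, count)."""
    # The 14 characters after '[' are the date plus the hour, e.g. '12/Mar/2024:09'.
    hours = Counter(line.split("[", 1)[1][:14] for line in lines)
    return hours.most_common(1)[0]

print(peak_hour(log_lines))   # ('12/Mar/2024:09', 2)
```

Feeding observed peaks like this back into the load model replaces estimation with measurement, which is precisely the advantage of post-release testing.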
Skills and Experience
A top-notch performance testing candidate should be able to answer many, if not most, of the questions that an interviewer would ask of a mid-level developer, DBA, or systems administrator. Questions such as: “What was the most complex custom function you ever had to code to enable your load generation script to accurately represent the expected usage scenario? What made the function complex? Do you still have the code, and/or could you re-create it easily?”
Performance testing requires a unique skill set that is above and beyond what is generally expected of members of a functional testing team and, in many cases, broader than what is generally expected of development team members. Experience and skills include:
- User community modeling and simulation
- Communication protocol of the system under test (e.g., HTTP/S for Web applications)
- Networking/network architecture (e.g., MCSE for Microsoft networks)
- Statistics (for meaningful presentation and interpretation of data)
- Graphical presentation of complex information
- Application exploration
- Reliability, stability, and security testing
- System monitoring/administration (e.g., memory usage, disk I/O)
- Business analysis
- Human factors/usability studies
- Programming languages (specifically the languages of the load generation tool and the application under test)
- Database design/administration
These are the skills expected of a senior performance tester across a range of software projects.
Tool-Driven Test Design
The most common mistake made by managers and testers alike is what people refer to as “tool-driven test design.” Having agreed that effective performance testing requires a load generation/simulation tool, we need to beware of building tests around whatever is easy to do with the tool, or whatever the vendor claims the tool does well. All of the tools on the market – including open source tools – were developed either to test a wide variety of applications or to test one specific application. Either way, they were not developed to test your application.
It is very important either to hire the performance tester first and then provide that person with the tool they recommend for the job, or to select a tool first and then hire a performance tester with significant experience and success using that tool.
To avoid the “tool-driven test design” trap, team leaders must listen and observe. Even the best performance testers can develop blind spots when it comes to their favorite tool. If you hear things such as “but the tool can’t do X” or “Y will take too long with the tool,” remind the team of the initial instructions and make a note to discuss acceptable alternatives at a later time.
The Finish Line
From the start, executives and managers set the pace for the performance testing component of their projects. This responsibility carries with it a need to be educated about performance testing: what it is, what it isn’t, and what constitutes reasonable expectations.