Monday, December 14, 2009

Best practices in performance testing

Essential Do's and Don'ts in performance testing

Please read my previous detailed posts on Web application performance testing methodology before moving on to this post.

1. Lab resources
It is important to match the architecture of the system between the QA lab and production. The lab can be a scaled-down version, in which case the results are analyzed and extrapolated to predict how the system will behave at production scale.

How to match production
  • Hardware specifications of the servers should match
  • Load-balanced architecture of the servers should be replicated in the lab
  • Database clustering/replication should match, including the type of replication set up in production
  • Software versions should match
  • Log level settings and log archival settings should match (otherwise production, with a more verbose log level or without sufficient archival, will die pretty soon, and you can never recreate that in the lab). A config-comparison sketch follows this list
  • Find creative ways to simulate hardware or other resources that cannot be matched apples to apples in the lab due to budget constraints. Understand the blueprint of the system to determine where these gaps matter
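
To make the "settings should match" items above less of a manual checklist, a small script can diff the relevant configuration between environments. Below is a minimal Java sketch; the file names lab-app.properties and prod-app.properties are hypothetical placeholders for wherever your lab and production settings (log levels, archival options, version strings, etc.) are exported.

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;
import java.util.TreeSet;

// Minimal sketch: flag configuration keys that differ between lab and production.
// The file paths below are hypothetical placeholders for your exported configs.
public class ConfigDiff {
    public static void main(String[] args) throws IOException {
        Properties lab = load("lab-app.properties");
        Properties prod = load("prod-app.properties");

        // Collect the union of keys from both environments
        TreeSet<String> keys = new TreeSet<>();
        keys.addAll(lab.stringPropertyNames());
        keys.addAll(prod.stringPropertyNames());

        for (String key : keys) {
            String labVal = lab.getProperty(key, "<missing>");
            String prodVal = prod.getProperty(key, "<missing>");
            if (!labVal.equals(prodVal)) {
                System.out.printf("MISMATCH %s: lab=%s prod=%s%n", key, labVal, prodVal);
            }
        }
    }

    private static Properties load(String path) throws IOException {
        Properties p = new Properties();
        try (FileInputStream in = new FileInputStream(path)) {
            p.load(in);
        }
        return p;
    }
}
```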

2. Result archival
The results from each test should be recorded and easily accessible for comparison with past and future runs. There is no way to determine improvement or degradation in performance without comparison data.
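
One lightweight way to do this is to append a one-line summary of every run to a single CSV file kept with the test assets. The sketch below is only an illustration: the scenario name, column set, and sample numbers are made up, and most tools (JMeter included) can export equivalent summaries that you would archive instead.

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.time.LocalDateTime;

// Minimal sketch: append a one-line summary of each test run to a CSV file
// so future runs can be compared against past ones.
// The file name and column set are assumptions; adapt them to your tool's output.
public class ResultArchiver {
    public static void main(String[] args) throws IOException {
        // Hypothetical run summary: scenario, users, avg response ms, throughput/s, error %
        archive("login-only", 50, 312.4, 158.7, 0.2);
    }

    static void archive(String scenario, int users, double avgMs,
                        double throughputPerSec, double errorPct) throws IOException {
        // Open in append mode so every run adds a row to the same history file
        try (PrintWriter out = new PrintWriter(new FileWriter("perf-results.csv", true))) {
            out.printf("%s,%s,%d,%.1f,%.1f,%.2f%n",
                    LocalDateTime.now(), scenario, users, avgMs, throughputPerSec, errorPct);
        }
    }
}
```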

3. Isolated testing
Exercise different actions or transactions in the application separately, as atomic actions, so that bottlenecks are easier to isolate; i.e., don't execute all actions on one or more page(s) sequentially in the same script.

Otherwise, different concurrent threads may end up executing different actions together, and it will not be possible to isolate which action caused the breakdown or bottleneck.

There are merits to running tests that exercise a combination of actions together, but such a mixed workload should be just one of the scripts/plans of execution, used to simulate real-world scenarios rather than to analyze the performance of a specific component.
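
To make the "one action per script" idea concrete, here is a minimal Java sketch of an atomic, single-action load script. The URL, thread count, and iteration count are placeholders; in practice the same isolation is usually done with a dedicated JMeter thread group or an httperf run, but the structure is the same: every thread exercises exactly one action, so any slowdown observed can only come from that action's code path.

```java
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

// Minimal sketch of an "atomic" load script: many threads exercising ONE action
// (a single URL) so that any bottleneck it exposes is attributable to that action.
// The URL, thread count, and iteration count are hypothetical.
public class SingleActionLoadTest {
    static final String ACTION_URL = "http://qa-lab-server/app/login";
    static final int THREADS = 25;
    static final int ITERATIONS_PER_THREAD = 40;

    static final List<Long> latenciesMs = new CopyOnWriteArrayList<>();

    public static void main(String[] args) throws InterruptedException {
        Thread[] workers = new Thread[THREADS];
        for (int i = 0; i < THREADS; i++) {
            workers[i] = new Thread(SingleActionLoadTest::runIterations);
            workers[i].start();
        }
        for (Thread w : workers) {
            w.join();
        }
        long total = latenciesMs.stream().mapToLong(Long::longValue).sum();
        System.out.printf("Requests: %d, avg latency: %.1f ms%n",
                latenciesMs.size(), (double) total / latenciesMs.size());
    }

    static void runIterations() {
        for (int i = 0; i < ITERATIONS_PER_THREAD; i++) {
            long start = System.currentTimeMillis();
            try {
                HttpURLConnection conn =
                        (HttpURLConnection) new URL(ACTION_URL).openConnection();
                try (InputStream in = conn.getInputStream()) {
                    while (in.read() != -1) { /* drain the response body */ }
                }
            } catch (Exception e) {
                System.err.println("Request failed: " + e.getMessage());
            }
            latenciesMs.add(System.currentTimeMillis() - start);
        }
    }
}
```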

4. Analyze bottlenecks

Beyond the action/transaction separation discussed above, it's also important to exercise different system components separately.
Examples of such isolation methods:

  • Exercising just the HTTP requests at the required transaction rates, without involving the GUI components. Command-line tools like httperf are very useful for these so-called "headless performance tests". Of course JMeter can also be used here, but recording a script, stripping out image links and other unneeded requests, and then playing back the bare script is usually not worth the effort in this context.
  • Exercising selected actions (as discussed above)
  • Only invoking the requests/transactions needed to execute a specific action; less is more!
  • Exercising database queries separately from the application layer. Determine the queries used by the application for each action (where applicable) and run them separately as JDBC transactions from the performance test tool (a JDBC timing sketch follows this list)
  • Exercising authentication/LDAP separately by using LDAP transactions (an LDAP bind-timing sketch follows this list). The time it takes to log in may have more to do with LDAP than with the login code in the application
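
As a companion to the JDBC point above, here is a bare-bones sketch of timing one application query directly against the database, with no application code in between. The connection URL, credentials, table, and query are placeholders for whatever your application actually issues; in JMeter the equivalent is a JDBC Request sampler.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Minimal sketch: time a query directly over JDBC, bypassing the application layer.
// The JDBC URL, credentials, and query are hypothetical placeholders; substitute
// the actual queries your application issues for the action under test.
public class DbQueryTimer {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:mysql://qa-db-server:3306/appdb";
        String query = "SELECT * FROM orders WHERE customer_id = 42";

        try (Connection conn = DriverManager.getConnection(url, "perfuser", "secret");
             Statement stmt = conn.createStatement()) {
            for (int i = 0; i < 100; i++) {
                long start = System.nanoTime();
                try (ResultSet rs = stmt.executeQuery(query)) {
                    int rows = 0;
                    while (rs.next()) {
                        rows++;   // walk the result set, as the application would
                    }
                    long elapsedMs = (System.nanoTime() - start) / 1_000_000;
                    System.out.printf("run %d: %d rows in %d ms%n", i + 1, rows, elapsedMs);
                }
            }
        }
    }
}
```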
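
Similarly, a tiny standalone bind timer can tell you whether slow logins are a directory problem or an application problem: if bind times here track the login response times seen in the full application, the bottleneck is LDAP rather than the login code. The server URL, DN, and password below are placeholders; JMeter also ships an LDAP Request sampler for the same purpose.

```java
import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.directory.DirContext;
import javax.naming.directory.InitialDirContext;

// Minimal sketch: time an LDAP bind on its own, separate from the application's
// login code. Server, DN, and credentials below are hypothetical placeholders.
public class LdapBindTimer {
    public static void main(String[] args) {
        for (int i = 0; i < 20; i++) {
            long start = System.currentTimeMillis();
            try {
                Hashtable<String, String> env = new Hashtable<>();
                env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
                env.put(Context.PROVIDER_URL, "ldap://qa-ldap-server:389");
                env.put(Context.SECURITY_AUTHENTICATION, "simple");
                env.put(Context.SECURITY_PRINCIPAL, "uid=testuser,ou=people,dc=example,dc=com");
                env.put(Context.SECURITY_CREDENTIALS, "secret");

                DirContext ctx = new InitialDirContext(env);   // the bind happens here
                ctx.close();
                System.out.printf("bind %d: %d ms%n", i + 1, System.currentTimeMillis() - start);
            } catch (Exception e) {
                System.err.println("bind failed: " + e.getMessage());
            }
        }
    }
}
```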

Related Posts to look forward to (coming soon to a browser near you):
Stress testing
Scalability testing
Performance test tool cheat sheets/Tool instructions

Your comments/suggestions will be much appreciated!
