This post goes out to those who approached me recently with the question "I would like to learn performance testing. How do I get started?"
It's a step by step approach to getting started on performance testing.
A couple of points may cause déjà vu from previous posts, but repeating them was a necessary evil in creating this summary guide.
Step 1: Identify tests !
Identify the performance test needs of the application under test (AUT). The AUT could be a web application, a guest OS, a host OS, an API, or just a client-server application.
Hints to identify crucial components:
1. Log user behavior in production to measure the most concurrently and frequently used features (a small sketch of this appears at the end of this step)
2. Identify brittle system components that may not scale well.
Dive deep into the system architecture diagrams to figure out these nuances. An example would be files or database tables that hold ever-increasing amounts of data.
Remember, throughput is just transactions per unit of time, and is not restricted to requests/sec. Your client-server application may have a file upload feature; there, throughput could be the amount of data uploaded, whether as a single file or many, per unit of time. For example, 50 uploads of 10 MB each completed in 100 seconds is a throughput of 5 MB/s.
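To make the first hint concrete, here's a minimal sketch that ranks the most frequently hit endpoints in a web server access log; the log path and the combined log format are assumptions for illustration, so adapt both to your setup:

```python
# Minimal sketch: rank the most frequently hit endpoints from an access log.
# The log path and combined log format are assumptions, not specifics from this post.
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # assumption: adjust to your server

counts = Counter()
with open(LOG_PATH) as log:
    for line in log:
        try:
            # The request line is the first quoted field: "GET /path HTTP/1.1"
            request = line.split('"')[1]
            method, path, _ = request.split(" ", 2)
            counts[(method, path.split("?")[0])] += 1
        except (IndexError, ValueError):
            continue  # skip malformed lines

for (method, path), hits in counts.most_common(10):
    print(f"{hits:8d}  {method:6s} {path}")
```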
Step 2: Match environments !
Compare your QA environment against production and match it as closely as possible. If your QA env is one that's rebuilt frequently, it would be wise to automate this comparison (a small sketch follows the list below).
Hints to compare key differentials:
1. System resources on the production server, like CPU, memory, hard disk, swap... you get my drift.
2. System specifics, like matching log levels and any other processes running on the server besides the application under test
3. Network and system architecture comparison, including load balancer settings, if implemented for any server
4. If the production system has redundant servers, the QA env should mimic that redundancy at least to scale, if not with the same server count
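One way to automate the comparison is to run the same fact-gathering script on the production and QA servers and diff the output. Here's a minimal sketch assuming Linux hosts; the /proc/meminfo parsing and the choice of facts are illustrative, not exhaustive:

```python
# Minimal sketch: dump key host facts as JSON so the same script can be run
# on the production and QA servers and the two outputs diffed.
# Assumes a Linux host (reads /proc/meminfo).
import json, os, platform, shutil

def meminfo_kb(key):
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith(key + ":"):
                return int(line.split()[1])
    return None

facts = {
    "hostname": platform.node(),
    "kernel": platform.release(),
    "cpu_count": os.cpu_count(),
    "mem_total_kb": meminfo_kb("MemTotal"),
    "swap_total_kb": meminfo_kb("SwapTotal"),
    "root_disk_total_bytes": shutil.disk_usage("/").total,
}
print(json.dumps(facts, indent=2, sort_keys=True))
```

Run it on both hosts, save the output to prod.json and qa.json, and diff the two files as part of your environment check.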
Step 3: Plan !
Now that you've identified what to test, write a detailed plan covering which features you intend to load/stress test, which features you intend to measure the performance of, the minimum throughput you will apply, etc. An earlier post explains this in detail.
Step 4: Automate !
Identify your automation needs based on the performance test plan.
If it's a web application where user behavior can be completely tracked via HTTP requests, use a tool like JMeter (refer to my earlier post on performance test tools for details)
To get started on JMeter, read this section of their user manual.
If not all of the user actions on the web application can be tracked via HTTP requests (client-side AJAX UI behavior), you may want to use UI automation tools that let you customize the code to run multi-threaded (hint: Selenium IDE and RC/Server).
If you are trying to increase concurrent calls to specific system methods/features, scripting multi-threaded behavior using the application's API calls is a good approach, and it implicitly load tests the API calls as well (a minimal sketch follows).
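Here's a minimal sketch of that last approach, driving concurrent calls against an API endpoint with a thread pool; the URL, thread count, and request count are hypothetical:

```python
# Minimal sketch: fire concurrent calls at an API endpoint and summarize latency.
# The URL, concurrency, and request count below are placeholders for illustration.
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "http://qa-server.example.com/api/health"  # hypothetical endpoint
CONCURRENCY = 20
REQUESTS = 200

def one_call(_):
    start = time.time()
    with urlopen(URL, timeout=30) as resp:
        resp.read()
        return resp.status, time.time() - start

with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    results = list(pool.map(one_call, range(REQUESTS)))

latencies = sorted(t for _, t in results)
errors = sum(1 for status, _ in results if status >= 400)
print(f"requests={len(results)} errors={errors} "
      f"median={latencies[len(latencies) // 2]:.3f}s max={latencies[-1]:.3f}s")
```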
Step 5: Execute !
Key points to remember while executing performance tests:
1. Isolate the QA env where performance tests are run. Functional tests or any other concurrent use of this system will skew your results.
2. Restore system state after stress tests. The automated stress tests should include a step to restore the system to its default state, and that step should run regardless of whether your tests completed or crashed (see the sketch after this list). Of course, it would also be prudent to check the state before starting the tests.
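Here's a minimal sketch of that restore-always pattern; the state-check, stress-run, and restore steps are assumed to be project-specific scripts, and the script names below are placeholders:

```python
# Minimal sketch: always restore the system state, even if the stress run crashes.
# The three shell scripts invoked here are hypothetical placeholders.
import subprocess

def verify_clean_state():
    # Pre-check: fail fast if the environment is already dirty.
    subprocess.run(["./check_env_state.sh"], check=True)

def run_stress_tests():
    subprocess.run(["./run_stress.sh"], check=True)

def restore_default_state():
    # Cleanup: truncate test tables, clear queues, remove uploaded files, etc.
    subprocess.run(["./restore_env_state.sh"], check=True)

verify_clean_state()
try:
    run_stress_tests()
finally:
    restore_default_state()  # runs whether the tests passed, failed, or crashed
```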
Step 6: Analyze !
I am writing up detailed posts on this topic; meanwhile, an earlier post covers some of the basics as well.
Result analysis involves tracking the different values being measured and monitored and checking whether they satisfy the verification/pass criteria.
In case of inconsistencies, isolate the test/behavior that causes failures.
Identify the cause of the bottlenecks with profilers, stepping through code etc.
Resolve bottlenecks by tuning your settings, adding more memory, load balancing servers, indexing database tables, improving slow running queries etc.
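As a tiny illustration of checking measured values against pass criteria, here's a sketch that reads one response time per line from a results file and compares the average and 95th percentile against thresholds; the file name, format, and threshold values are assumptions, not figures from this post:

```python
# Minimal sketch: check measured response times against pass criteria.
# The results file (one latency in seconds per line) and the thresholds
# below are assumptions for illustration.
CRITERIA = {"avg_s": 2.0, "p95_s": 5.0}  # hypothetical pass/fail thresholds

with open("response_times.txt") as f:
    samples = sorted(float(line) for line in f if line.strip())

avg = sum(samples) / len(samples)
p95 = samples[int(0.95 * (len(samples) - 1))]

print(f"avg={avg:.2f}s (limit {CRITERIA['avg_s']}s), "
      f"p95={p95:.2f}s (limit {CRITERIA['p95_s']}s)")
if avg > CRITERIA["avg_s"] or p95 > CRITERIA["p95_s"]:
    raise SystemExit("FAIL: response times exceed the verification criteria")
print("PASS")
```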
Step 7: Report !
Performance test results are best reported in two formats: a basic summary that highlights what the stakeholders care to know or should know, and a detailed report for archival and comparison with regression runs.
The stakeholder summary could contain the maximum concurrency/load/throughput achieved and the average response times for the best- and worst-case concurrency levels.
The detailed report could contain the resource monitoring graphs, the output from the file-handle monitoring scripts, and the logs created by the automation tool.
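A minimal sketch of generating the stakeholder summary from a results dictionary; every value below is a placeholder, not data from a real run:

```python
# Minimal sketch: write the short stakeholder summary from a results dict.
# All numbers are placeholders for illustration.
summary = {
    "max concurrency": 250,
    "max throughput (req/s)": 480,
    "avg response time, best case (s)": 0.42,
    "avg response time, worst case (s)": 1.90,
}

with open("stakeholder_summary.txt", "w") as f:
    for metric, value in summary.items():
        f.write(f"{metric}: {value}\n")

# The detailed report would bundle the raw artifacts alongside this file:
# resource monitoring graphs, file-handle monitoring output, and tool logs.
```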