Performance Testing

Roobini Krishnandam
February 9, 2019

Performance tests that should be part of a performance test strategy -

1. Shakedown Test -

Performance test scripts, once created, should be executed in isolation as well as together, with a small number of virtual users (generally 10-15), with the following objectives (an illustrative sketch follows the list):

  • Proving that multiple virtual users can be run at the same time.
  • Flushing out any obvious issues that may be present.
  • Providing quick feedback to the project on the capability of the system.
  • Checking that the monitoring solution is in place and reporting its stats.
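For illustration only, here is a minimal shakedown-style sketch in plain Python (not from the original post): the endpoint URL, user count and iteration count are placeholder assumptions, and in practice this step would simply be the real JMeter or LoadRunner script run with 10-15 virtual users.

```python
import threading
import time
import urllib.request

BASE_URL = "https://app.example.com/health"  # placeholder endpoint, replace with a real transaction
VIRTUAL_USERS = 12                           # within the rough shakedown range of 10-15 users
ITERATIONS = 5

def virtual_user(user_id, results):
    """One virtual user looping over the scripted request a few times."""
    for i in range(ITERATIONS):
        start = time.time()
        try:
            with urllib.request.urlopen(BASE_URL, timeout=30) as resp:
                status = resp.status
        except Exception as exc:
            status = f"error: {exc}"
        results.append((user_id, i, status, round(time.time() - start, 3)))

results = []
threads = [threading.Thread(target=virtual_user, args=(u, results)) for u in range(VIRTUAL_USERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Quick feedback: any non-200 status or unusually slow sample is an obvious issue to flush out early.
for user_id, iteration, status, elapsed in results:
    print(f"user={user_id} iter={iteration} status={status} time={elapsed}s")
```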

2. Peak Load Test -

Generate load representative of an exceptionally busy day (busiest day of the year).

This test verifies that the system will meet its NFRs and remain stable under a peak load representative of a busy day. Generally the peak load test will be executed over a one-hour period, after all virtual users have been initialized and are running.

Monitoring would focus on transaction response times, CPU and memory stats, message bus queuing, server load average, and any exceptions in the application logs during the test. The DB can be cloned from Production, masking the confidential data, so that a similar size is achieved. DBA assistance would be needed for this task.
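As a rough illustration of checking peak load results against NFRs, the sketch below (my own assumption-laden example, not the author's tooling) reads a JMeter-style CSV results file with "label" and "elapsed" columns and compares a 90th-percentile response time per transaction against hypothetical thresholds.

```python
import csv
from collections import defaultdict

# Hypothetical NFR thresholds (ms) per transaction label; adjust to the real NFR document.
NFR_90TH_MS = {"Login": 2000, "Search": 3000, "Checkout": 5000}

def percentile(values, pct):
    """Simple nearest-rank percentile of a list of numbers."""
    ordered = sorted(values)
    k = max(0, int(round(pct / 100.0 * len(ordered))) - 1)
    return ordered[k]

def check_results(jtl_path):
    samples = defaultdict(list)
    with open(jtl_path, newline="") as fh:
        for row in csv.DictReader(fh):          # assumes CSV results with 'label' and 'elapsed' columns
            samples[row["label"]].append(int(row["elapsed"]))
    for label, elapsed in samples.items():
        p90 = percentile(elapsed, 90)
        limit = NFR_90TH_MS.get(label)
        verdict = "PASS" if limit is None or p90 <= limit else "FAIL"
        print(f"{label}: 90th percentile {p90} ms (limit {limit}) -> {verdict}")

check_results("peak_load_results.jtl")   # placeholder results file name
```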

3. Endurance Test -

The endurance test will be executed at peak load capacity for 12 hours overnight, or even over the weekend for 2.5 days. It verifies that the system will meet its NFRs and remain stable under the typical load for an extended period of time. Monitoring will focus on system resource usage (memory leaks, unreleased socket and database connections). Front-end response times, along with back-end statistics like CPU and memory, would be part of the results of this performance test. The DB can be cloned from Production, masking the confidential data, so that a similar size is achieved. DBA assistance would be needed for this task.
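One way to make the memory-leak check concrete is to fit a straight line through memory samples taken over the run and flag a steady upward slope. The sketch below uses synthetic samples and an arbitrary threshold, both assumptions for illustration only.

```python
def leak_suspect(memory_samples_mb, threshold_mb_per_hour=50):
    """Fit a line through evenly spaced memory samples (one per minute)
    and flag a possible leak if usage keeps climbing past the threshold."""
    n = len(memory_samples_mb)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(memory_samples_mb) / n
    slope_per_minute = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, memory_samples_mb)) / \
                       sum((x - mean_x) ** 2 for x in xs)
    slope_per_hour = slope_per_minute * 60
    return slope_per_hour, slope_per_hour > threshold_mb_per_hour

# Example: heap usage sampled once a minute during a 12-hour overnight run (synthetic, steadily growing).
samples = [1200 + 0.9 * minute for minute in range(720)]
slope, suspect = leak_suspect(samples)
print(f"growth ~{slope:.1f} MB/hour, leak suspect: {suspect}")
```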

4. Incremental Load Test with increased size of DB -

The peak load test workload would be executed again in this test, but this time the DB would be populated with a large number of additional records. The purpose of this test is to find the impact of an increase in DB size on the front-end GUI response times and on the back-end statistics. The DB can be cloned from Production, masking the confidential data, so that a similar size is achieved. To add more data, new data can be prepared, or the same Production data can be loaded in again. DBA assistance would be needed for this task. If the peak load test was done with a DB of size x, then 1.5x could be tested to start with.
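Purely as an illustration of growing a cloned, masked DB towards 1.5x of its original size, here is a small Python/SQLite sketch; the table name, columns and row counts are made up, and on a real Production clone this would be equivalent bulk SQL run with DBA assistance.

```python
import sqlite3

# Illustrative only: a tiny in-memory stand-in for the cloned, masked Production DB.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders (customer, amount) VALUES (?, ?)",
                 [(f"masked_customer_{i}", i * 10.0) for i in range(1000)])

def grow_table(conn, table, factor):
    """Re-insert a share of the existing rows so the table ends up roughly factor times its size."""
    (count,) = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
    extra = int(count * (factor - 1))
    conn.execute(f"INSERT INTO {table} (customer, amount) "
                 f"SELECT customer, amount FROM {table} LIMIT {extra}")
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

print("rows after growth:", grow_table(conn, "orders", 1.5))   # ~1.5x, as suggested above
```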

5. Breakpoint/Scalability Test -

The purpose of this test is to find the breakpoint of the system. In this test we would increase the workload by 20% every 30 minutes and find the point where the application becomes unstable. The monitoring tool plays an important role here in identifying the point where system stats degrade or the application becomes unstable or unresponsive.
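The stepping pattern (start at peak load, then +20% every 30 minutes) can be planned up front. The sketch below just computes that schedule; the starting load of 200 virtual users is a made-up example.

```python
def breakpoint_schedule(peak_users, step_pct=20, step_minutes=30, max_steps=10):
    """Work out the stepped user counts for a breakpoint test:
    start at peak load and grow the workload by step_pct every step_minutes."""
    schedule = []
    users = peak_users
    for step in range(max_steps):
        schedule.append((step * step_minutes, round(users)))
        users *= 1 + step_pct / 100.0
    return schedule

# Example: a hypothetical peak load of 200 virtual users.
for minute, users in breakpoint_schedule(200):
    print(f"t+{minute:3d} min -> {users} virtual users")
```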

6. Network Capacity Test -

The purpose of this test is to find out how the application behaves on a 3G network. The necessary changes have to be made on the performance testing tool side for this test. The response times noted at the tool side for all the transaction points would be part of the performance report. In JMeter this can be achieved by throttling the network bandwidth in jmeter.properties or user.properties, whereas in LoadRunner it can be done under the Runtime Settings option.
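As an example of the JMeter side of this, the sketch below converts an assumed 3G-class bandwidth of roughly 384 kbit/s into the characters-per-second value that JMeter's bandwidth-throttling properties expect. The httpclient.socket.*.cps property names are the commonly used throttling settings, but verify them against your JMeter version.

```python
def cps_for_bandwidth(kbits_per_second):
    """JMeter throttles per-connection bandwidth in characters (bytes) per second,
    so convert a target bandwidth in kbit/s to a cps value."""
    return int(kbits_per_second * 1024 / 8)

# Roughly 3G-class bandwidth (assumed here as ~384 kbit/s); adjust to the profile you need.
cps = cps_for_bandwidth(384)
print("# append to user.properties (or jmeter.properties):")
print(f"httpclient.socket.http.cps={cps}")
print(f"httpclient.socket.https.cps={cps}")
```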

7. Browser Compatibility Test -

This is a client-side test. Normally we do performance testing at the server side and share the results, which are often measured on 4G network bandwidth. But in real life, end users complain that the application is slow, as it may behave differently in different browsers such as Chrome, Firefox and Internet Explorer. In this test we measure response times at the browser level (Chrome, Firefox or IE) and report them.

In JMeter, we can do this by integrating the WebDriver Sampler, available with Chrome, Firefox and IE drivers, and in LoadRunner we can achieve the same via the TruClient protocol. This test is only indicative of the end-user experience, so it may be executed with a lower load if there is a hardware constraint.
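For a browser-level measurement outside JMeter or LoadRunner, a hedged Selenium sketch is shown below. It assumes the selenium package and the Chrome/Firefox drivers are installed, and the URL is a placeholder.

```python
from selenium import webdriver   # requires the selenium package plus browser drivers on PATH

URL = "https://app.example.com/"   # placeholder; use the page under test

def timed_page_load(driver_factory, name):
    """Load the page in a real browser and read the Navigation Timing numbers it recorded."""
    driver = driver_factory()
    try:
        driver.get(URL)
        load_ms = driver.execute_script(
            "return window.performance.timing.loadEventEnd - window.performance.timing.navigationStart;")
        print(f"{name}: page load {load_ms} ms")
    finally:
        driver.quit()

# Compare the same page across browsers; add Edge/IE in the same way if drivers are available.
timed_page_load(webdriver.Chrome, "Chrome")
timed_page_load(webdriver.Firefox, "Firefox")
```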

1 comment

Greg Elofson May 21, 2019

Are you using Gatling.io for any of these tests? If so, would you have any sample scripts you can share?
