Goal
In this post I will explain how to measure the performance of a web service endpoint with jMeter. We are interested in how the server behaves under increasing load. To make things a bit more complicated, we will first log in with each of our test users and then request the web service endpoint in question.
Tools
jMeter
Our main tool is jMeter, a Java application designed to load test functional behavior and measure performance. It was originally designed for testing web applications but has since expanded to other test functions.
download link: http://jmeter.apache.org/download_jmeter.cgi
jMeter plugins
We will use additional components provided in the jMeter Plugins Standard Set. We will also rely on the PerfMon Server Agent to monitor server-side metrics during the performance test.
download link: http://jmeter-plugins.org/downloads/all/
plugin installation manual: http://jmeter-plugins.org/wiki/PluginInstall/
PerfMon Server Agent manual: http://jmeter-plugins.org/wiki/PerfMonAgent/
Metrics
- request load: We will define the load by the number of active users who are making requests to the service endpoint. (We can also define the load by requests per unit of time; I will talk about this later.) To measure performance at different load levels, we will use a staircase-like load.
- response time: This is the main metric we measure during the performance test: the time between sending the request and receiving the response. As the server gets busier, response times will grow.
- error percentage: When the server reaches a critical load, it will reply with an error instead of the regular response. This will help us identify the maximum load the server is capable of serving without any errors. (Usually perceived performance is already bad at this point.)
- server-side metrics:
  - CPU usage: This gives us an idea of how busy the server is with our requests.
  - Memory consumption: This might reveal memory leaks, especially if we are running long performance tests (soak/endurance tests).
Implementation
Test Plan
This is the top-level component which contains the Thread Groups (scenarios).
HTTP Request Defaults
to add: right click on Test Plan >> Add >> Config Element >> HTTP Request Defaults
This defines the default server properties that all requests go to. If you are changing the server, change the values here.
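For example, pointing at a hypothetical test server (the host and port below are placeholder values):
Server Name or IP: someserver.com
Port Number: 8080
Protocol: http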
Thread Groups
This defines the staircase-like load on the server, expressed as the number of concurrent users.
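A regular Thread Group only ramps up once; one way to get the staircase shape is the jp@gc Stepping Thread Group from the plugins set. A possible configuration, with example numbers and labels paraphrasing the plugin's UI:
This group will start: 50 threads
First, wait for: 0 seconds
Then start: 10 threads
Next, add: 10 threads every 30 seconds, using a ramp-up of 5 seconds
Then hold load for: 60 seconds
Finally, stop: 5 threads every 1 second
This yields steps of 10, 20, 30, 40 and 50 concurrent users, exactly the staircase we want to plot against response times.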
HTTP Cookie Manager (if you want logged-in users)
to add: right click on Thread Group >> Add >> Config Element >> HTTP Cookie Manager
This component stores the JSESSIONID cookies during the run. You do not have to set up anything, just have it in the Thread Group.
Once Only Controller
to add: right click on Thread Group >> Add >> Logic Controller >> Once Only Controller
We will need this to make sure we perform the login request only once per test user.
HTTP Request
to add: right click on element >> Add >> Sampler >> HTTP Request
This sends the requests to the server. In our scenario we will need two: one for the login request (create it under the Once Only Controller) and one for the actual service request that we are measuring. Define the path the requests should go to, relative to the server set in HTTP Request Defaults. You can also add parameters if needed (for the login request).
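A minimal sketch of the two samplers; the paths and parameter names below are hypothetical and depend on how your application's login works. ${__threadNum} is a jMeter function that returns the current thread's number, handy for giving each virtual user its own account:
Login request (under the Once Only Controller):
  Method: POST
  Path: /login
  Parameters: username = testuser${__threadNum}, password = secret
Service request (directly in the Thread Group):
  Method: GET
  Path: /api/orders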
Listeners
to add: right click on element >> Add >> Listener >> "name of listener"
These components gather and visualise the results during the performance tests.
Aggregate Report
Gathers info about all the requests sent in the Thread Group in a table. Check out the Min, Max, Average and Error values.
jp@gc Response Times Over Time
This graph plots response times over the course of the test, so you can see how they react to each load step.
jp@gc Active Threads Over Time
This should show the same staircase-like figure as the Thread Group. I have this graph mainly so that I can superimpose it on the Response Times Over Time graph.
Response Time Graph
This is jMeter's built-in chart of response times per labelled request.
jp@gc Composite Graph
This lets you combine several of the jp@gc graphs (for example Active Threads Over Time and Response Times Over Time) into a single chart, which makes the correlation between load and response time easy to see.
View Results Tree
Shows every single request and response in detail. It is great for debugging the test plan (for example, to verify that the login really works), but disable it during the actual measurement because it consumes a lot of memory.
PerfMon Metrics Collector
By installing the jMeter Server Agent on the backend (docs: http://jmeter-plugins.org/wiki/PerfMonAgent/), you can inspect the CPU and memory consumption of the backend during the performance test. Use the PerfMon Metrics Collector listener to gather the data; you can also put this data on a composite graph together with the load and response times.
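Starting the agent on the backend looks roughly like this, assuming you unpacked it to /opt/serveragent (4444 is the agent's default TCP port, which is also what the PerfMon Metrics Collector connects to):
cd /opt/serveragent
./startAgent.sh --tcp-port 4444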
Adding parameters
We might want to run the test with different settings, so it would be nice to parameterize some of them. In this example, we make the host name and port parameters.
to add: right click on Test Plan >> Add >> Config Element >> User Defined Variables
We can use the __P(parameter_name,default_value) function. It is always wise to set a default value, so you can save configuration time later on.
Let's use the parameters in the HTTP Request Defaults element.
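One possible setup, sketched as text (host and port are simply the variable names I chose):
User Defined Variables:
  host = ${__P(host,localhost)}
  port = ${__P(port,8080)}
HTTP Request Defaults:
  Server Name or IP: ${host}
  Port Number: ${port}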
Running from command line
You can run the jMeter test from the command line. In order to extract the measurements, we can specify which data should be saved and how.
In the Response Times Over Time listener, specify an output filename, for example measurement.jtl. We will use XML format, because it can be consumed by a Jenkins plugin. You can either select which data is saved by clicking the Configure button, or you can specify the defaults in the jmeter.properties file found in the jmeter/bin folder. Set the following properties:
# write results as XML (the Jenkins Performance plugin can consume this)
jmeter.save.saveservice.output_format=xml
# fields to save for each sample
jmeter.save.saveservice.data_type=true
jmeter.save.saveservice.label=true
jmeter.save.saveservice.response_code=true
jmeter.save.saveservice.successful=true
jmeter.save.saveservice.thread_name=true
Here is an example that runs the jMeter test in non-GUI mode (-n), loads the test plan (-t), sets the host parameter (-J) and writes the XML output to a file (-l):
jmeter -n -t path_to_jmeter_project/Backend_Performance.jmx -J host=other.someserver.com -l measurement.xml
Integration with Jenkins
We will need the Jenkins Performance plugin, so install it on your Jenkins: https://wiki.jenkins-ci.org/display/JENKINS/Performance+Plugin
Create a new Jenkins job. Add an Execute Shell build step with the shell command we just discussed. Add a Publish Performance test result report post-build step to set up the reporting. You can set error thresholds if you like; under heavy load the server will probably start replying with errors.
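The Execute Shell step could look roughly like this ($WORKSPACE is provided by Jenkins; the jMeter path and host are placeholders):
/opt/jmeter/bin/jmeter -n \
  -t $WORKSPACE/Backend_Performance.jmx \
  -J host=staging.someserver.com \
  -l $WORKSPACE/measurement.xml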
Now you can run the Jenkins job to execute the test.
You can drill down into the results by clicking Performance Trend >> Last report >> Response time trends.
Links to read
- Wikipedia on software performance testing: http://en.wikipedia.org/wiki/Software_performance_testing
- Performance Testing Guidance for Web Applications: https://msdn.microsoft.com/en-us/library/bb924376.aspx
- jMeter component reference: http://jmeter.apache.org/usermanual/component_reference.html
- jMeter plugins documentation: http://jmeter-plugins.org/wiki/Start/