
Sunday, June 7, 2015

GenyControl: Calabash Android + Genymotion with Jenkins integration

Genymotion

Genymotion is a widely used Android emulator and an alternative to the vanilla emulator bundled with the Android SDK. It runs virtual devices on Oracle VM VirtualBox, which makes them much faster.

Calabash

Calabash is my choice of automation tool for iOS and Android devices.

GenyControl

GenyControl is a collection of shell scripts I wrote to control Genymotion devices and run Calabash tests. Its purpose is to make sure that only the requested Genymotion device is running, and to wait until the emulator has properly started and booted before the tests are run. In this article I will explain how it works and how to use it. Feel free to clone it from GitHub:
https://github.com/madarasz/GenyControl

It is composed of two files: control_genymotion.sh and run_test.sh.

control_genymotion.sh

control_genymotion.sh defines functions to start and stop emulators and wait until the requested emulators are functional. The most important functions:

  • stop_all_genymotion()
    Stops all running Genymotion devices. It sends the poweroff signal to the VMs and kills any remaining processes, which is a safe way to stop these virtual devices.
  • get_genymotions_running()
    Starts the requested Genymotion device(s) and waits until they are operational. You need to pass the name of the Genymotion device you wish to use. First it waits until the device appears in the adb devices list, then it waits until the "Android" boot logo disappears, which signifies that booting is done and the device is operational. (See the sketch after this list.)
  • get_all_genymotion_names()
    Lists the names of all available Genymotion devices.
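
The functions boil down to two ideas: powering off the VirtualBox VMs behind the emulators and polling adb until boot finishes. Here is a minimal sketch of both, not the actual GenyControl code; using VBoxManage and the sys.boot_completed property are common approaches that I am assuming here:

    # Sketch: power off all running VirtualBox VMs (Genymotion devices are VMs)
    stop_all_vms() {
        VBoxManage list runningvms | cut -d'"' -f2 | while read -r vm; do
            VBoxManage controlvm "$vm" poweroff
        done
    }

    # Sketch: wait until a device is listed by adb and has finished booting
    wait_for_device() {
        until adb devices | grep -qw "device"; do sleep 2; done
        until [ "$(adb shell getprop sys.boot_completed | tr -d '\r')" = "1" ]; do
            sleep 2
        done
    }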

run_test.sh

run_test.sh prepares the Genymotion device (using the functions of control_genymotion.sh) and runs the Calabash tests. In order to make it work, you need to set the following environment variables (an example setup is shown after the list):
  • $DEVICE: device name of the requested Genymotion device
  • $PNAME: package name of the application to be tested (e.g. com.madarasz.exampleapp)
  • $APK_PATH: path and filename of the apk to be tested (e.g. build/apk/example.apk)
  • $MORE_PARAMS: additional parameters to be used in the Calabash run command (e.g. --tags @smoke)
  • $COLORS: set it to "yes" if you want ANSI color codes in the output (making it prettier if your terminal supports them)
  • add the directory of adb (Android Debug Bridge) and player (the Genymotion VM player; default directory on Mac: /Applications/Genymotion.app/Contents/MacOS) to $PATH
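
For example, a local run could be set up like this (the $DEVICE value is an illustrative device name, the other values match the examples above):

    export DEVICE="Google Nexus 5 - 5.1.0 - API 22"  # illustrative name
    export PNAME="com.madarasz.exampleapp"
    export APK_PATH="build/apk/example.apk"
    export MORE_PARAMS="--tags @smoke"
    export COLORS="yes"
    # player lives here on Mac; adb is assumed to be on $PATH already
    export PATH="$PATH:/Applications/Genymotion.app/Contents/MacOS"
    source run_test.sh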

Jenkins integration - Let's put it all together

Plugins

For this example, I use the Jenkins plugins referenced in the steps below: AnsiColor (for the Color ANSI Console Output option) and the Cucumber report plugins used in the Post-build Actions.

Setup

The Jenkins integration goes the following way:
  1. Put control_genymotion.sh and run_test.sh into your project folder.
  2. Create a new job as a Freestyle project.
  3. Configure Source Code Management so that Jenkins checks out your project from SVN or Git. (Alternatively, set Use custom workspace in Advanced Project Options to use a local folder for the project files.)
  4. Set environment variables and enable Color ANSI Console Output in the Build Environment section.
  5. Add an Execute shell build step like so:

    set +x
    source run_test.sh


    (I use "set +x" to avoid echoing every bash command in the Console Output)
  6. Configure reporting in Post-build Actions by adding both Cucumber reports.

Results

If you have done everything right, you should have:
  • nice Cucumber reports
  • colorful logs in the Console Output
  • a Test Result Trend graph


Sunday, April 19, 2015

Performance testing with jMeter + Jenkins integration

Goal

In this post I will explain how to measure the performance of a web service endpoint with jMeter. We are interested in the behavior of the server under increasing load. To make things a bit more complicated, we first have to log in with each of our test users and then request the web service endpoint in question.

Tools

jMeter

Our main tool is jMeter, which is a Java application designed to load test functional behavior and measure performance. It was originally designed for testing Web Applications but has since expanded to other test functions.

download link: http://jmeter.apache.org/download_jmeter.cgi

jMeter plugins

We will use additional components provided in the jMeter Plugins Standard Set. We will also rely on the PerfMon Server Agent to monitor server side metrics during the performance test.

download link: http://jmeter-plugins.org/downloads/all/
plugin installation manual: http://jmeter-plugins.org/wiki/PluginInstall/
PerfMon Server Agent manual: http://jmeter-plugins.org/wiki/PerfMonAgent/

Metrics

  • request load: We define the load by the number of active users making requests to the service endpoint. (We can also define load by requests per unit of time; I will talk about this later.) To measure performance at different load levels, we will use a staircase-like load.
  • response time: This is the main metric that we measure during the performance test: the time between sending the request and receiving the response. As the server gets busier, response times will increase.
  • error percentage: When the server reaches a critical load, it replies with errors instead of the regular response. This helps us identify the maximum load the server can handle without any errors. (Perceived performance is usually already bad at this point.)
  • server side
    • CPU usage: This gives us an idea how busy the server is with our requests.
    • Memory consumption: This might reveal memory leaks, especially if we are running long performance tests (soak/endurance tests).

Implementation

Test Plan

This is the top level component which contains the Thread groups (scenarios).

HTTP Request Defaults

to add: right click on Test Plan >> Add >> Config Element >> HTTP Request Defaults
This defines the default server properties that all requests go to. If you change servers, change the values here.

Thread Groups

to add: right click on Test Plan >> Add >> Thread Group >> jp@gc Stepping Thread Group
This defines the staircase-like load on the server, expressed as the number of active users.
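
As an illustration, the staircase could be configured like this (all numbers are made up for this example):

    This group will start 0 threads.
    Then add 10 threads every 60 seconds, using a 10 second ramp-up.
    Stop adding threads at 100 threads.
    Hold the full load for 300 seconds.
    Finally, stop all threads at once.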

HTTP Cookie Manager (if you want logged in users)

to add: right click on Thread Group >> Add >> Config Element >> HTTP Cookie Manager
This component stores the JSESSIONID cookies during the run. You do not have to set anything up, just have it in the Thread Group.

Once Only Controller

to add: right click on Thread Group >> Add >> Logic Controller >> Once Only Controller
We will need this to make sure we perform the login request only once with each test user.

HTTP Request

to add: right click on element >> Add >> Sampler >> HTTP Request
This sends the requests to the server. In our scenario, we need two: one for the login request, created under the Once Only Controller, and one for the actual service request that we are measuring. Define the path the requests should go to, relative to the server already set in HTTP Request Defaults. You can also add parameters if needed (for the login request).
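
As an illustration (the paths and parameter names here are hypothetical, not from a real application), the two samplers could be configured like this:

    # login request, placed under the Once Only Controller
    Method: POST   Path: /login   Parameters: username=testuser, password=secret

    # measured request, placed directly in the Thread Group
    Method: GET    Path: /api/orders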

Listeners

to add: right click on element >> Add >> Listener >> "name of listener"
These components gather and visualise the results during the performance tests.

Aggregate Report

Gathers info about all the requests sent in the Thread Group in table form. Check out the Min, Max, Average and Error values.

jp@gc Response Times Over Time

This provides our main measurement metric.

jp@gc Active Threads Over Time

This should show the same staircase-like figure as the Thread Group. I have this graph mainly so that it can be superimposed on the Response Times Over Time graph.

Response Time Graph

This is a smoother version of the Response Times Over Time graph.

jp@gc Composite Graph

You can superimpose different graphs into one. Use it to combine the Response Times Over Time and Active Threads Over Time data.

View Results Tree

Use this listener when you want to inspect the actual requests and responses. Use it mainly during the development of the performance tests, so you can validate the requests. In this example, I use it to determine the success of the login request.

PerfMon Metrics Collector

By installing the jMeter Server Agent on the backend (docs: http://jmeter-plugins.org/wiki/PerfMonAgent/), you can inspect the CPU and memory consumption of the backend during the performance test. Use the PerfMon Metrics Collector listener to gather the data. You can also put the data on a composite graph.
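
On the backend, starting the agent typically looks like this (a sketch; the archive name depends on the agent version you download):

    # unpack the Server Agent and start it
    # (it listens on TCP and UDP port 4444 by default)
    unzip ServerAgent-2.2.1.zip -d serveragent
    cd serveragent && ./startAgent.sh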


Adding parameters

We might want to run the test with different settings, so it is handy to parameterize some of them. In this example, we set the host name and port as parameters.

to add: right click on Test Plan >> Add >> Config Element >> User Defined Variables

We can use the __P(parameter_name,default_value) function. It is always wise to set a default value, so you can save configuration time later on.
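
For example, the User Defined Variables could contain the following (the port parameter and both defaults are illustrative; host matches the command line example later in this post):

    host = ${__P(host,localhost)}
    port = ${__P(port,8080)}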


Let's use the parameters in the HTTP Request Defaults element: set Server Name or IP to ${host} and Port Number to ${port}.

Running from command line

You can run the jMeter test from command line. In order to extract the measurements, we can specify which data should be saved and how.

In the Response Times Over Time listener, specify an output filename, for example: measurement.jtl. We will use XML format, because it can be consumed by a Jenkins plugin. You can either select the data to be saved by clicking the Configure button, or specify the defaults in the jmeter.properties file found in the jmeter/bin folder. Set the following properties:
jmeter.save.saveservice.output_format=xml
jmeter.save.saveservice.data_type=true
jmeter.save.saveservice.label=true
jmeter.save.saveservice.response_code=true
jmeter.save.saveservice.successful=true
jmeter.save.saveservice.thread_name=true
Here is an example running the jMeter test in non-GUI mode (-n) with our test plan (-t), setting the host parameter (-J) and writing the XML output to a file (-l):
jmeter -n -t path_to_jmeter_project/Backend_Performance.jmx -J host=other.someserver.com -l measurement.xml

Integration with Jenkins

We will need the Jenkins Performance plugin; install it on your Jenkins: https://wiki.jenkins-ci.org/display/JENKINS/Performance+Plugin

Create a new Jenkins job. Add an Execute shell build step with the shell command we just discussed (see the sketch below). Add a Publish Performance test result report post-build step to set up the reporting. You can set error thresholds if you would like to; under big load, the server will probably start replying with errors.
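
As a sketch, the Execute shell step could pass a Jenkins build parameter through to jMeter (TARGET_HOST is a hypothetical parameter of the job):

    # Execute shell build step (sketch)
    jmeter -n -t path_to_jmeter_project/Backend_Performance.jmx -J host=${TARGET_HOST} -l measurement.xml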


Now you can run the Jenkins job to execute the test.



You can drill down into the results by clicking Performance Trend >> Last report >> Response time trends.

