
targets-io

Performance test dashboard

Dashboard app for organizing, analyzing, benchmarking, and reporting on the results of performance tests executed with the Gatling tool, JMeter, or LoadRunner (see LR2Graphite and LRLauncher). The load-related metrics are stored in Graphite, together with, for instance, resource usage metrics of the application under test. Any metric stored in Graphite can be benchmarked between test runs to provide automated assertions on an application's performance when tests run from a continuous integration pipeline.

Automated test result analysis and benchmarking

The targets-io dashboard can automatically benchmark test results, saving you from manually analysing every continuous performance test run. The dashboard performs the following checks:

  • Requirements: check whether the results meet the requirements set for a metric.
  • Benchmark against the previous test run: check whether metric deviations, compared to the last test run, stay within the configured allowed deviation. This benchmark gives you immediate feedback on your latest code commits.
  • Benchmark against a fixed baseline: check whether metric deviations, compared to a fixed baseline (for instance, your current live version), stay within the configured allowed deviation.

The requirements / benchmark thresholds can be set on any of the metrics you have configured in your dashboard. The consolidated test run results are exposed via a REST API and can be used to pass or fail your build in your CI server.
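For example, a CI step could fetch the consolidated result and fail the build when the benchmark did not pass. The sketch below is illustrative only: the endpoint path and the `meetsRequirement` field are assumed names, not the documented targets-io API, so check the actual REST routes before using it.

```shell
# Hypothetical CI gate: fail the build when the consolidated benchmark
# result does not pass. The endpoint URL and the "meetsRequirement" field
# are assumptions for illustration -- look up the real targets-io routes.
check_benchmark() {
  # $1: JSON payload assumed to contain a boolean "meetsRequirement" field
  result=$(echo "$1" | grep -o '"meetsRequirement":[a-z]*' | cut -d: -f2)
  [ "$result" = "true" ]
}

# In a real pipeline, fetch the payload from the REST API first, e.g.:
# payload=$(curl -s "http://localhost/api/testruns/MEAN/GATLING-NIGHTLY/latest")
payload='{"testRunId":"demo-1","meetsRequirement":true}'
if check_benchmark "$payload"; then
  echo "benchmark passed"
else
  echo "benchmark failed" >&2
  exit 1
fi
```

A CI server such as Jenkins treats a non-zero exit code as a failed build step, which is exactly the pass/fail behaviour described above.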

Demo

To set up a local demo environment, follow these steps (for Ubuntu Linux):

  • Install Docker
  • Install Docker Compose
  • Clone this repository: git clone https://github.com/dmoll1974/targets-io.git
  • Change directory into targets-io: cd targets-io
  • Run init script to prepare Graphite volumes on host: sudo ./init-graphite-container-volumes.sh
  • Run start up script: sudo ./set-ip-docker-compose-up.sh localhost # set host ip here if running on server

or

Another approach is to use Vagrant and VirtualBox to create a virtual machine, which automates the environment setup completely. Use the following steps:

  • Install Vagrant and VirtualBox
  • Install the vagrant-docker-compose plugin using the command line: vagrant plugin install vagrant-docker-compose
  • Use the Vagrantfile from this repository to generate a box: place the file in a directory of your choosing and run vagrant up from that directory.

Shortcut using curl: curl -O https://raw.githubusercontent.com/dmoll1974/targets-io/master/Vagrantfile && vagrant up

The end result is 9 running Docker containers:

| Container     | Description                                          | Exposed port |
|---------------|------------------------------------------------------|--------------|
| targets-io    | Performance dashboard application                    | 80           |
| mongodb       | Database that stores dashboard configurations        | 27017        |
| graphite      | Time series database                                 | 8070         |
| jenkins       | CI server to start the demo scripts                  | 8080         |
| mean          | Demo application to run performance tests against    | 3001         |
| redis         | Used for caching calls to Graphite                   | -            |
| logstash      | Used for parsing Gatling logs                        | -            |
| graylog       | Used for browsing targets-io, demo app, and Gatling logs | 8090     |
| elasticsearch | Used by Graylog                                      | -            |
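Once the stack is up, a quick way to smoke-test it is to probe each exposed port from the table above. This is a minimal sketch that assumes the stack runs on localhost; substitute your host IP if you started it on a server. It uses bash's built-in /dev/tcp, so a closed or unreachable port simply prints DOWN.

```shell
# Probe the demo stack's exposed ports (see the container table above).
# Uses bash's /dev/tcp redirection; nc -z would work as well.
host=localhost
check_port() {  # usage: check_port <container> <port>
  if (echo > "/dev/tcp/$host/$2") 2>/dev/null; then
    echo "$1 (port $2): up"
  else
    echo "$1 (port $2): DOWN"
  fi
}
check_port targets-io 80
check_port mongodb 27017
check_port graphite 8070
check_port jenkins 8080
check_port mean 3001
check_port graylog 8090
```

If any container reports DOWN, inspect it with docker-compose logs before continuing with the demo.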

Open the targets-io performance dashboard via

http://localhost

First, restore the pre-configured demo dashboard configurations via the menu in the top right of the screen.

Select the configuration file from the repo (targets-io/demo/targets-io-demo.json) and click "upload me". After reloading the page you should see one "Product": "MEAN".

To start one of the demo scripts, open the Jenkins console:

http://localhost:8080

Log in using the credentials admin/targets-io

Assertions demo

To see a demo of the automated assertion of benchmark results for a test run, start the TARGETS-IO-GATLING-DEMO job (click "Build now"). This triggers the Gatling demo script.

  • After the first run has finished, go to http://localhost/#!/browse/MEAN/GATLING-NIGHTLY/ to check the results.
  • Rerun the TARGETS-IO-GATLING-DEMO job.
  • When this build passes, all your configured requirements / benchmark thresholds (see the explanation above) have passed for this run. If the job fails, check the job logs to find out why and examine http://localhost/#!/browse/MEAN/GATLING-NIGHTLY/ to investigate. You can drill down into the consolidated results by clicking the passed/failed icons.

Graylog integration demo

In the demo environment the Gatling logs are parsed by Logstash and sent to Graylog (and Graphite). The demo application also sends its logs to Graylog, so you can correlate errors in Gatling with errors logged in the application. To do this, you first have to manually enable an input in Graylog:

  • Log into Graylog with credentials admin/admin
  • Select System - Inputs from the menu
  • Select GELF UDP from the dropdown list and click "Launch new input"
  • Provide a name, select the node and launch the input.
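To verify the new input is receiving messages, you can send it a hand-rolled GELF test message. The sketch below assumes the GELF UDP default port 12201; use whatever port you chose when launching the input. It uses bash's /dev/udp, and nc -u works equally well.

```shell
# Send a minimal GELF 1.1 test message to the new Graylog UDP input.
# Port 12201 is the GELF UDP default -- adjust if you picked another port.
# Note: "facility" is a deprecated GELF 1.1 field, included here only so
# the message matches the facility-based search query used in the demo.
host=localhost
port=12201
msg='{"version":"1.1","host":"demo-test","short_message":"targets-io GELF smoke test","facility":"MEAN"}'
# /dev/udp is a bash feature; 'echo -n "$msg" | nc -u -w1 $host $port' also works.
if echo -n "$msg" > "/dev/udp/$host/$port" 2>/dev/null; then
  echo "sent: $msg"
else
  echo "could not send -- is the input running on $host:$port?" >&2
fi
```

After sending, the message should show up in the Graylog search within a few seconds.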

To correlate Gatling and application errors use the following search query:

type:gatling_log AND facility:MEAN

You can also drill down from targets-io to Graylog via the Gatling errors tab in the graphs view for a test run.

JMeter integration demo

The TARGETS-IO-JMETER-DEMO job triggers the JMeter demo script to demonstrate how to integrate a JMeter script into the framework.

LoadRunner integration demo

The TARGETS-IO-LOADRUNNER-DEMO job triggers the LoadRunner TruClient demo script to demonstrate how to integrate a LoadRunner script into the framework. This requires a Windows Jenkins agent to be connected as "LOADRUNNER-SLAVE". Please refer to the LR2Graphite documentation on how to set up this machine.

Update your demo environment

The targets-io image on Docker Hub is updated frequently. To deploy the latest version in your demo environment, run the following from the targets-io directory:

  • sudo docker-compose stop targetsio
  • sudo docker-compose pull targetsio
  • sudo docker-compose up -d

Documentation

Wiki (in progress)

Libraries / Dependencies

License

GNU GPL