folio-perf-test - Jenkins pipeline to test FOLIO performance

Copyright (C) 2018-2020 The Open Library Foundation

This software is distributed under the terms of the Apache License, Version 2.0. See the file "LICENSE" for more information.

System requirements

  • Jenkins on Linux with default plugins installed

  • Extra Jenkins plugins: Pipeline Utility Steps, HTTP Request Plugin, SSH Agent Plugin, CloudBees AWS Credentials Plugin, Performance

  • aws-cli and JMeter installed on the Jenkins host

  • Jenkins access to AWS
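
The aws-cli and JMeter installation can be scripted. Below is a minimal sketch for a Linux Jenkins host; the JMeter version and install paths are assumptions, so pin whatever your environment actually needs:

```shell
#!/bin/sh
# Sketch: install aws-cli v2 and JMeter on a Linux Jenkins host.
# JMETER_VERSION and the /opt install path are assumptions, not repo values.
JMETER_VERSION=5.4.1
JMETER_URL="https://archive.apache.org/dist/jmeter/binaries/apache-jmeter-${JMETER_VERSION}.tgz"

install_tools() {
  # aws-cli v2: official bundled installer
  curl -sSL "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o awscliv2.zip
  unzip -q awscliv2.zip
  sudo ./aws/install

  # JMeter: unpack under /opt and link the launcher onto PATH
  curl -sSL "$JMETER_URL" | sudo tar xz -C /opt
  sudo ln -sf "/opt/apache-jmeter-${JMETER_VERSION}/bin/jmeter" /usr/local/bin/jmeter
}

# Guarded so the script can be reviewed or sourced without side effects.
[ "${1:-}" = "--install" ] && install_tools
echo "install script ready for JMeter ${JMETER_VERSION}"
```

Run with `--install` to actually download and install; without the flag it only defines the function.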

Test Strategies

Metrics:

This is a list of metrics that are gathered during this experiment:

  • Average response times (ART) for each transaction
  • Min and Max response times
  • Median
  • Failure rate and errors/warnings in the logs
  • Throughput
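
As an illustration of what these numbers mean, here is a minimal shell sketch that derives them by hand from an XML-format .jtl results file (sample data is inlined; `t` is JMeter's standard elapsed-time attribute in milliseconds and `s` its success flag):

```shell
#!/bin/sh
# Sketch: derive ART, min, max, median, and failure rate from an XML-format
# .jtl file. The sample data below is inlined purely for illustration.
cat > sample.jtl <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<testResults version="1.2">
<httpSample t="120" s="true" lb="GET /instance-storage/instances"/>
<httpSample t="340" s="true" lb="GET /instance-storage/instances"/>
<httpSample t="95" s="false" lb="GET /instance-storage/instances"/>
<httpSample t="210" s="true" lb="GET /instance-storage/instances"/>
</testResults>
EOF

# Pull out each elapsed time (simplified extraction; assumes only the t=
# attribute carries bare digits in this form) and sort numerically.
grep -o 't="[0-9][0-9]*"' sample.jtl | grep -o '[0-9][0-9]*' | sort -n > times.txt

awk '
  { sum += $1; n++; v[n] = $1; max = $1 }
  NR == 1 { min = $1 }
  END {
    median = (n % 2) ? v[(n + 1) / 2] : (v[n / 2] + v[n / 2 + 1]) / 2
    printf "samples=%d avg=%.1f min=%d max=%d median=%.1f\n", n, sum / n, min, max, median
  }
' times.txt

# Failure rate: samples whose s attribute is "false".
fails=$(grep -c 's="false"' sample.jtl)
total=$(grep -c '<httpSample' sample.jtl)
echo "failures=$fails/$total"
```

On a real run, sample.jtl would be the jmeter_perf.jtl file produced by the pipeline, and the Performance plugin reports these same aggregates.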

These metrics are collected in a Performance Report, which is generated as a build artifact.

Tools:

The tool used for these test cases is JMeter - https://jmeter.apache.org/ - run in non-GUI (command-line) mode.

For example, a non-GUI command line to generate a report:

jmeter -n \
  -Jjmeter.save.saveservice.output_format=xml \
  -l jmeter_perf.jtl \
  -t Folio-Test-Plans/mod-inventory-storage/instance-storage/instance-storageTestPlan.jmx \
  -j jmeter_46_instance-storageTestPlan.jmx.log

Reports are generated in Jenkins using the JMeter Performance plugin - https://wiki.jenkins.io/display/JENKINS/Performance+Plugin

Tests are scheduled to run in a Jenkins pipeline - https://jenkins-aws.indexdata.com/job/Automation/job/folio-perf-test/

SLA Goals:

  • The average response time (AVG RT) for each JMeter-captured transaction should not exceed 1000 milliseconds.
  • CPU utilization on any module should not exceed 50%.
  • The nightly Jenkins pipeline job fails if even a single JMeter test fails.
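
The 1000 ms gate can also be enforced directly in a build step. A sketch (the threshold variable and inlined sample results are illustrative) that exits non-zero when the average elapsed time exceeds the SLA, which is what makes a Jenkins job fail:

```shell
#!/bin/sh
# Sketch: fail a build step when average response time exceeds the SLA.
# SLA_MS and the sample results below are illustrative values.
SLA_MS=1000

cat > sla_sample.jtl <<'EOF'
<httpSample t="400" s="true"/>
<httpSample t="700" s="true"/>
<httpSample t="500" s="true"/>
EOF

avg=$(grep -o 't="[0-9][0-9]*"' sla_sample.jtl | grep -o '[0-9][0-9]*' |
      awk '{ sum += $1; n++ } END { print int(sum / n) }')

# A non-zero exit status here is what fails the Jenkins build.
if [ "$avg" -gt "$SLA_MS" ]; then
  echo "FAIL: avg ${avg} ms exceeds SLA of ${SLA_MS} ms"
  exit 1
fi
echo "PASS: avg ${avg} ms within SLA of ${SLA_MS} ms"
```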

Environment:

  • Engine: PostgreSQL 9.6.8
  • The entire stack (environment) is created from scratch every day: the dataset is populated in the database, the JMeter tests run on top of it, and once the tests complete the environment is torn down.
  • JMeter scripts run against a Harvard dataset of ~3 million records.

Workflow used to test all APIs:

  • Create new data with a POST HTTP request, run the JMeter tests, and clean up with a DELETE HTTP request once the test completes.
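
This POST → test → DELETE cycle can be sketched as follows. The Okapi URL, tenant, token, and instance id below are placeholders for illustration, not values from this repository:

```shell
#!/bin/sh
# Sketch of the POST -> run -> DELETE workflow. OKAPI_URL, TENANT, TOKEN,
# and INSTANCE_ID are hypothetical placeholders.
OKAPI_URL=${OKAPI_URL:-https://okapi.example.org}
TENANT=${TENANT:-diku}
TOKEN=${TOKEN:-changeme}
INSTANCE_ID="11111111-1111-1111-1111-111111111111"   # hypothetical test record

run_cycle() {
  # 1. Create test data with a POST request (payload kept in a local file).
  curl -sf -X POST "$OKAPI_URL/instance-storage/instances" \
    -H "X-Okapi-Tenant: $TENANT" -H "X-Okapi-Token: $TOKEN" \
    -H "Content-Type: application/json" \
    -d @instance.json

  # 2. Run the JMeter test plan against the created data.
  jmeter -n -l jmeter_perf.jtl \
    -t Folio-Test-Plans/mod-inventory-storage/instance-storage/instance-storageTestPlan.jmx

  # 3. Clean up with a DELETE request once the test completes.
  curl -sf -X DELETE "$OKAPI_URL/instance-storage/instances/$INSTANCE_ID" \
    -H "X-Okapi-Tenant: $TENANT" -H "X-Okapi-Token: $TOKEN"
}

# Guarded so the file can be inspected or sourced without hitting a server.
[ "${1:-}" = "--run" ] && run_cycle
echo "cycle defined for $OKAPI_URL"
```

Run with `--run` (and real credentials exported) to execute the cycle; without the flag it only defines the function.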

Quick start

  • Import into Jenkins as a standard pipeline project

  • Pick a Jenkinsfile and adjust build parameters as needed

Additional information

Issue tracker

See project FOLIO at the FOLIO issue tracker.

Other documentation

Other infrastructure projects are described, along with further FOLIO Developer documentation, at dev.folio.org
