Performance Tests (Load Testing)

This project implements performance tests for the Performance QA Engineer Course stand β€” a full-featured educational banking system designed for testing and performance validation in training environments. The platform includes services such as Kafka, Redis, PostgreSQL, MinIO, Grafana, and Prometheus, and exposes its API over both HTTP and gRPC protocols.

Technologies used: Python, Locust, Pydantic, gRPC, Docker, Redis, Kafka, MinIO, PostgreSQL, Grafana, Prometheus, Kibana.

Performance tests are written in Python using Locust and follow modern software engineering principles like SOLID, DRY, and KISS. They are designed to simulate realistic business flows and provide visibility into system performance under load.


Project Overview

This performance testing framework supports both HTTP and gRPC APIs using a unified test structure.

Key components:

  • Scenarios: Represent realistic user flows, implemented via Locust user classes.
  • API Clients: Custom reusable HTTP/gRPC clients located in clients/http/ and clients/grpc/, independent of Locust internals.
  • Seeding: Automated test data generation via a flexible seeding builder, triggered through Locust event hooks based on the active scenario plan.
  • Tools: Includes generators for fake data, base configurations, and shared Locust user logic.
  • Reporting: Built-in HTML reports for Locust runs; Prometheus and Grafana metrics available via the course test stand.

Supported business scenarios include:

  • Existing user: make purchase, get documents, issue virtual card, view operations
  • New user: create account, top up card, issue physical card, retrieve account list and documents

Best Practices

This project follows industry-standard best practices:

  • SOLID design principles for maintainable client architecture
  • DRY approach to avoid duplication across protocols
  • KISS philosophy to keep scenarios readable and focused
  • Flexible structure to support both HTTP and gRPC testing
  • Reusable API clients, designed to be composable and injectable
  • The framework is easy to extend with new scenarios or client implementations as the system evolves

Getting Started

⚠️ Important: this project tests the educational platform performance-qa-engineer-course, which must be running locally.

1. Clone the Repository

git clone https://github.com/lobanov-qa/performance-tests.git
cd performance-tests

2. Create a Virtual Environment

Linux / MacOS

python3 -m venv venv
source venv/bin/activate

Windows

python -m venv venv
venv\Scripts\activate

3. Install Dependencies

pip install -r requirements.txt

Running Performance Tests

Each scenario can be launched via its own configuration file. The report will be automatically saved in the same directory.

Example:

locust --config=./scenarios/http/gateway/existing_user_get_documents/v1.0.conf

After test execution, open the generated HTML report: ./scenarios/http/gateway/existing_user_get_documents/report.html
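A scenario configuration file like `v1.0.conf` might look as follows. Locust config files accept the same keys as its CLI flags; the paths and values below are illustrative, not the project's actual configuration:

```ini
# Illustrative Locust config; keys mirror Locust CLI options.
locustfile = ./scenarios/http/gateway/existing_user_get_documents/scenario.py
headless = true
users = 100
spawn-rate = 10
run-time = 5m
html = ./scenarios/http/gateway/existing_user_get_documents/report.html
```

Keeping one config per scenario directory is what lets each run drop its HTML report next to the scenario that produced it.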


Monitoring & Observability

In addition to built-in Locust reports, system-level metrics can be explored via the Grafana dashboards and Prometheus metrics shipped with the course test stand.

These dashboards are preconfigured in the course infrastructure repository.


CI/CD

GitHub Actions integration is enabled for this project. You can execute scenarios in headless mode and publish reports to GitHub Pages automatically.

Configuration can be found in .github/workflows/performance-tests.yml.
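The workflow itself is not reproduced here; a minimal sketch of such a pipeline (step layout and action versions are illustrative, not the repository's actual `performance-tests.yml`) could look like:

```yaml
# Illustrative sketch of a headless Locust run in GitHub Actions.
name: performance-tests
on: workflow_dispatch

jobs:
  load-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install -r requirements.txt
      - run: locust --config=./scenarios/http/gateway/existing_user_get_documents/v1.0.conf
      - uses: actions/upload-artifact@v4
        with:
          name: locust-report
          path: ./scenarios/http/gateway/existing_user_get_documents/report.html
```

Publishing the uploaded report to GitHub Pages would be an additional deployment step on top of this.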


Project Structure

performance-tests/
β”œβ”€β”€ clients/                     # API clients for HTTP and gRPC
β”‚   β”œβ”€β”€ http/                    # HTTP clients (gateway, accounts, cards, ...)
β”‚   └── grpc/                    # gRPC clients with interceptors for Locust
β”œβ”€β”€ scenarios/                   # Load scenarios (HTTP and gRPC)
β”‚   β”œβ”€β”€ http/                    # HTTP scenarios (existing_user, new_user)
β”‚   └── grpc/                    # gRPC scenarios
β”œβ”€β”€ seeds/                       # Test data generation (seeding)
β”‚   β”œβ”€β”€ builder.py               # Builder for data preparation
β”‚   β”œβ”€β”€ scenario.py              # Scenario logic for seeding
β”‚   └── schema/                  # Data schemas (plan, result)
β”œβ”€β”€ tools/                       # Helper utilities
β”‚   β”œβ”€β”€ config/                  # Configurations for HTTP/gRPC/Locust
β”‚   β”œβ”€β”€ locust/                  # Base Locust user classes
β”‚   β”œβ”€β”€ fakers.py                # Fake data generation
β”‚   β”œβ”€β”€ logger.py                # Logging
β”‚   └── routes.py                # API routes
β”œβ”€β”€ dumps/                       # Data dumps for seeding
β”œβ”€β”€ .github/workflows/           # CI/CD pipelines
β”œβ”€β”€ docker-compose.load-testing-hub.yaml # Load Testing Hub configuration
β”œβ”€β”€ requirements.txt
└── README.md

Contacts

Open to code review, discussion of design decisions, and feedback.
I am looking for an opportunity to start a career as an AQA engineer, grow within a team, and contribute to software quality.

