
็ฎไฝไธญๆ | English
LMeterX is a professional performance testing platform for large language models that supports comprehensive load testing of any LLM service compatible with the OpenAI API format. Through an intuitive Web interface, users can easily create and manage test tasks, monitor test runs in real time, and obtain detailed performance analysis reports, providing reliable data to support model deployment and performance optimization.
- Universal compatibility: works with any OpenAI-compatible API, such as GPT, Claude, Llama, etc. (language / multimodal / CoT models)
- Smart load testing: Precise concurrency control & Real user simulation
- Professional metrics: TTFT, TPS, RPS, success/error rate, and more (a measurement sketch follows this list)
- Multi-scenario support: Text conversations & Multimodal (image+text)
- Visualize the results: Performance reports & Model arena
- Real-time monitoring: Hierarchical monitoring of tasks and services
- Enterprise ready: Docker deployment & Web management console & Scalable architecture
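TTFT and TPS are streaming-side metrics: they come from timing the chunks of a streamed response. The snippet below is a minimal sketch (not LMeterX's internal implementation) of how they can be measured against any OpenAI-compatible streaming endpoint; the URL, model name, and the chunk-as-token approximation are illustrative assumptions.

```python
# Minimal sketch (not LMeterX internals): measuring TTFT and TPS against an
# OpenAI-compatible streaming endpoint. The URL, model name, and the
# one-chunk-per-token approximation are illustrative assumptions.
import json
import time

import requests  # assumed available; any HTTP client works

API_URL = "http://localhost:8000/v1/chat/completions"  # hypothetical target
payload = {
    "model": "your-model",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": True,
}

start = time.perf_counter()
first_token_at = None
chunks = 0

with requests.post(API_URL, json=payload, stream=True, timeout=60) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines():
        # Server-sent events: each data line carries one streamed chunk.
        if not line or not line.startswith(b"data:"):
            continue
        data = line[len(b"data:"):].strip()
        if data == b"[DONE]":
            break
        delta = json.loads(data)["choices"][0]["delta"].get("content")
        if delta:
            if first_token_at is None:
                first_token_at = time.perf_counter()  # first token arrives
            chunks += 1  # rough proxy: one streamed chunk ~ one token

elapsed = time.perf_counter() - start
ttft = first_token_at - start if first_token_at else float("nan")
print(f"TTFT: {ttft:.3f}s  TPS: {chunks / elapsed:.1f} (chunk-approximated)")
```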
LMeterX adopts a microservices architecture design, consisting of four core components:
- Backend API Service: FastAPI-based REST API service responsible for task management and result storage (an illustrative sketch follows this list)
- Load Testing Engine: Locust-based load testing engine that executes actual performance testing tasks
- Frontend Interface: Modern Web interface based on React + TypeScript + Ant Design
- MySQL Database: Stores test tasks, result data, and configuration information
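As a rough illustration of how the backend component fits together (this is not LMeterX's actual source; the route path, table schema, field names, and MySQL driver are assumptions), a FastAPI + SQLAlchemy task-management endpoint might look like this:

```python
# Illustrative sketch only -- not LMeterX's actual backend code. Route path,
# table schema, field names, and the pymysql driver are assumptions.
from fastapi import FastAPI
from pydantic import BaseModel
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()
# Connection string mirrors the sample environment variables shown below.
engine = create_engine("mysql+pymysql://lmeterx:lmeterx_password@mysql:3306/lmeterx")


class TestTask(Base):              # hypothetical table for load-test tasks
    __tablename__ = "test_tasks"
    id = Column(Integer, primary_key=True)
    target_url = Column(String(255))
    concurrency = Column(Integer)
    duration_s = Column(Integer)


class TaskIn(BaseModel):           # hypothetical request body
    target_url: str
    concurrency: int = 10
    duration_s: int = 60


app = FastAPI()
Base.metadata.create_all(engine)   # simplified; real systems use migrations


@app.post("/api/tasks")            # hypothetical route
def create_task(task: TaskIn):
    """Persist a new load-test task for the engine to pick up."""
    with Session(engine) as session:
        row = TestTask(
            target_url=task.target_url,
            concurrency=task.concurrency,
            duration_s=task.duration_s,
        )
        session.add(row)
        session.commit()
        return {"id": row.id, "status": "created"}
```

Conceptually, the Locust-based engine then executes persisted tasks and their results land in the same MySQL database, per the component list above.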
- Docker 20.10.0+
- Docker Compose 2.0.0+
- At least 4GB available memory
- At least 5GB available disk space
For detailed instructions on all deployment methods, see the Complete Deployment Guide.
Use pre-built Docker images to start all services with one click:
# Download and run one-click deployment script
curl -fsSL https://raw.githubusercontent.com/MigoXLab/LMeterX/main/quick-start.sh | bash
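If you want to confirm the deployment finished before opening the browser, a small optional check like the following works; it only assumes the default console address used in the next section and the `requests` package.

```python
# Optional sanity check (not part of LMeterX): poll the web console until the
# one-click deployment is reachable. Assumes the default http://localhost:8080.
import time

import requests


def wait_for_console(url: str = "http://localhost:8080", timeout_s: int = 120) -> None:
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        try:
            if requests.get(url, timeout=3).status_code < 500:
                print(f"LMeterX console is reachable at {url}")
                return
        except requests.RequestException:
            pass                      # containers may still be starting
        time.sleep(3)
    raise TimeoutError(f"{url} was not reachable within {timeout_s}s")


if __name__ == "__main__":
    wait_for_console()
```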
- Access Web Interface: http://localhost:8080
- Create Test Task:
  - Configure the target API address and model parameters
  - Select the test type (text conversation or image-text conversation; example payload shapes are sketched after this list)
  - Set the concurrent user count and test duration
  - Configure other advanced parameters (optional)
- Monitor Test Process: view test logs and performance metrics in real time
- Analyze Test Results: view detailed performance analysis reports and export data
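A test task ultimately translates into OpenAI-style chat requests against the configured target API. The payloads below are illustrative shapes only (model names and prompts are placeholders, and LMeterX's exact request bodies may differ); they show what the two test types expect the target service to accept.

```python
# Illustrative payload shapes only -- LMeterX's exact request bodies may differ.
# "Text conversation" targets a plain chat completion; "image-text conversation"
# targets a multimodal model using the OpenAI content-parts format.
text_chat = {
    "model": "your-model",
    "messages": [{"role": "user", "content": "Summarize this paragraph ..."}],
    "stream": True,   # streaming is what makes TTFT measurable
}

image_text_chat = {
    "model": "your-multimodal-model",
    "messages": [{
        "role": "user",
        "content": [
            {"type": "text", "text": "What is in this image?"},
            {"type": "image_url",
             "image_url": {"url": "data:image/png;base64,<BASE64_IMAGE>"}},
        ],
    }],
    "stream": True,
}
```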
SECRET_KEY=your_secret_key_here # Application security key
FLASK_DEBUG=false # Debug mode switch
DB_HOST=mysql # Database host address
DB_PORT=3306 # Database port
DB_USER=lmeterx # Database username
DB_PASSWORD=lmeterx_password # Database password
DB_NAME=lmeterx # Database name
VITE_API_BASE_URL=/api # API base path
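As a rough illustration (not the actual backend code) of how the backend-related variables above are typically consumed, for example to assemble the MySQL connection URL:

```python
# Rough illustration (not LMeterX's actual code): reading the backend-related
# variables above and assembling the MySQL connection URL for SQLAlchemy.
# The pymysql driver name is an assumption.
import os

SECRET_KEY = os.getenv("SECRET_KEY", "change-me")
DEBUG = os.getenv("FLASK_DEBUG", "false").lower() == "true"

DB_URL = (
    f"mysql+pymysql://{os.getenv('DB_USER', 'lmeterx')}:"
    f"{os.getenv('DB_PASSWORD', 'lmeterx_password')}@"
    f"{os.getenv('DB_HOST', 'mysql')}:{os.getenv('DB_PORT', '3306')}/"
    f"{os.getenv('DB_NAME', 'lmeterx')}"
)
print(DB_URL)
```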
We welcome all forms of contributions! Please read our Contributing Guide for details.
LMeterX adopts a modern technology stack to ensure system reliability and maintainability:
- Backend Service: Python + FastAPI + SQLAlchemy + MySQL
- Load Testing Engine: Python + Locust + Custom Extensions (a minimal Locust sketch follows this list)
- Frontend Interface: React + TypeScript + Ant Design + Vite
- Deployment & Operations: Docker + Docker Compose + Nginx
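To make the "Locust + Custom Extensions" layer concrete, here is a minimal sketch of the kind of Locust user class such an engine builds on; it is not LMeterX's engine code, and the target host, endpoint path, model name, and wait times are assumptions.

```python
# Minimal sketch of a Locust user hitting an OpenAI-compatible endpoint.
# Not LMeterX's engine code; host, path, model, and wait times are assumptions.
from locust import HttpUser, task, between


class ChatCompletionUser(HttpUser):
    wait_time = between(0.5, 2)            # simulated think time between requests
    host = "http://localhost:8000"         # hypothetical target service

    @task
    def chat(self):
        self.client.post(
            "/v1/chat/completions",
            json={
                "model": "your-model",
                "messages": [{"role": "user", "content": "Hello!"}],
                "stream": False,
            },
            name="chat_completions",       # groups results under one stats entry
        )
```

A file like this can also be run on its own with the standard Locust CLI (e.g. `locust -f chat_user.py`, filename hypothetical) to experiment outside of LMeterX's task orchestration.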
LMeterX/
├── backend/            # Backend service
├── st_engine/          # Load testing engine service
├── frontend/           # Frontend service
├── docs/               # Documentation directory
├── docker-compose.yml  # Docker Compose configuration
├── Makefile            # Run complete code checks
├── README.md           # English README
└── README_CN.md        # Chinese README
- Fork the Project to your GitHub account
- Clone Your Fork and create a development branch
- Follow Code Standards and use clear commit messages (following the Conventional Commits specification)
- Run Code Checks: before submitting a PR, make sure code checks, formatting, and tests all pass; you can run
make all
- Write Clear Documentation: Write corresponding documentation for new features or changes
- Actively Participate in Review: Actively respond to feedback during the review process
- Support for custom API paths and performance metrics collection
- Support for user-defined load test datasets
- Support for client resource monitoring
- User system support
- CLI command-line tool
- Deployment Guide - Detailed deployment instructions and configuration guide
- Contributing Guide - How to participate in project development and contribute code
Thanks to all developers who have contributed to the LMeterX project:
- @LuckyYC - Project maintainer & Core developer
- @del-zhenwu - Core developer
This project is licensed under the Apache 2.0 License.
โญ If this project helps you, please give us a Star! Your support is our motivation for continuous improvement.