This repository contains a robust, data-driven performance test suite developed in Apache JMeter, designed to simulate realistic user load on the WebTours flight reservation application.
The primary goal of this project is to validate the application's scalability and identify potential performance bottlenecks under concurrent user traffic.
The test script simulates a complete, end-to-end business transaction flow, structured using Transaction Controllers for precise timing measurements:
- Login: User authentication using unique credentials.
- Flight Search: Submitting search criteria for flights.
- Flight Selection: Choosing a specific outbound and inbound flight.
- Reservation: Providing passenger details and submitting the final booking.
- Sign Off: Securely logging out of the application.
| Feature | Implementation Detail | Purpose |
|---|---|---|
| Data-Driven Testing | Uses a CSV Data Set Config to inject data for 5 concurrent users (Username, Password, Passenger Count, Flight Class, etc.) | Ensures each thread uses unique, dynamic data for every iteration. |
| Correlation | Employs a Regular Expression Extractor to dynamically capture the crucial `userSession` ID from the login response | Maintains session integrity and prevents functional failures under load |
| Session Management | HTTP Cookie Manager configured to handle cookies automatically | Maintains the state of each virtual user throughout the booking workflow |
| Flow Control | Transaction Controllers used for distinct scenarios (Login, Reservation, Sign-Off) | Provides meaningful response time metrics per business transaction |
| Validation | Response and Duration Assertions at critical steps (e.g., Login) | Confirms functional correctness and SLA adherence |
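As an illustration of the data-driven setup, the file consumed by the CSV Data Set Config might look like the following (the column names and values here are hypothetical placeholders; the actual file ships in `CSV_Data/`):

```csv
username,password,passengerCount,flightClass
user1,pass1,1,Business
user2,pass2,2,Coach
user3,pass3,1,First
```

With Recycle on EOF enabled, each of the 5 threads pulls the next row on every iteration, so no two concurrent users share credentials within a loop.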
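For reference, a Regular Expression Extractor for the WebTours session typically targets the hidden form field in the login response. A sketch of the extractor settings (the field values below are illustrative; the authoritative configuration lives in the `.jmx` file):

```
Apply to:            Main sample only
Field to check:      Body
Reference Name:      userSession
Regular Expression:  name="userSession" value="(.+?)"
Template:            $1$
Match No.:           1
Default Value:       SESSION_NOT_FOUND
```

Subsequent requests then reference the captured value as `${userSession}`, which is what keeps each virtual user's booking flow tied to its own server-side session.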
- CLI Execution: Configured for command-line execution to ensure efficiency during large load tests.
- Raw Output: Results saved to a `.jtl` file (CSV format).
- Performance Dashboard: HTML reports generated post-execution using the `-e -o` flags, providing:
  - Aggregate Report (Average, 90th Percentile, Throughput)
  - Time-based charts (Transactions per Second, Response Times, Latency)
JMeter Listeners used:
- View Results Tree: For inspecting request/response payloads, validating correlation, and confirming assertion hits.
- Aggregate Report: For summarizing key performance metrics.
Scalability Testing (Exploratory): Tested the script on BlazeMeter with 50 virtual users to evaluate large-scale load handling. Partial execution highlighted areas for optimization under high concurrency.
Key performance indicators and successful run details:
- Aggregate Report Summary
- View Results Tree (Successful Transaction)
- Assertion Results
For the complete set of test screenshots, refer to `JMeter_Test_Results.docx`.
1. Clone the repository:
   ```bash
   git clone https://github.com/harsh7736/JMeter-Performance-Tests.git
   ```
2. Open JMeter (Apache JMeter 5.x recommended).
3. Load the `.jmx` file from the `JMX_Files` folder.
4. Ensure the CSV data files are placed in `CSV_Data/`.
5. Run the Test Plan locally in the GUI, or in CLI mode using:
   ```bash
   jmeter -n -t JMX_Files/WebTools_FlightTest.jmx -l HTML_Reports/results.jtl -e -o HTML_Reports/dashboard
   ```
6. View Results: open the generated HTML dashboard, or refer to the Word document for screenshots.
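If a results file already exists, JMeter can also build the HTML dashboard from it without re-running the test, using the report-generation-only flag (paths below assume the repository layout; note the `-o` target folder must be empty or non-existent):

```bash
# Generate the dashboard from an existing .jtl results file
jmeter -g HTML_Reports/results.jtl -o HTML_Reports/dashboard
```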
## Author
Harsh Singh – Software Engineer | QA Automation & Performance Testing Enthusiast
Passionate about test automation, performance testing, and optimizing application scalability
Skilled in JMeter, Selenium, CI/CD pipelines, and data-driven testing
GitHub: https://github.com/harsh7736
LinkedIn: https://www.linkedin.com/in/harsh-singh



