A stress-testing suite to validate cmd-stream resilience under high concurrency and an unstable network.
- Randomly sends Echo, Stream, and Fail Commands based on configurable probabilities.
- Automatically verifies that received results match the expected output.
- Periodically restarts server and introduces downtime to simulate unstable network.
- Tests the client's ability to reconnect and resume operations after server restarts.
- Uses circbrk to provide circuit-breaking capabilities.
- Has configurable long pauses to trigger and verify keepalive.
- Reports success rates, timeouts, network errors, and verification failures.
Simply execute:
# Run with default settings
go run .
# Or with custom configuration
go run . -config my-config.yaml
The test will start 10 concurrent sessions (using 4 cmd-stream clients) and
begin reporting statistics every 10 seconds. To stop, use Ctrl+C.
In config.yaml.example you can find all available configuration options.
Here is an example of the last summary report from a 12-hour run:
--- [STRESS TEST SUMMARY] ---
Total Commands: 3577792
- Success: 1905640 (53.3%) # Commands completed with verified results.
- CB Blocked: 1647007 (46.0%) # Commands prevented from sending by Circuit Breaker.
- Keepalive Triggers: 35785 # Simulated idle periods to trigger keepalive.
- Late Results: 4746 # Responses arrived after timeout.
- Send Timeouts: 0 (0.0%) # Timeout during Command send.
- Result Timeouts: 6505 (0.2%) # Timeout waiting for result.
- Network Error: 18640 (0.5%) # Connection issues (e.g. server down before CB trips).
- Unexpected Error: 0 (0.0%) # Uncategorized errors.
- Verify Error: 0 (0.0%) [CRITICAL] # Received data mismatch.
-----------------------------
Any Verify Error greater than 0 indicates a bug in the library or the test
itself and is considered a critical failure.
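A report like the one above can be aggregated from a small set of atomic counters updated by the concurrent sessions. The sketch below is an assumption about how such a summary could be built, not the suite's actual types; only a few of the counters are shown:

```go
package main

import (
	"fmt"
	"sync/atomic"
)

// Stats aggregates per-command outcomes across concurrent sessions.
// Field names are illustrative, not the suite's actual types.
type Stats struct {
	Success, CBBlocked, NetworkErr, VerifyErr atomic.Int64
}

// Summary renders each counter with its percentage of the grand total.
func (s *Stats) Summary() string {
	total := s.Success.Load() + s.CBBlocked.Load() + s.NetworkErr.Load() + s.VerifyErr.Load()
	pct := func(n int64) float64 {
		if total == 0 {
			return 0
		}
		return float64(n) / float64(total) * 100
	}
	return fmt.Sprintf(
		"Total Commands: %d\n- Success: %d (%.1f%%)\n- CB Blocked: %d (%.1f%%)\n- Network Error: %d (%.1f%%)\n- Verify Error: %d (%.1f%%)",
		total,
		s.Success.Load(), pct(s.Success.Load()),
		s.CBBlocked.Load(), pct(s.CBBlocked.Load()),
		s.NetworkErr.Load(), pct(s.NetworkErr.Load()),
		s.VerifyErr.Load(), pct(s.VerifyErr.Load()),
	)
}

func main() {
	var s Stats
	s.Success.Add(53)
	s.CBBlocked.Add(46)
	s.NetworkErr.Add(1)
	fmt.Println(s.Summary())
}
```

Using atomic counters keeps the hot path lock-free, so the periodic reporter can read a consistent-enough snapshot without pausing the sessions.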
Note
Interpreting the results:
- QPS: The total Command count might seem low (~80 QPS) compared to raw benchmarks. This is due to artificial server delays, periodic downtimes, and client-side pauses used to simulate a realistic unstable environment.
- CB Blocked: A high CB Blocked count is expected. When the Circuit Breaker opens during server downtime, sending sessions enter a "tight loop" and generate many blocked attempts until the system recovers.
The focus is on verifying stability and correctness under load, not maximum throughput.
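The "tight loop" effect can be illustrated with a toy breaker. This is not circbrk's actual API, just a stand-in that mimics open/closed state: while the breaker is open, every attempt is rejected immediately, so blocked attempts pile up far faster than real sends.

```go
package main

import "fmt"

// toyBreaker mimics open/closed state; it is NOT circbrk's API.
type toyBreaker struct{ open bool }

// Allow reports whether a send attempt may proceed.
func (b *toyBreaker) Allow() bool { return !b.open }

func main() {
	b := &toyBreaker{}
	var sent, blocked int
	for i := 0; i < 100; i++ {
		if i == 10 {
			b.open = true // server goes down: breaker trips
		}
		if i == 60 {
			b.open = false // server recovers: breaker closes
		}
		if !b.Allow() {
			blocked++ // rejected instantly: one "tight loop" iteration
			continue
		}
		sent++ // command actually sent to the server
	}
	fmt.Printf("sent=%d blocked=%d\n", sent, blocked)
}
```

Even in this tiny loop the downtime window accounts for half of all attempts; in the real suite, where a blocked attempt costs almost nothing while a real send waits on the network, the blocked share grows even larger.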