Unable to stop load from web UI with 0.11.0 #981

Closed
solowalker27 opened this issue Mar 15, 2019 · 4 comments · Fixed by #982
Comments

@solowalker27
Contributor

Description of issue

Since upgrading to Locust 0.11.0, I often have difficulty stopping Locust from the web UI when running distributed. I click the Stop button, but the load doesn't actually stop. At best, it dips briefly but then resumes.

Expected behavior

Load stops almost immediately.

Actual behavior

It seems the slaves stop their load, rejoin the pool, and then immediately have load sent to them again. Example output from the master, showing repeated attempts to stop the load:

[2019-03-15 22:01:51,129] 6f9c3d75aaee/INFO/locust.main: Starting Locust 0.11.0
[2019-03-15 22:02:45,732] 6f9c3d75aaee/INFO/locust.runners: Client 'e4b5991156b6_1163c8d3946a47a2954a3a19ef2ae9e0' reported as ready. Currently 1 clients ready to swarm.
[2019-03-15 22:02:45,863] 6f9c3d75aaee/INFO/locust.runners: Client '78a2ff78ec5f_6d6cced06e1140839ee19adbda4e68d3' reported as ready. Currently 2 clients ready to swarm.
[2019-03-15 22:02:45,901] 6f9c3d75aaee/INFO/locust.runners: Client '18a9ef01593d_74673a7f91a74b88ae64c796ef12acd9' reported as ready. Currently 3 clients ready to swarm.
[2019-03-15 22:02:45,932] 6f9c3d75aaee/INFO/locust.runners: Client 'eb8785101cf9_493118373e79416f8e5a76eb93b11c12' reported as ready. Currently 4 clients ready to swarm.
[2019-03-15 22:02:46,134] 6f9c3d75aaee/INFO/locust.runners: Client '5b213bf39f9d_43174c9cafcc415b98ded4e41fdff56e' reported as ready. Currently 5 clients ready to swarm.
[2019-03-15 22:02:46,146] 6f9c3d75aaee/INFO/locust.runners: Client '08e26b61419d_2dd8bd9aa5c745948c818b6b529c0df5' reported as ready. Currently 6 clients ready to swarm.
[2019-03-15 22:02:46,177] 6f9c3d75aaee/INFO/locust.runners: Client 'cc15e8e95ace_748c65bf793746048fbed3c2b613b694' reported as ready. Currently 7 clients ready to swarm.
[2019-03-15 22:03:38,472] 6f9c3d75aaee/INFO/locust.runners: Sending hatch jobs to 7 ready clients
[2019-03-15 22:10:56,655] 6f9c3d75aaee/INFO/locust.runners: Removing e4b5991156b6_1163c8d3946a47a2954a3a19ef2ae9e0 client from running clients
[2019-03-15 22:10:56,655] 6f9c3d75aaee/INFO/locust.runners: Client 'e4b5991156b6_1163c8d3946a47a2954a3a19ef2ae9e0' reported as ready. Currently 7 clients ready to swarm.
[2019-03-15 22:10:56,655] 6f9c3d75aaee/INFO/locust.runners: Sending hatch jobs to 7 ready clients
[2019-03-15 22:10:56,994] 6f9c3d75aaee/INFO/locust.runners: Removing cc15e8e95ace_748c65bf793746048fbed3c2b613b694 client from running clients
[2019-03-15 22:10:56,994] 6f9c3d75aaee/INFO/locust.runners: Client 'cc15e8e95ace_748c65bf793746048fbed3c2b613b694' reported as ready. Currently 7 clients ready to swarm.
[2019-03-15 22:10:56,994] 6f9c3d75aaee/INFO/locust.runners: Sending hatch jobs to 7 ready clients
[2019-03-15 22:10:57,262] 6f9c3d75aaee/INFO/locust.runners: Removing 08e26b61419d_2dd8bd9aa5c745948c818b6b529c0df5 client from running clients
[2019-03-15 22:10:57,262] 6f9c3d75aaee/INFO/locust.runners: Client '08e26b61419d_2dd8bd9aa5c745948c818b6b529c0df5' reported as ready. Currently 7 clients ready to swarm.
[2019-03-15 22:10:57,262] 6f9c3d75aaee/INFO/locust.runners: Sending hatch jobs to 7 ready clients
[2019-03-15 22:10:57,766] 6f9c3d75aaee/INFO/locust.runners: Removing 18a9ef01593d_74673a7f91a74b88ae64c796ef12acd9 client from running clients
[2019-03-15 22:10:57,766] 6f9c3d75aaee/INFO/locust.runners: Client '18a9ef01593d_74673a7f91a74b88ae64c796ef12acd9' reported as ready. Currently 7 clients ready to swarm.
[2019-03-15 22:10:57,766] 6f9c3d75aaee/INFO/locust.runners: Sending hatch jobs to 7 ready clients
[2019-03-15 22:10:57,829] 6f9c3d75aaee/INFO/locust.runners: Removing 5b213bf39f9d_43174c9cafcc415b98ded4e41fdff56e client from running clients
[2019-03-15 22:10:57,830] 6f9c3d75aaee/INFO/locust.runners: Client '5b213bf39f9d_43174c9cafcc415b98ded4e41fdff56e' reported as ready. Currently 7 clients ready to swarm.
[2019-03-15 22:10:57,830] 6f9c3d75aaee/INFO/locust.runners: Sending hatch jobs to 7 ready clients
[2019-03-15 22:10:57,852] 6f9c3d75aaee/INFO/locust.runners: Removing 78a2ff78ec5f_6d6cced06e1140839ee19adbda4e68d3 client from running clients
[2019-03-15 22:10:57,852] 6f9c3d75aaee/INFO/locust.runners: Client '78a2ff78ec5f_6d6cced06e1140839ee19adbda4e68d3' reported as ready. Currently 7 clients ready to swarm.
[2019-03-15 22:10:57,852] 6f9c3d75aaee/INFO/locust.runners: Sending hatch jobs to 7 ready clients
[2019-03-15 22:10:57,909] 6f9c3d75aaee/INFO/locust.runners: Removing eb8785101cf9_493118373e79416f8e5a76eb93b11c12 client from running clients
[2019-03-15 22:10:57,909] 6f9c3d75aaee/INFO/locust.runners: Client 'eb8785101cf9_493118373e79416f8e5a76eb93b11c12' reported as ready. Currently 7 clients ready to swarm.
[2019-03-15 22:10:57,910] 6f9c3d75aaee/INFO/locust.runners: Sending hatch jobs to 7 ready clients
[2019-03-15 22:12:14,611] 6f9c3d75aaee/INFO/locust.runners: Removing 78a2ff78ec5f_6d6cced06e1140839ee19adbda4e68d3 client from running clients
[2019-03-15 22:12:14,611] 6f9c3d75aaee/INFO/locust.runners: Client '78a2ff78ec5f_6d6cced06e1140839ee19adbda4e68d3' reported as ready. Currently 7 clients ready to swarm.
[2019-03-15 22:12:14,612] 6f9c3d75aaee/INFO/locust.runners: Sending hatch jobs to 7 ready clients
[2019-03-15 22:12:14,634] 6f9c3d75aaee/INFO/locust.runners: Removing eb8785101cf9_493118373e79416f8e5a76eb93b11c12 client from running clients
[2019-03-15 22:12:14,635] 6f9c3d75aaee/INFO/locust.runners: Client 'eb8785101cf9_493118373e79416f8e5a76eb93b11c12' reported as ready. Currently 7 clients ready to swarm.
[2019-03-15 22:12:14,635] 6f9c3d75aaee/INFO/locust.runners: Sending hatch jobs to 7 ready clients
[2019-03-15 22:12:14,639] 6f9c3d75aaee/INFO/locust.runners: Removing 18a9ef01593d_74673a7f91a74b88ae64c796ef12acd9 client from running clients
[2019-03-15 22:12:14,640] 6f9c3d75aaee/INFO/locust.runners: Client '18a9ef01593d_74673a7f91a74b88ae64c796ef12acd9' reported as ready. Currently 7 clients ready to swarm.
[2019-03-15 22:12:14,640] 6f9c3d75aaee/INFO/locust.runners: Sending hatch jobs to 7 ready clients
[2019-03-15 22:12:14,661] 6f9c3d75aaee/INFO/locust.runners: Removing e4b5991156b6_1163c8d3946a47a2954a3a19ef2ae9e0 client from running clients
[2019-03-15 22:12:14,661] 6f9c3d75aaee/INFO/locust.runners: Client 'e4b5991156b6_1163c8d3946a47a2954a3a19ef2ae9e0' reported as ready. Currently 7 clients ready to swarm.
[2019-03-15 22:12:14,661] 6f9c3d75aaee/INFO/locust.runners: Sending hatch jobs to 7 ready clients
[2019-03-15 22:12:14,684] 6f9c3d75aaee/INFO/locust.runners: Removing 08e26b61419d_2dd8bd9aa5c745948c818b6b529c0df5 client from running clients
[2019-03-15 22:12:14,684] 6f9c3d75aaee/INFO/locust.runners: Client '08e26b61419d_2dd8bd9aa5c745948c818b6b529c0df5' reported as ready. Currently 7 clients ready to swarm.
[2019-03-15 22:12:14,684] 6f9c3d75aaee/INFO/locust.runners: Sending hatch jobs to 7 ready clients
[2019-03-15 22:12:14,709] 6f9c3d75aaee/INFO/locust.runners: Removing cc15e8e95ace_748c65bf793746048fbed3c2b613b694 client from running clients
[2019-03-15 22:12:14,709] 6f9c3d75aaee/INFO/locust.runners: Client 'cc15e8e95ace_748c65bf793746048fbed3c2b613b694' reported as ready. Currently 7 clients ready to swarm.
[2019-03-15 22:12:14,709] 6f9c3d75aaee/INFO/locust.runners: Sending hatch jobs to 7 ready clients
[2019-03-15 22:12:14,826] 6f9c3d75aaee/INFO/locust.runners: Removing 5b213bf39f9d_43174c9cafcc415b98ded4e41fdff56e client from running clients
[2019-03-15 22:12:14,827] 6f9c3d75aaee/INFO/locust.runners: Client '5b213bf39f9d_43174c9cafcc415b98ded4e41fdff56e' reported as ready. Currently 7 clients ready to swarm.
[2019-03-15 22:12:14,827] 6f9c3d75aaee/INFO/locust.runners: Sending hatch jobs to 7 ready clients
[2019-03-15 22:12:24,572] 6f9c3d75aaee/INFO/locust.runners: Removing 18a9ef01593d_74673a7f91a74b88ae64c796ef12acd9 client from running clients
[2019-03-15 22:12:24,572] 6f9c3d75aaee/INFO/locust.runners: Client '18a9ef01593d_74673a7f91a74b88ae64c796ef12acd9' reported as ready. Currently 7 clients ready to swarm.
[2019-03-15 22:12:24,572] 6f9c3d75aaee/INFO/locust.runners: Sending hatch jobs to 7 ready clients
[2019-03-15 22:12:24,585] 6f9c3d75aaee/INFO/locust.runners: Removing 08e26b61419d_2dd8bd9aa5c745948c818b6b529c0df5 client from running clients
[2019-03-15 22:12:24,585] 6f9c3d75aaee/INFO/locust.runners: Client '08e26b61419d_2dd8bd9aa5c745948c818b6b529c0df5' reported as ready. Currently 7 clients ready to swarm.
[2019-03-15 22:12:24,585] 6f9c3d75aaee/INFO/locust.runners: Sending hatch jobs to 7 ready clients
[2019-03-15 22:12:24,586] 6f9c3d75aaee/INFO/locust.runners: Removing 78a2ff78ec5f_6d6cced06e1140839ee19adbda4e68d3 client from running clients
[2019-03-15 22:12:24,586] 6f9c3d75aaee/INFO/locust.runners: Client '78a2ff78ec5f_6d6cced06e1140839ee19adbda4e68d3' reported as ready. Currently 7 clients ready to swarm.
[2019-03-15 22:12:24,586] 6f9c3d75aaee/INFO/locust.runners: Sending hatch jobs to 7 ready clients

Similarly, slave logs show the usual shutdown messages but then begin outputting load metrics again.

Environment settings

  • OS: Ubuntu 18.04 docker image on Ubuntu 14.04 AWS host
  • Python version: 3.6.7
  • Locust version: 0.11.0

Steps to reproduce (for bug reports)

  1. Run Locust distributed with multiple slaves (example commands below)
  2. Begin a large load using the web UI
  3. Wait several minutes after all users have hatched
  4. Attempt to stop the load using the web UI
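
For reference, a minimal distributed invocation for step 1 looks like the following. This is only a sketch: locustfile.py, the target host, and the master address are placeholders, not the actual files or hosts used here.

    # on the master node
    locust -f locustfile.py --master --host=https://target.example.com

    # on each slave node
    locust -f locustfile.py --slave --master-host=<master-ip>
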
@solowalker27
Contributor Author

My hunch is that this is a result of combining #970 and #927. Specifically, 28b0bc9 might be the problem.
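
To illustrate the suspected interaction (a simplified sketch only, not the actual locust.runners code): if the master re-sends hatch jobs whenever a client reports ready while it still considers the test active, then a slave that stops and immediately re-registers gets handed new work, which matches the remove/ready/hatch loop in the log above.

    # Hypothetical simplification of the suspected master-side behavior;
    # class and method names are illustrative, not Locust internals.
    class MasterRunner:
        def __init__(self):
            self.state = "running"
            self.ready_clients = []

        def on_client_ready(self, client_id):
            # A slave that just finished stopping re-registers as ready...
            self.ready_clients.append(client_id)
            # ...and because the master still considers the test active,
            # it immediately dispatches hatch jobs again, so the load never stops.
            if self.state == "running":
                self.send_hatch_jobs()

        def send_hatch_jobs(self):
            print("Sending hatch jobs to %d ready clients" % len(self.ready_clients))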

@xiaowheat

I ran into the same issue. How can we resolve it?

@solowalker27
Contributor Author

@xiaowheat @Eadword @timwebster9 I've got a PR out with a potential fix for this. If y'all could take a look and see if it both solves the issue for you and allows the new functionality to continue to work, that'd be great.
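
For context, one way such a fix could work, purely illustrative and not necessarily what the PR actually does: only re-dispatch hatch jobs to a newly ready client while a hatch is genuinely in progress, and skip it once Stop has been pressed. Continuing the sketch above (stop_requested is a hypothetical flag):

    # Illustrative guard only; not the actual contents of the linked PR.
    def on_client_ready(self, client_id):
        self.ready_clients.append(client_id)
        # Skip re-dispatching hatch jobs once the user has pressed Stop.
        if self.state in ("hatching", "running") and not self.stop_requested:
            self.send_hatch_jobs()
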

@cgoldberg
Member

I just merged the PR, which closed this issue. Please test against the master branch from git and re-open this issue if it's not resolved.
