chore(ml): added locustfile #2926

Merged
merged 1 commit into immich-app:main from ml-locust on Jun 25, 2023
Conversation

mertalev (Contributor)

Description

This PR adds support for Locust, a load-testing tool, which makes it easy to quantify and visualize ML performance. In turn, this makes it easier to understand the performance implications of a change, identify bottlenecks, and find the most appropriate defaults. Besides locustfile.py itself, it also adds a small Bash script to deploy the app locally, start Locust, and terminate the app once done.

Testing

This PR has been tested to work with the dependencies pinned in the provided poetry.lock (the steps below are also collected into a single shell snippet after the list):

  1. `cd machine-learning`
  2. `poetry install` (install Poetry first if it isn't already installed)
  3. `chmod +x load_test.sh`
  4. `./load_test.sh`
  5. Open localhost:8089 to see the web UI
  6. Click "Start swarming"
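
For convenience, a minimal sketch of those steps as one snippet (this assumes Poetry is already installed; the browser step is left as a comment):

```sh
cd machine-learning
poetry install            # install the dependencies pinned in poetry.lock
chmod +x load_test.sh
./load_test.sh            # deploys the app locally and starts Locust
# then open http://localhost:8089 in a browser and click "Start swarming"
```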


mertalev (Contributor, Author) commented on load_test.sh:

```sh
export NUM_ENDPOINTS=3
export PYTHONPATH=app

gunicorn app.main:app --worker-class uvicorn.workers.UvicornWorker \
```
While the app normally just uses uvicorn, gunicorn has a --daemon flag so it can run in the background.
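
For context, a minimal sketch of how gunicorn can be daemonized while recording its PID for later cleanup (the bind address is a placeholder, and the actual flags in load_test.sh may differ):

```sh
# --daemon detaches the gunicorn master process into the background;
# --pid writes its process ID to a file so the script can stop it later.
gunicorn app.main:app \
    --worker-class uvicorn.workers.UvicornWorker \
    --bind 127.0.0.1:3003 \
    --daemon --pid "$PID_FILE"
```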

mertalev (Contributor, Author) commented on load_test.sh on Jun 23, 2023:

```sh
locust --host http://$HOST --web-host 127.0.0.1 \
    --run-time 120s --users $(($CONCURRENCY * $NUM_ENDPOINTS)) $(if $HEADLESS; then echo "--headless"; fi)

if [[ -e $PID_FILE ]]; then kill $(cat $PID_FILE); fi
```
This file (`$PID_FILE`) contains the process ID for gunicorn. Terminating it like this respects other gunicorn processes that might be running in the background, whereas something like `pkill gunicorn` is indiscriminate.
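
To make the contrast concrete, a sketch assuming `$PID_FILE` holds the PID recorded when gunicorn was started:

```sh
# Targeted: stops only the gunicorn instance this script started.
kill "$(cat "$PID_FILE")"

# Indiscriminate: would also terminate any unrelated gunicorn processes.
pkill gunicorn
```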

alextran1502 (Contributor)

If you are running the development stack in Docker, can you still use the method in the description?

mertalev (Contributor, Author) commented on Jun 25, 2023

For the dev docker compose, a few tweaks are needed:
1. locustfile.py is outside the app folder, so testing inside the container would require the volume mapping to be changed from `../machine-learning/app:/usr/src/app` to `../machine-learning:/usr/src`.
2. load_test.sh assumes there's no running deployment; since there already is one, you can just call locust directly.
3. To display the web UI, you'll need to expose port 8089, unless `--headless` is passed (in which case results are just printed to the terminal).

I'll update it with those changes.

mertalev (Contributor, Author)

Scratch that, there's no need. If you run the dev stack, you can just run `locust` locally in the machine-learning folder without any args. You'd just need to `pip install locust` first.
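
A sketch of that workflow (the ML service's host can be entered in the Locust web UI if locustfile.py doesn't set one):

```sh
# with the dev docker compose stack already running:
pip install locust
cd machine-learning
locust    # with no args, picks up locustfile.py from the current directory
# then open http://localhost:8089 and start the test from the web UI
```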

alextran1502 merged commit a58482c into immich-app:main on Jun 25, 2023
18 checks passed
mertalev deleted the ml-locust branch on June 25, 2023 at 23:49