Load test FAILED: where could this come from? #116

Closed
2 of 7 tasks
MichaelBitard opened this issue Jun 25, 2021 · 2 comments

@MichaelBitard
Contributor

MichaelBitard commented Jun 25, 2021

  • RAM too low? => probably not, since the docker logs show answers are being handled, though some take more than 30 seconds
  • NGINX timeout?
  • FastAPI / gunicorn timeout (timeout is 180)? => probably not (the usual place for this setting is sketched just after this list)
  • ES timeout?
  • Netlify timeout? => probably not
  • etalab DNS timeout? => probably not
  • Chrome (browser) stack overflow?
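
Since the gunicorn "timeout 180" is mentioned above, here is a minimal sketch of where that setting usually lives, assuming a standard gunicorn config module (this is not the project's actual config):

```python
# gunicorn_conf.py -- illustrative only, not this project's actual config.
# The "timeout 180" above would normally be set here (or passed as
# --timeout 180 on the command line); a worker that exceeds it is killed,
# so the client sees a dropped connection rather than a 503.
workers = 2                                     # hypothetical worker count
worker_class = "uvicorn.workers.UvicornWorker"  # typical gunicorn worker for FastAPI
timeout = 180                                   # seconds before gunicorn kills a silent worker
```

With a 180-second worker timeout, a consistent ~10-second cutoff is unlikely to come from gunicorn itself, which is why this hypothesis is marked "probably not".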

Sounds like a timeout of 10 seconds. In the logs I see some calls answering in more than 10 seconds, so it could be linked to NGINX or FastAPI.

Linked: deepset-ai/haystack#1228

@MichaelBitard
Contributor Author

MichaelBitard commented Jun 25, 2021

First hint: the RequestLimiter in haystack seems to respond with 503 errors saying "The server is busy processing requests."

Steps to reproduce:
I launched 4 parallel curls with tmux: https://asciinema.org/a/Ov6d1NmzTzv6tPRXcKInm6pcS
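
For completeness, a rough Python equivalent of that reproduction (a sketch only; the URL and payload are hypothetical placeholders for the actual API endpoint):

```python
# Fire several requests in parallel and print each status code.
# Once all the limiter's slots are taken, the surplus requests should come
# back immediately as HTTP 503 "The server is busy processing requests."
import concurrent.futures
import requests

URL = "http://localhost:8000/query"       # hypothetical endpoint
PAYLOAD = {"query": "test de charge"}     # hypothetical request body

def call(i: int) -> str:
    resp = requests.post(URL, json=PAYLOAD, timeout=60)
    return f"request {i}: HTTP {resp.status_code}"

with concurrent.futures.ThreadPoolExecutor(max_workers=5) as pool:
    for line in pool.map(call, range(5)):
        print(line)
```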

I found a hardcoded value of 4 parallel requests:

`concurrency_limiter = RequestLimiter(4)` in `haystack/rest_api/controller/search.py`
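
For context, a minimal sketch of what a limiter like this plausibly does (this is not haystack's actual implementation): it hands out a fixed number of slots, and any request that cannot get one is rejected immediately with the 503 seen above instead of being queued, which is exactly what a load test records as failures.

```python
# Sketch of a fixed-slot request limiter -- NOT haystack's actual code.
import threading
from contextlib import contextmanager

from fastapi import HTTPException

class RequestLimiter:
    def __init__(self, limit: int):
        self._semaphore = threading.Semaphore(limit)

    @contextmanager
    def run(self):
        # Non-blocking acquire: if all slots are busy, reject right away.
        if not self._semaphore.acquire(blocking=False):
            raise HTTPException(
                status_code=503,
                detail="The server is busy processing requests.",
            )
        try:
            yield
        finally:
            self._semaphore.release()

concurrency_limiter = RequestLimiter(4)  # the hardcoded value found in search.py
```

Raising that constant (or making it configurable) would let more requests run concurrently, at the cost of heavier load on the backend behind the API.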

@guillim
Contributor

guillim commented Jun 28, 2021

Created this issue in Haystack: deepset-ai/haystack#1229
