Limit concurrency to 1 #57
Comments
I would use ‘1 per second’ granularity.
Hi @karolzlot, I'd be curious to know what your use case is, if you can share?
Thank you @laurentS. I was thinking about using file locks, but I will also check your suggestions about mutexes and semaphores. My use case: I have many endpoints that work like this:
Those endpoints are usually fired soon after new tasks become available (and also every 10 minutes, just in case). If two or more tasks become available at the same time, it could cause race conditions: each task would be executed two or more times, which would result in, for example, two identical emails being sent or two identical invoices being issued. To avoid worrying about race conditions, the easiest solution is to limit concurrency to 1. The speedup from concurrency wouldn't matter much in this case anyway, and I could always use async inside the endpoint if needed. These endpoints don't take any REST parameters (by design): they are simply fired, they know what to do, and they only return a short message confirming that everything went OK. Because I need to limit concurrency to exactly 1, I don't need to worry about a solution that would also work in a multi-server setup (I don't need multiple servers for endpoints that are limited to a concurrency of 1 anyway 😄). In the past I used two solutions for limiting concurrency:
Now I just want to do the same, but with FastAPI and without Google Cloud Run. Those solutions also limit concurrency per server, whereas I would prefer to limit concurrency per endpoint. Ideally, a request would simply wait a bit if the same endpoint is already executing.
The more I read about it, the more I think some kind of database lock may suit me better. It's not so much that I need to limit concurrency per endpoint; it's that I need to limit concurrency per row in the database.
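One common per-row pattern that avoids explicit locks is an atomic "claim" `UPDATE`: a worker flips the row's status in a single statement and checks the affected row count, so only one of two concurrent workers wins. A minimal sketch using the stdlib `sqlite3` module (the `tasks` table and `status` column are hypothetical); on PostgreSQL you could instead take a true row lock with `SELECT ... FOR UPDATE` inside a transaction:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tasks (id INTEGER PRIMARY KEY, status TEXT)")
conn.execute("INSERT INTO tasks VALUES (1, 'pending')")
conn.commit()

def try_claim(conn: sqlite3.Connection, task_id: int) -> bool:
    """Atomically claim a task; only the first caller gets True."""
    cur = conn.execute(
        "UPDATE tasks SET status = 'running' "
        "WHERE id = ? AND status = 'pending'",
        (task_id,),
    )
    conn.commit()
    # rowcount is 1 only if this statement actually flipped the row
    return cur.rowcount == 1

first = try_claim(conn, 1)   # claims the row
second = try_claim(conn, 1)  # row is no longer 'pending', claims nothing
```

Because the check (`status = 'pending'`) and the write happen in one statement, the database serializes the two claims and a task can never be processed twice.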
In my use case I want to limit concurrency to 1 for some endpoints to avoid race conditions.
I am looking for a limiter that will reliably prevent two simultaneous requests to the same endpoint.
I wonder if this library will suit my needs?
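Worth noting: a rate limiter caps requests per time window (e.g. 1 per second), which is not quite the same as "at most one in flight" — a slow request could still overlap with the next window's request. If overlapping calls should be rejected outright rather than queued, an `asyncio.Semaphore(1)` with a busy check is one option. A plain-asyncio sketch, with `handler()` as a hypothetical endpoint body (in FastAPI you might raise an `HTTPException` with status 503 instead of returning "busy"):

```python
import asyncio

_sem = asyncio.Semaphore(1)

async def handler() -> str:
    # Safe in single-threaded asyncio: no await between the check and
    # the acquire below, so no other request can sneak in between.
    if _sem.locked():              # another request is already running
        return "busy"
    async with _sem:
        await asyncio.sleep(0.01)  # placeholder for the real work
        return "OK"

async def main():
    # Two overlapping "requests": the first runs, the second is rejected.
    return await asyncio.gather(handler(), handler())

results = asyncio.run(main())
```

This guarantees two requests never execute concurrently, at the cost of the caller having to retry; the `asyncio.Lock` waiting variant is the alternative when requests should queue instead.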