
Support for concurrent requests? #23

Closed
steveocrypto opened this issue Apr 4, 2021 · 2 comments · Fixed by #29
@steveocrypto

steveocrypto commented Apr 4, 2021

Reference: kwhitley/apicache#33

I'm having an issue with the original library not handling concurrent requests. Is this supported in apicache-plus?

@arthurfranca
Owner

Yes, it is supported, but not exactly how the issue author describes it.

  1. The first request starts caching the response while preventing subsequent requests from doing the same.
  2. The other requests receive uncached responses while the first one is finishing caching. (This is the difference. It's better than waiting for the cache to be ready, which could take time if the first connection is slow, since caching only finishes when the response ends.)
  3. After caching is finished, new requests receive cached responses.
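A minimal sketch of the three steps above. This is not apicache-plus's actual implementation; `handle`, `cache`, and `locks` are hypothetical names, assuming a per-key lock and an in-memory cache:

```javascript
const cache = new Map(); // key -> cached response body
const locks = new Set(); // keys currently being cached by a first request

async function handle(key, computeResponse) {
  // Step 3: once caching has finished, serve the cached response.
  if (cache.has(key)) return { body: cache.get(key), fromCache: true };

  // Step 2: another request holds the lock, so respond uncached
  // instead of waiting for its (possibly slow) caching to finish.
  if (locks.has(key)) {
    return { body: await computeResponse(), fromCache: false };
  }

  // Step 1: first request acquires the lock and caches its response.
  locks.add(key);
  try {
    const body = await computeResponse();
    cache.set(key, body);
    return { body, fromCache: false };
  } finally {
    locks.delete(key);
  }
}
```

With this sketch, two simultaneous requests for the same key both compute the response (only the first one caches it), and a later request is served from the cache.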

@arthurfranca
Owner

Revisiting this, I see that the "2." part is probably how it should behave for GET requests (so as not to wait for e.g. a full video download), but that isn't how it works right now. At the moment, all concurrent requests wait for the first request, which acquired a lock, to finish caching/responding. So they are delayed until the first one sends its response, which is exactly what kwhitley/apicache#33 and @steveocrypto wanted.
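A minimal sketch of the current behavior described above (again not apicache-plus's actual code; `handle`, `cache`, and `inFlight` are hypothetical names): concurrent requests await the lock holder's promise, so they are delayed until the first response finishes caching and are then served from the cache.

```javascript
const cache = new Map();    // key -> cached response body
const inFlight = new Map(); // key -> promise of the lock holder's body

async function handle(key, computeResponse) {
  if (cache.has(key)) return { body: cache.get(key), fromCache: true };

  // A concurrent request waits for the first one to finish caching,
  // then serves the cached result (the delay this comment describes).
  if (inFlight.has(key)) {
    return { body: await inFlight.get(key), fromCache: true };
  }

  // First request "acquires the lock" by registering its promise.
  const promise = computeResponse()
    .then(body => {
      cache.set(key, body);
      return body;
    })
    .finally(() => inFlight.delete(key));
  inFlight.set(key, promise);
  return { body: await promise, fromCache: false };
}
```

This coalesces all concurrent requests for a key into a single upstream call, at the cost of making every waiter as slow as the first response.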
