Improved batch op support #4

Open
UmanShahzad opened this issue Dec 23, 2020 · 2 comments

@UmanShahzad
Contributor

Take into account use cases like ipinfo/python#35 to ensure batch ops support is scalable & robust.

@HavenDV
Contributor

HavenDV commented Dec 23, 2020

I will add this soon. There are two approaches, though: receive results sequentially, 1000 at a time, or split the request into parallel sub-requests. Can the backend handle a large number of parallel requests?

@UmanShahzad
Contributor Author

UmanShahzad commented Dec 23, 2020

The backend can handle a large number of requests in parallel. We can limit the concurrency level in the client, e.g. at most 10 concurrent batches of 1000 IPs each by default.
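
A minimal sketch of what that could look like on the client side, in Python (the `lookup_batch` helper is hypothetical, standing in for whatever batch endpoint the library ends up calling): chunk the input into batches of 1000 and cap the number of in-flight batch requests at 10.

```python
# Sketch only, not the library's actual API: batch 1000 IPs per request
# and keep at most MAX_CONCURRENCY batch requests in flight at once.
import asyncio

BATCH_SIZE = 1000        # max IPs per batch request, as discussed above
MAX_CONCURRENCY = 10     # default cap on parallel batch requests

async def lookup_batch(batch):
    # Hypothetical stand-in for the real batch API call.
    await asyncio.sleep(0.1)  # simulate network latency
    return {ip: {"ip": ip} for ip in batch}

async def lookup_all(ips):
    semaphore = asyncio.Semaphore(MAX_CONCURRENCY)
    batches = [ips[i:i + BATCH_SIZE] for i in range(0, len(ips), BATCH_SIZE)]

    async def limited(batch):
        async with semaphore:  # at most MAX_CONCURRENCY batches run concurrently
            return await lookup_batch(batch)

    results = {}
    for partial in await asyncio.gather(*(limited(b) for b in batches)):
        results.update(partial)
    return results

if __name__ == "__main__":
    ips = [f"10.0.{i // 256}.{i % 256}" for i in range(2500)]  # 3 batches
    print(len(asyncio.run(lookup_all(ips))))  # -> 2500
```

A semaphore keeps the cap independent of the total number of batches, so very large inputs never open more than 10 requests at once.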

Labels: None yet
Projects: None yet
Development: No branches or pull requests
2 participants