Improve batch processing functionality and timeouts handling with large amounts of data #35
@st-polina Hey thanks for filing this issue.
I guess the question is how much we want to abstract this in the client. Should the batch method do chunking itself, or should we make the user do it? Or should we add another method that does chunking for the user? I think the SDKs should probably handle it - allow batch to take an arbitrary number of IPs, and chunk and combine the results.
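The "SDK handles it" option above could look roughly like this: accept an arbitrary list of IPs, split it into API-sized chunks, and merge the per-chunk results. This is only a sketch of the idea, not the SDK's actual implementation; `lookup_batch`, `get_batch_details`, and the 1,000-IP constant are stand-ins based on the limit described in this issue.

```python
MAX_BATCH_SIZE = 1000  # server-side limit per bulk request, per this issue

def lookup_batch(ips):
    # Hypothetical placeholder for one HTTP call to the bulk endpoint
    # (at most MAX_BATCH_SIZE IPs per call).
    return {ip: {"ip": ip} for ip in ips}

def get_batch_details(ips, chunk_size=MAX_BATCH_SIZE):
    """Chunk an arbitrary list of IPs and combine the per-chunk results."""
    results = {}
    for i in range(0, len(ips), chunk_size):
        results.update(lookup_batch(ips[i:i + chunk_size]))
    return results
```

With this shape, a caller can pass 50,000 IPs directly and never see the chunking.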
I think the issue is that the client is hitting the 2s timeout while the requests are still processing on the server, so the client can't get the results. The bulk endpoint shouldn't have a 2s response timeout when looking up 1k IPs - it should scale with the number of IPs, or have a separate timeout (e.g. 60s).
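The "scale with the number of IPs" idea can be sketched as a timeout that grows with batch size between a floor and a larger bulk cap. The constants here are illustrative assumptions, not values from the SDK; only the 2s floor and 60s cap come from this thread.

```python
BASE_TIMEOUT = 2.0     # seconds: the existing per-request floor
PER_IP_TIMEOUT = 0.05  # assumed extra budget per IP in the batch
MAX_TIMEOUT = 60.0     # separate, larger cap suggested for bulk lookups

def batch_timeout(num_ips):
    """Scale the request timeout with the number of IPs being looked up."""
    return min(MAX_TIMEOUT, max(BASE_TIMEOUT, num_ips * PER_IP_TIMEOUT))
```

A small batch keeps the 2s behavior, while a 1,000-IP batch gets a proportionally larger budget capped at 60s.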
Alright that makes sense, thanks for clarifying! So the action items in this issue are:
This is released in v4.1.0. https://pypi.org/project/ipinfo/4.1.0/ Please let me know if the client has any more feedback.
2 issues combined into one:
Processing 50,000 IP addresses with the library's batch functionality is limited to 1,000 IPs per batch, so users have to split the work into 50 separate batches. Sometimes the server times out and a request is left incomplete, yet the timed-out requests still count toward the request limit. We need to improve the batch processing functionality and timeout handling for large amounts of data.
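The manual workaround this describes - splitting 50,000 IPs into 1,000-IP batches and retrying any batch that times out - can be sketched as below. `lookup_batch` is a hypothetical stand-in for one bulk-endpoint call; a real call could raise a timeout error, and note that a retried batch may still be billed against the request limit, which is the core of the complaint.

```python
import time

def lookup_batch(ips):
    # Hypothetical single bulk-endpoint call; a real implementation
    # could raise TimeoutError when the server exceeds its deadline.
    return {ip: {"ip": ip} for ip in ips}

def lookup_all(ips, chunk_size=1000, retries=3, backoff=1.0):
    """Look up IPs in fixed-size batches, retrying timed-out batches."""
    results = {}
    for i in range(0, len(ips), chunk_size):
        chunk = ips[i:i + chunk_size]
        for attempt in range(retries):
            try:
                results.update(lookup_batch(chunk))
                break
            except TimeoutError:
                time.sleep(backoff * (attempt + 1))  # simple linear backoff
    return results
```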