question about the decisions made in the repository
question about how to use this project
Summary
Is there a way to get the pages of a report as they arrive, instead of waiting for all the pages to come back? One of our clients has a couple hundred thousand keywords. Waiting for every page to return uses a ton of memory and takes long enough to cause timeouts in Lambda. Is there a way to collect/return the results as they come back from the API instead of after every page has completed? Maybe a callback that could be inserted into the page-looping process?
Other information
There's no way to do that at the moment, although it's something we would consider in the future. The thing to keep in mind is that if this library can't accumulate the dataset in memory, it's likely that whatever you do with the data afterwards would also choke, even if the data were streamed in (unless you make sure to discard each chunk as it is processed). Also remember that a couple hundred thousand keywords takes far less memory if you request fewer fields per row.
Still, this is a good feature to have, and we'd be happy to include it in this library.
Would you consider contributing a PR for this? If not, that's okay, but it may take longer for us to get around to it.
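For anyone interested in contributing this, here is a rough sketch of the shape such a feature might take. Note that `fetch_page`, `stream_report_pages`, and the page/offset parameters are all hypothetical stand-ins, not this library's actual API: the idea is simply to yield each page as soon as its request returns, so the caller can process and discard it instead of holding the full report in memory.

```python
# Hedged sketch: a generator-based pager. fetch_page(offset, limit) is a
# stand-in for whatever per-page HTTP call the library makes; it returns
# a list of rows, or an empty list when the report is exhausted.

def stream_report_pages(fetch_page, page_size=100):
    """Yield one page of rows at a time, as each page request completes."""
    offset = 0
    while True:
        page = fetch_page(offset, page_size)
        if not page:
            break
        yield page  # caller processes the page, then it can be garbage-collected
        offset += len(page)

# Demo with a fake API backed by an in-memory dataset of 250 "keywords".
DATA = [{"keyword": f"kw-{i}"} for i in range(250)]

def fake_fetch_page(offset, limit):
    return DATA[offset:offset + limit]

total_rows = 0
page_count = 0
for page in stream_report_pages(fake_fetch_page):
    total_rows += len(page)
    page_count += 1

print(total_rows, page_count)  # 250 3
```

A callback variant is trivially layered on top (`for page in stream_report_pages(...): on_page(page)`), but a generator keeps control flow in the caller's hands, which tends to suit Lambda handlers that need to checkpoint or flush between pages.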