Have query return generator to reduce server stress #64
I do think it is a good idea to try and reduce the stress on the server, but:
Overall I see the point in this improvement, but I don't think it is too important. The largest impact it will have is on very large queries and the resulting long-duration download jobs.
It's about time I followed up on this topic, since I initiated it. I agree with all of your points and now think that it would not be a practical or reasonable approach: it would make the API more complicated without providing much benefit. If the queries are spread out over a longer timeframe, e.g. when downloading a long list of products, then the possibility of the queries being affected by server downtime becomes quite likely, considering the unfortunate unreliability of SciHub. On the other hand, if the first subquery succeeds, then any consecutive ones are quite likely to succeed as well, so it's better to take advantage of that. @willemarcel or @j08lue, I think we have reached a consensus here and you can safely close this issue.
@valgur, in the context of the last PR you suggested having `load_query` return a generator over the pages of long (> 100) queries, such that the download function advances only after downloading each page. This might reduce server stress, because the queries would not be sent right after each other. I am not sure how big the benefit on the server side is, but I think this would be good practice. It works only, though, if the user does not need to list the details of all products returned by the query before downloading them. So in the CLI `search` function, we still need to load all the pages immediately.
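The idea above can be sketched as a lazy pagination generator. This is only an illustration of the proposal, not sentinelsat's actual API: the names `query_pages`, `fetch_page`, and the page size of 100 are assumptions for the example.

```python
def query_pages(fetch_page, page_size=100):
    """Yield query results one page at a time (lazy pagination).

    `fetch_page` is a hypothetical callable taking (offset, limit) and
    returning a list of products; it stands in for the real server request.
    The next request is only issued when the consumer asks for the next page.
    """
    offset = 0
    while True:
        page = fetch_page(offset, page_size)
        if not page:  # empty page means the query is exhausted
            return
        yield page
        offset += page_size


def download_all(fetch_page, download, page_size=100):
    """Advance the generator only after downloading each page,
    so page requests are naturally spread out over the download time."""
    for page in query_pages(fetch_page, page_size):
        for product in page:
            download(product)
```

The trade-off discussed above shows up directly: a consumer that needs the full product list up front (like the CLI `search` function) has to drain the generator immediately, which negates the benefit.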