I want to use this with the GitHub API, but the GitHub API is rate-limited to 5000 requests per hour (https://developer.github.com/v3/#rate-limiting) and sadly, the queries I need to do require more than 5000 requests. (It involves querying all the historical pull requests across several very active repositories.) Does this project have any sort of cache support, so that I could pull the results of 5000 requests into a Postgres table, wait an hour, and then update the table with 5000 more? It would also allow me to do more than one query per hour, which would be very helpful. :)
If the project doesn't support this, do you have any suggestions for how I could go about implementing it? Maybe point me at documentation? I'm familiar with PostgreSQL, but I've never used foreign data wrappers before.
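For what it's worth, this is roughly the workflow I'm imagining: query the foreign table one repository at a time, copy the rows into a plain local table, and wait out the rate-limit window between batches. Everything in the sketch below is made up on my side (the foreign table `github_pulls`, its `repo` column, the local `pulls_cache` table, and the connection string), since I don't know what this wrapper actually exposes:

```python
# Rough sketch: top up a local cache table from the foreign table,
# one rate-limit window at a time. All table/column names are hypothetical.
import time
import psycopg2

REPOS = ["org/repo-a", "org/repo-b"]     # the active repositories I care about

conn = psycopg2.connect("dbname=mydb")   # placeholder connection string
conn.autocommit = True

for repo in REPOS:
    with conn.cursor() as cur:
        # Each batch spends API requests through the foreign table,
        # but the copied rows can be re-queried locally for free.
        cur.execute(
            "INSERT INTO pulls_cache SELECT * FROM github_pulls WHERE repo = %s",
            (repo,),
        )
    # Wait for the next 5000-request window before the next repository.
    time.sleep(3600)
```

Once the rows are in `pulls_cache`, I could run as many local queries as I like without touching the API.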
No, it doesn't support caching. But you can configure your own cache server that redirects requests to the GitHub API.
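For example, a small caching proxy could sit between the wrapper and api.github.com. This is only a sketch (it caches responses forever, ignores authentication headers, pagination, and non-200 responses, and the port and cache directory are arbitrary):

```python
# Minimal caching proxy sketch: serve repeated GitHub API GETs from disk.
import hashlib
import os
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

GITHUB_API = "https://api.github.com"
CACHE_DIR = "/tmp/github_cache"          # arbitrary cache location

class CachingProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        # Key the cache on the full request path, query string included.
        key = hashlib.sha256(self.path.encode()).hexdigest()
        cache_file = os.path.join(CACHE_DIR, key)

        if os.path.exists(cache_file):
            # Cache hit: no API request is spent.
            with open(cache_file, "rb") as f:
                body = f.read()
        else:
            # Cache miss: forward to the GitHub API and store the body.
            req = urllib.request.Request(
                GITHUB_API + self.path,
                headers={"User-Agent": "fdw-cache-proxy"},
            )
            with urllib.request.urlopen(req) as resp:
                body = resp.read()
            with open(cache_file, "wb") as f:
                f.write(body)

        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    os.makedirs(CACHE_DIR, exist_ok=True)
    HTTPServer(("127.0.0.1", 8080), CachingProxy).serve_forever()
```

Then point the wrapper at http://127.0.0.1:8080 instead of api.github.com (assuming its base URL can be overridden, or via a hosts/DNS redirect), so repeated queries within the same hour are served from disk.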