
Cache duplicate HTTP requests #7

Open · wants to merge 17 commits into base: master
Conversation

sirbrillig (Owner) commented Oct 7, 2017

This is a revised version of #5, which did not actually work.

Fixes #3

To do:

  • clean up the commits to remove the proxy-fetch stuff
  • maybe move cache testing to the tests for getNotifications rather than on fetchNotifications?
  • getting null for the URL in gitnews-cli. Not sure why.
  • maybe also include a proxy fetch so we can use the same response for duplicate HTTP requests made at the same time.

sirbrillig (Owner, Author) commented Oct 7, 2017

I think I found the problem with my proxy-fetch technique. Using Response.clone() causes Response.json() to hang forever. This seems to be a known issue in node-fetch: node-fetch/node-fetch#139

I will probably need to redesign the caching system to cache decoded response data rather than the Response object itself.
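That redesign could look something like the following sketch, which stores decoded JSON keyed by URL instead of holding on to `Response` objects (so `Response.clone()` is never needed). The names `fetchJson`, `cache`, and `fakeFetch` are illustrative, not from this repo:

```javascript
// Cache decoded response data rather than Response objects, since
// Response.clone() can make Response.json() hang in node-fetch (#139).
const cache = new Map();

async function fetchJson(url, fetchImpl) {
  if (cache.has(url)) {
    return cache.get(url); // plain data; safe to hand out repeatedly
  }
  const response = await fetchImpl(url);
  const data = await response.json(); // decode once, never clone()
  cache.set(url, data);
  return data;
}

// Stand-in fetch implementation for demonstration only.
let calls = 0;
const fakeFetch = async (url) => ({
  json: async () => ({ url, call: ++calls }),
});

fetchJson('https://api.github.com/notifications', fakeFetch)
  .then(() => fetchJson('https://api.github.com/notifications', fakeFetch))
  .then((data) => console.log(data)); // second lookup is served from the cache
```

Because only plain data is cached, repeated reads of the same entry never touch the response body again.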

sirbrillig (Owner, Author) commented

Refactored to cache responses instead.

sirbrillig (Owner, Author) commented

Rebased onto v3

sirbrillig (Owner, Author) commented

The current problem, I believe, is that groups of secondary API requests are made asynchronously and all fired at once, so it is difficult for the data returned by one request to be reused by the others. The requests should probably be de-duplicated by URL before they are sent out.
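One common way to de-duplicate simultaneous requests is to key pending promises by URL, so callers asking for the same URL at the same time share one in-flight request. This is a hypothetical sketch, not this repo's implementation; `dedupedFetchJson`, `inFlight`, and `fakeFetch` are made-up names:

```javascript
// De-duplicate simultaneous requests: identical URLs share one in-flight
// promise, so a single HTTP request serves every concurrent caller.
const inFlight = new Map();

function dedupedFetchJson(url, fetchImpl) {
  if (inFlight.has(url)) {
    return inFlight.get(url); // reuse the pending request
  }
  const promise = fetchImpl(url)
    .then((response) => response.json())
    .finally(() => inFlight.delete(url)); // allow later refetches
  inFlight.set(url, promise);
  return promise;
}

// Demonstration: three concurrent calls to the same URL.
let requests = 0;
const fakeFetch = async (url) => {
  requests += 1;
  return { json: async () => ({ url }) };
};

Promise.all([
  dedupedFetchJson('https://api.github.com/repos/a', fakeFetch),
  dedupedFetchJson('https://api.github.com/repos/a', fakeFetch),
  dedupedFetchJson('https://api.github.com/repos/a', fakeFetch),
]).then(() => console.log(`requests made: ${requests}`));
```

Deleting the map entry in `finally` keeps the sharing scoped to requests that are actually in flight, so this composes with (rather than replaces) a longer-lived response cache.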
