
Introduce a simple prefetching solution #957

Merged
merged 18 commits into from Feb 15, 2017
Conversation

arunoda
Contributor

@arunoda arunoda commented Feb 2, 2017

Basically, this is a universal solution that works everywhere.
We simply invoke an XHR for all the prefetching URLs.

But we don't parse the JSON until it's actually used.

Fixes #956
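The approach can be sketched roughly like this (the names and shape here are hypothetical and details such as response cloning are ignored; the real code is in lib/prefetch.js):

```javascript
// Start the request right away, but defer JSON parsing until the
// page is actually used for a navigation.
const PREFETCHED_URLS = {}

function prefetch (url, fetchFn = fetch) {
  if (!PREFETCHED_URLS[url]) {
    // Store the raw response promise; no JSON parsing happens here.
    PREFETCHED_URLS[url] = fetchFn(url)
  }
  return PREFETCHED_URLS[url]
}

async function getPrefetchedJSON (url, fetchFn = fetch) {
  const res = await prefetch(url, fetchFn)
  // Parse only when the result is needed.
  return res.json()
}
```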

lib/prefetch.js Outdated

PREFETCHED_URLS[url] = fetch(url)
Contributor Author

We could try to do this after the load event.
By the time we reach this code, we have already passed DOMContentLoaded.

Since we don't do any processing, this should be fine.
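Deferring until the load event could look roughly like this (a hypothetical helper, not the PR's code; the window object is injected so the same function is a no-op on the server):

```javascript
// Run prefetch work only after the window `load` event, so it never
// competes with critical resources during the initial page load.
function schedulePrefetch (run, win) {
  if (!win) return // on the server there is nothing to prefetch
  if (win.document.readyState === 'complete') {
    run() // `load` has already fired
  } else {
    win.addEventListener('load', run, { once: true })
  }
}
```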

@sarukuku

sarukuku commented Feb 2, 2017

This is related to #956. 🤖

@timneutkens
Member

@sarukuku added it to the introduction text so it's linked 👍

@arunoda
Contributor Author

arunoda commented Feb 2, 2017

@rauchg I did a load test. Basically, I prefetched a lot of pages in a setInterval.

This is Chrome's timeline profiler view:

(screenshot: Chrome timeline profile, screen_shot_2017-02-02_at_6_31_24_pm)

Basically, it spends a very small amount of time (microseconds) sending the request and receiving it.
That's actually smaller than the setInterval call itself.
(So I think, performance-wise, this is fine.)

const { Component, err, xhr } = routeInfo.data = await this.fetchComponent(route)
const ctx = { err, xhr, pathname, query }
const { Component, err } = routeInfo.data = await this.fetchComponent(route)
const ctx = { err, pathname, query }
Contributor

Why did we change ctx? I remember we use xhr on the error page.

Contributor Author

The reason is that the prefetched version doesn't have an xhr object. I'll look into this more.

lib/prefetch.js Outdated
return (typeof navigator !== 'undefined' && navigator.serviceWorker)
}
// Add fetch polyfill for older browsers
import 'whatwg-fetch'
Contributor

I wonder if importing this module is okay, since it would be loaded on the server too?

@mikker

isomorphic-fetch maybe?

Contributor Author

@nkzawa loading it is okay, but I think we need to guard it so it only applies on the client side.
@mikker Yep, it would work, but we don't use this on the server side.
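Such a client-side guard could be sketched like this (the shape is an assumption; only the `whatwg-fetch` package name comes from the thread):

```javascript
// Decide whether the fetch polyfill is needed: only in a browser,
// and only when `fetch` is actually missing.
function needsFetchPolyfill (win) {
  return Boolean(win) && typeof win.fetch !== 'function'
}

if (needsFetchPolyfill(typeof window !== 'undefined' ? window : null)) {
  require('whatwg-fetch') // patches window.fetch on older browsers
}
```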

@nkzawa
Contributor

nkzawa commented Feb 2, 2017

I wonder how this will affect the limit of concurrent HTTP requests in the browser. (Is there a limit if we use a service worker?)

@arunoda
Contributor Author

arunoda commented Feb 2, 2017

> I wonder how this will affect the limit of concurrent HTTP requests in the browser. (Is there a limit if we use a service worker?)

@nkzawa I think I need to find out more about that.

@rauchg
Member

rauchg commented Feb 2, 2017

By the way, if we're going to ditch the SW, I'm ok with prefetching by default and ditching next/prefetch.

@arunoda
Contributor Author

arunoda commented Feb 2, 2017

@rauchg I am not sure about that.
We can move the next/prefetch logic somewhere else.
But I assume the user still has to tell us what to prefetch and what not to.

@FrancosLab

Are there any gains or losses from ditching the service worker vs. keeping it?

@rauchg
Member

rauchg commented Feb 3, 2017

We think this only has upsides so far

@arunoda
Contributor Author

arunoda commented Feb 3, 2017

@nkzawa actually it depends. Usually, it's around 4-8 connections per origin.

So, yes, there's an issue with that. (But it applies to the SW approach as well.)
I couldn't find any docs about SW-specific limits, though.

With HTTP/2 there won't be such a problem because of multiplexing, but prefetching will still affect the load time of other resources on that origin.

So, here's my idea:

  1. We can throttle prefetching to load just 2 JSON pages at once.
  2. I'll move the codebase into the router.
  3. I'll add prefetch APIs to next/link.
  4. Let's deprecate next/prefetch.

@msand

msand commented Feb 3, 2017

@nkzawa @arunoda

Even with HTTP/2 multiplexing you have some device-dependent limits on the number of concurrent requests, but they go much higher. It can become limited by the memory allocated for the responses; somewhere around 25 MB of pre-allocated header memory can easily cause problems even in desktop browsers. But within that you can easily afford 100+ concurrent requests, which should cover most needs. If you need more, it makes sense to use websockets and keep track of requests and responses in JS.

These limits should apply both to service workers and other HTTP requests. Also, when you run out of per-process memory, you will get e.g. net::ERR_INSUFFICIENT_RESOURCES in Chrome/ium.
https://chromium.googlesource.com/chromium/src/+/58.0.3001.2/content/browser/loader/resource_dispatcher_host_impl.cc#2341

Anyway, throttling prefetching to 2 concurrent requests is probably a sane default, considering mobile.

@arunoda
Contributor Author

arunoda commented Feb 6, 2017

@nkzawa Implemented what I suggested here.

I've changed the core router API to always use fetch.
That's cleaner.

I need to do a few more tests and write some test cases.
In the meantime, could you check this?

lib/prefetch.js Outdated
if (this.props.prefetch !== false) {
prefetch(href)
if (!linkPrinted) {
const message = '> You are using deprecated "next/prefetch". It will be removed with Next.js 2.0.\n' +
Contributor Author

@nkzawa I suggested dropping this in the 2.0 final release. I hope that's okay.

Member

I'd like to @arunoda

@nkzawa
Contributor

nkzawa commented Feb 6, 2017

I think xhr and fetch are not equivalent when you need to abort requests. xhr's abort should be more efficient?

@@ -185,7 +194,7 @@ export default class Router extends EventEmitter {

try {
const { Component, err, xhr } = routeInfo.data = await this.fetchComponent(route)
const ctx = { err, xhr, pathname, query }
const ctx = { err, pathname, query, xhr }
Contributor

We should rename xhr if we replace it.

Contributor Author

Yeah, that's on my todo list.
How about "jsonPageRes"?

@arunoda
Contributor Author

arunoda commented Feb 7, 2017

@nkzawa Yep. With fetch we can't really abort. My suggestion is that we don't need to abort.
With this change, we just discard the response.

Usually, it's very rare for an xhr abort to happen. Even when we do abort, the request might have already reached the server.
So we don't get a big win.

Since we also do prefetching and have aggressive caching, this won't be a problem.
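The "discard instead of abort" idea can be sketched like this (hypothetical names, not the router's actual code):

```javascript
// Remember which navigation is current, and silently drop responses
// that arrive for an outdated one. The request itself isn't aborted.
let currentRoute = null

async function fetchComponent (route, fetchFn) {
  currentRoute = route
  const res = await fetchFn(route)
  if (route !== currentRoute) return null // stale response: discard
  return res
}
```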

@arunoda
Contributor Author

arunoda commented Feb 7, 2017

@nkzawa This is ready.

#### With `<Link>`

You can substitute your usage of `<Link>` with the default export of `next/prefetch`. For example:
You can add a `prefetch` prop to any `<Link>` and Next.js will prefetch those pages in the background.
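For illustration, a usage sketch of the opt-in prop (the page names here are made up):

```jsx
import Link from 'next/link'

export default () => (
  <div>
    {/* prefetched in the background */}
    <Link prefetch href='/about'><a>About</a></Link>
    {/* a normal link; nothing is prefetched */}
    <Link href='/contact'><a>Contact</a></Link>
  </div>
)
```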
Contributor

I was thinking prefetching would be the default behavior. What's the reason?

Contributor Author

Do we need to do that? I think it's better if users turn it on explicitly.
That way, there are no surprises.

Contributor

👍 makes sense

lib/prefetch.js Outdated
let { pathname } = urlParse(href)
const url = `/_next/${__NEXT_DATA__.buildId}/pages${pathname}`
let apiPrinted = false
let linkPrinted = false
Contributor

We'd like to use execOnce from ./utils instead of having a separate flag for each.
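Presumably `execOnce` has roughly this shape (an assumption; only the name comes from the thread):

```javascript
// Wrap a function so it only ever runs once, replacing the separate
// `apiPrinted` / `linkPrinted` boolean flags.
function execOnce (fn) {
  let used = false
  return (...args) => {
    if (!used) {
      used = true
      fn(...args)
    }
  }
}

// e.g. a deprecation warning that prints a single time:
const warnOnce = execOnce((msg) => console.warn(msg))
```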

Contributor Author

I'll use it.

// contains a map of responses for prefetched routes
this.prefetchedRoutes = {}
this.prefetchingLockManager = new LockManager(2)
this.prefetchingRoutes = {}
Contributor

I wonder if we can simplify this state management by keeping just a Promise for each fetch?
Additionally, I think the state of prefetch and normal fetch results could be managed by a single object.

Contributor Author

Actually, we can't simplify it. But we could isolate the fetching logic and move it to some other module. Since we are fetching in parallel, we need to keep these states.

Do you want to move the fetching logic somewhere else?

Contributor

Maybe that's because of LockManager? Then I will try to refactor after this is merged.

Contributor Author

@nkzawa with or without it, since we are fetching in parallel, we have to do this.
In the previous version, we didn't have that check, so it was possible to fetch the same page multiple times.

Contributor Author

@arunoda arunoda Feb 8, 2017

Let me explain these states in more detail:

  • prefetchedRoutes - stores already prefetched routes.
  • prefetchingRoutes - keeps track of routes that are currently being prefetched, so when we are asked to prefetch a route that is already in flight, it won't be fetched again.
  • prefetchingLockManager - makes sure we only fetch two routes at once.
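A minimal sketch of such a limiter (a hypothetical implementation; the PR's real `LockManager` lives in lib/utils.js):

```javascript
// At most `limit` async jobs hold the lock at once; the rest queue up.
class LockManager {
  constructor (limit) {
    this.limit = limit
    this.running = 0
    this.waiting = [] // resolvers for queued get() calls
  }

  get () {
    return new Promise(resolve => {
      if (this.running < this.limit) {
        this.running++
        resolve()
      } else {
        this.waiting.push(resolve)
      }
    })
  }

  release () {
    const next = this.waiting.shift()
    if (next) {
      next() // hand the slot directly to the next waiter
    } else {
      this.running--
    }
  }
}
```

Note that `release()` hands the slot straight to the next waiter instead of decrementing the counter, so the cap is never exceeded even under a burst of prefetch requests.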

Contributor

I think we shouldn't fetch the same page multiple times ?

Contributor Author

@nkzawa we don't. But the user could ask us to. That's why we need to keep these states.

For example, let's say the user asks us to prefetch the following pages:

<Link prefetch href="/abc?aa=10" as="/abc/10" />
<Link prefetch href="/abc?aa=20" as="/abc/20" />
...

But we only need to fetch /abc once. That's why.
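A hypothetical sketch of the dedupe these states enable: both in-flight and completed prefetches are keyed by route, so asking twice triggers only one fetch.

```javascript
const prefetchedRoutes = {}   // completed prefetches, keyed by route
const prefetchingRoutes = {}  // routes currently in flight

async function prefetchRoute (route, fetchFn) {
  // Skip routes that are already done or already being fetched.
  if (prefetchedRoutes[route] || prefetchingRoutes[route]) return
  prefetchingRoutes[route] = true
  try {
    prefetchedRoutes[route] = await fetchFn(route)
  } finally {
    delete prefetchingRoutes[route]
  }
}
```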

lib/utils.js Outdated
@@ -53,3 +53,31 @@ export async function loadGetInitialProps (Component, ctx) {
}
return props
}

export class LockManager {
Contributor

I think we can use https://github.com/timdp/es6-promise-pool or another existing module. I feel the idea of a lock doesn't fit JS :D

Contributor Author

I'd like to use an existing lib, but promise pools and similar modules don't work here. They are good if we know the number of jobs we have to do up front.

In our case, we don't know how many prefetches we'll need or when they'll arrive. So what we need is a continuously running job manager with a max cap on parallel jobs.

Frankly, I couldn't find anything else.
Anyway, I like locks and would love to use them with JS.
Unlike in other languages, memory access is thread safe :) So, I love it.

Anyway, if you know a lib which does what we want, let's use it.

Contributor

With es6-promise-pool it looks like you don't need to know the number of jobs? And you can just recreate a pool once all existing jobs are done.

Contributor Author

@nkzawa but then we'd have to wait for all the jobs to finish, no? Usually we receive prefetch requests one by one.
Then we might need to do a setTimeout and batch all the prefetches happening in the same event-loop tick, and so on.

I thought this was nicer than that.

@arunoda
Contributor Author

arunoda commented Feb 15, 2017

Let's take this and keep iterating on it.

@arunoda arunoda merged commit 14c86be into vercel:master Feb 15, 2017
@arunoda arunoda deleted the new-prefetcher branch February 15, 2017 08:52
@HaNdTriX HaNdTriX mentioned this pull request Feb 26, 2017
@lock lock bot locked as resolved and limited conversation to collaborators Jan 18, 2019