Make data-page hydration a new request #1878

Closed
PillFall opened this issue May 19, 2024 · 7 comments
Labels
investigate The issue needs further investigating

Comments

@PillFall

PillFall commented May 19, 2024

Version:

  • @inertiajs/vue3 version: 0.x and 1.x

I believe this applies to the other Inertia adapters as well.

Describe the problem:

I posted #1566 on the discussions page a while ago with no response, so I decided to open this as an issue, since it could be considered a bug or a design flaw.

When the page loads for the first time, all the data needed for rendering is written as JSON inside the data-page attribute on the root element. This is fine in itself; I know the JSON parser is pretty fast.

But...

From what I have been able to test, if this data is relatively big (around 1000 characters or more), it can make DOM parsing significantly slower, hurting performance and increasing the FCP by more than 3 seconds on low-end devices.

The only thing I can attribute this to is all the page data stored in the data-page attribute. If I load the same page without the bulky data-page attribute, the FCP is under 1.5 seconds at most. I used Vue.js in both tests.

I propose that, instead of embedding the data in the data-page attribute for the first load, we fetch it with a new XHR request, just as subsequent page visits do. This wouldn't differ from what an SPA does.

I know the backend would (probably) have to handle the exact same request twice, but we control the backend's processing speed and power, not the frontend's. And this can be easily mitigated with a middleware.

This would also allow us to send even more information (and maybe even faster) to the frontend to process. In addition, we would get cleaner, smaller HTML, avoiding data duplication in the DOM.
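
To make the proposal concrete, here is a rough sketch of what the first load could look like under this scheme (illustrative only; it reuses the X-Inertia header and the page object shape that Inertia already uses for subsequent visits):

// Sketch of the proposed first load, not the current behaviour: the root
// element would ship without data-page, and the client would fetch the page
// object the same way later visits already do.
const response = await fetch(window.location.href, {
  headers: { 'X-Inertia': 'true' },
})
// Standard Inertia page object: { component, props, url, version }
const initialPage = await response.json()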

@peter-emad99

What about an inline script tag with the defer attribute?

@RobertBoes
Contributor

Making an additional request on load seems rather pointless to me; it would also require a rework of the Inertia protocol and would be quite a significant breaking change. Inertia is inherently backend-driven, so all the data is already present on the initial request (such as authentication, authorization checks, etc.), only to then make another request for pretty much the same data.

One option I'd see, similar to what @peter-emad99 is suggesting, is to make the way the initial data is resolved user-configurable. So instead of these lines:

const el = isServer ? null : document.getElementById(id)
const initialPage = page || JSON.parse(el.dataset.page)
that could be done through a callback. I'd imagine something along the lines of this:

<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0, maximum-scale=1.0" />
    @vite('resources/js/app.js')
    <script>
        const inertiaPage = {{ Illuminate\Support\Js::from($page) }};
    </script>
    @inertiaHead
  </head>
  <body>
    @inertia(data: false)
  </body>
</html>

createInertiaApp({
  resolvePage: () => inertiaPage
});

Then you could still use an additional XHR request, although with the way Inertia renders the page it would do a lot of pointless work.
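
For instance, that same hypothetical resolvePage callback could return the inline object when it exists, or fall back to an XHR request (sketch only; neither resolvePage nor the data: false Blade argument exists in the current API):

createInertiaApp({
  // Hypothetical option from the suggestion above, not part of the current API.
  // Use the inline page object when the template provided one; otherwise fetch
  // it over the wire with the existing Inertia protocol header.
  resolvePage: () =>
    typeof inertiaPage !== 'undefined'
      ? inertiaPage
      : fetch(window.location.href, { headers: { 'X-Inertia': 'true' } })
          .then((response) => response.json()),
})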

@driesvints driesvints added the enhancement New feature or request label May 21, 2024
@driesvints driesvints assigned reinink and unassigned reinink May 21, 2024
@driesvints driesvints added investigate The issue needs further investigating and removed enhancement New feature or request labels May 21, 2024
@PillFall
Author

all the data is already present on the initial request (such as authentication, authorization checks etc), only then to make another request for pretty much the same data.

Couldn't this be mitigated with middleware priority?

If we process the Inertia middleware first, we can check whether the request is an Inertia request (I assume via the X-Inertia header) or a normal browser visit, and perform one action (load the base HTML) or the other (continue to the controller).

I barely know how Inertia works, so I don't know if this is possible, but I think it is doable.

@reinink
Member

reinink commented May 22, 2024

Hey there!

So I did some benchmarking to see if the fact that we're using the data-page attribute is actually causing slowdowns. From all my testing that doesn't appear to be the case. Some details...

Parsing JSON from data attribute

First I wanted to test to see how long it takes the browser to get the data from the data attribute and parse it. Here's how I tested that:

console.time('Parse data attribute');
const data = JSON.parse(document.getElementById('app').dataset.page)
console.timeEnd('Parse data attribute');

This is effectively the exact same thing Inertia.js does internally.

In my first test I sent an enormous 2 MB data payload from my controller, then used Chrome's CPU throttling to slow things down 6x, and the parsing of the data attribute took 17 ms.

I also tested this with a more realistic example — the largest page on Tailwind UI (https://tailwindui.com/components), which has a pretty large data payload. This parsed in 0.91 ms with the 6x CPU slowdown applied.

So based on these tests, it does not appear that the actual parsing of the data attribute is an issue. Even with a 6x CPU slowdown, our real world example parsed in less than 1ms, which is crazy fast.

Data attribute vs script tag

But maybe it's not the parsing of the data attribute in JavaScript that's the issue, but rather the initial HTML payload that's causing a slow initial render.

In this case, I wanted to know if the browser parsed the HTML with the data attribute slower than if we just provided the data as a ready-to-go JavaScript object.

For this test I created two simple files, one with a data attribute and another with a script tag, both with the same 2 MB data payload:

Data attribute test file:

<html>
<body>

<div id="app" data-page="{...}"></div>

<script>
  const data = JSON.parse(document.getElementById('app').dataset.page)
  document.getElementById('app').innerHTML = data.props.auth.user.name
</script>

</body>
</html>

Script test file:

<html>
<head>
<script>
    var data = {...}
</script>
</head>

<body>

<div id="app"></div>

<script>
  document.getElementById('app').innerHTML = data.props.auth.user.name
</script>

</body>
</html>

When running this test, Lighthouse gave both files the exact same performance score, and both had the exact same First Contentful Paint.

Conclusion

So, based on these tests I don't believe that there are any performance issues specifically being caused by passing data via a data attribute to Inertia.js on the initial page load.

I think the more reasonable cause for the slow FCP is simply the size of the data payload being sent to the browser. Whether you're using Inertia.js or sending plain HTML to the browser, if you send a ton of data it's going to take the browser some time to parse and render that payload.

Inertia.js doesn't somehow magically make these kinds of performance bottlenecks go away. My recommendation has always been to be careful how much data you send from your controller to your page components.

Also, I don't think that it generally makes sense to load this data via a secondary request — a single request is always going to be faster than two requests, plus there are SEO benefits to this if you're using SSR. That said, if you want to load this data via a secondary request, Inertia.js has a feature for this...

Lazy data evaluation

If your app does require that a ton of data be rendered on a particular page and you want to speed up that initial page visit, I'd recommend looking at Inertia's "lazy data evaluation" feature: https://inertiajs.com/partial-reloads#lazy-data-evaluation

Using the Inertia::lazy() callback you can defer passing certain data to a component on the initial visit to a page, and then manually load that data using a partial reload.

For example, this is what your controller might look like:

return Inertia::render('Users/Index', [
    'search' => Request::input('search'),
    'users' => Inertia::lazy(fn () => $filteredUsers),
]);

And then in your Vue page component, you can lazily load this data using a partial reload once the component mounts:

<script setup>
import { Head, router } from '@inertiajs/vue3'
import { onMounted } from 'vue'

defineProps({ users: Array })

onMounted(() => {
  router.reload({ only: ['users'] })
})
</script>

You just need to make sure that your Vue component gracefully handles the situation where the users prop is undefined (before the users are loaded via the partial reload).
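
For example, a minimal template sketch (the markup and fields like user.id and user.name are placeholders) that guards on the prop still being undefined:

<template>
  <!-- users stays undefined until the partial reload in onMounted completes -->
  <p v-if="users === undefined">Loading users…</p>
  <ul v-else>
    <li v-for="user in users" :key="user.id">{{ user.name }}</li>
  </ul>
</template>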

Here's what this looks like in our Vue playground:

(Video attachment: Screen.Recording.2024-05-22.at.8.54.53.AM.mov)

Hope that helps! 👍

@reinink reinink closed this as completed May 22, 2024
@PillFall
Author

But maybe it's not the parsing of the data attribute in JavaScript that's the issue, but rather the initial HTML payload that's causing a slow initial render.

@reinink That is exactly what I mean, and that is the reason I want to get rid of data-page. I know the JSON parsing is fast, but the JSON parsing is not the problem; the problem is the DOM load (or creation), which is a browser-level kind of issue.

When the FCP vital is recorded, neither the first Vue component nor the data-page has been processed yet; the only thing that has been downloaded and parsed is the HTML. That is what is slowing down the FCP.

I'm using pagination and lazy loading to reduce the data sent before the page loads as much as I can. I still don't know why this is happening, but as I said, it doesn't have anything to do with JS.

Without data-page: [screenshot of the Lighthouse report]

With data-page: [screenshot of the Lighthouse report]

Notice that for a simple login view, Lighthouse is not able to report the FCP, as the page took almost 13 to 16 seconds to load on a low-end device (a phone from late 2005; we have users with even older devices), while the TTFB is only 25 ms.

@PillFall
Author

@reinink

@nick-potts

I wrote a mockup along these lines.

#1887

Instead of a separate request later, or a slow request initially, it could be a streamed request with the data resolved when it's available: the best of both worlds.
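
As a rough illustration of the idea (not necessarily how #1887 implements it), the initial HTML could expose a promise that a later chunk of the same streamed response resolves once the backend has the data ready:

<div id="app"></div>
<script>
  // Flushed immediately with the shell: the client adapter would await this
  // promise instead of reading a data-page attribute.
  window.pageData = new Promise((resolve) => {
    window.resolvePageData = resolve
  })
</script>

<!-- ...later in the same streamed response, once the props are available: -->
<script>
  window.resolvePageData({ component: 'Users/Index', props: { /* ... */ } })
</script>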
