
Add measures and logging for LCP diagnostics #112

Closed
rviscomi opened this issue Apr 7, 2023 · 1 comment · Fixed by #115
rviscomi commented Apr 7, 2023

The Optimize LCP guide describes how to instrument DevTools with additional LCP diagnostic info. It neatly logs each diagnostic in a table and uses performance.measure() to show them in the Performance panel.
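For context, the pattern from the guide looks roughly like this (a sketch, not the guide's verbatim code; the attribution field names are taken from the web-vitals v3 attribution build):

    import {onLCP} from 'web-vitals/attribution';

    onLCP((metric) => {
      const {attribution} = metric;
      // Log each LCP sub-part as a row in a console table.
      console.table([
        {subPart: 'Time to first byte', time: attribution.timeToFirstByte},
        {subPart: 'Resource load delay', time: attribution.resourceLoadDelay},
        {subPart: 'Resource load time', time: attribution.resourceLoadTime},
        {subPart: 'Element render delay', time: attribution.elementRenderDelay},
      ]);
    });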

Figure out whether upgrading to the attribution build in #110 is sufficient for logging the diagnostics, or whether we should add any extra table formatting like the example in the guide.

The measure feature should be gated behind the same User Timings option used by INP.
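For illustration, the gating could look roughly like this (a sketch only; the enableUserTiming storage key is a placeholder, not necessarily the extension's real option name):

    // Sketch: read the (hypothetical) User Timings option before emitting any measures.
    async function userTimingsEnabled() {
      const options = await chrome.storage.sync.get({enableUserTiming: false});
      return options.enableUserTiming;
    }

    onLCP(async (metric) => {
      if (await userTimingsEnabled()) {
        // addUserTimings() is the extension helper shown in the comment below.
        addUserTimings(metric);
      }
    });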

rviscomi added the enhancement label on Apr 7, 2023
tunetheweb commented:
Should be easy enough to edit this code after #115 is merged:

function addUserTimings(metric) {
  switch (metric.name) {
    case "LCP":
      // LCP has a loadTime/renderTime (startTime), but not a duration.
      // Could visualize relative to timeOrigin, or from loadTime -> renderTime.
      // Skip for now.
      break;

To do something similar to what INP does, using the attribution breakdown data:

case "INP":
if (metric.entries.length > 0) {
const inpEntry = metric.entries[0];
// RenderTime is an estimate, because duration is rounded, and may get rounded keydown
// In rare cases it can be less than processingEnd and that breaks performance.measure().
// Lets make sure its at least 4ms in those cases so you can just barely see it.
const presentationTime = inpEntry.startTime + inpEntry.duration;
const adjustedPresentationTime = Math.max(inpEntry.processingEnd + 4, presentationTime);
performance.measure(`[Web Vitals] INP.duration (${inpEntry.name})`, {
start: inpEntry.startTime,
end: presentationTime,
});
performance.measure(`[Web Vitals] INP.inputDelay (${inpEntry.name})`, {
start: inpEntry.startTime,
end: inpEntry.processingStart,
});
performance.measure(`[Web Vitals] INP.processingTime (${inpEntry.name})`, {
start: inpEntry.processingStart,
end: inpEntry.processingEnd,
});
performance.measure(`[Web Vitals] INP.presentationDelay (${inpEntry.name})`, {
start: inpEntry.processingEnd,
end: adjustedPresentationTime,
});
}
break;
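
For illustration, the LCP case might then look something like this (a rough sketch, not final code; it assumes the v3 attribution build's timeToFirstByte, resourceLoadDelay, resourceLoadTime, and elementRenderDelay fields, and measures each sub-part from the time origin):

    case "LCP":
      if (metric.attribution) {
        const attr = metric.attribution;
        // Break the LCP value down into its sequential sub-parts, measured from
        // the time origin (each sub-part starts where the previous one ends).
        const ttfb = attr.timeToFirstByte;
        const loadDelayEnd = ttfb + attr.resourceLoadDelay;
        const loadTimeEnd = loadDelayEnd + attr.resourceLoadTime;
        performance.measure('[Web Vitals] LCP.timeToFirstByte', {
          start: 0,
          end: ttfb,
        });
        performance.measure('[Web Vitals] LCP.resourceLoadDelay', {
          start: ttfb,
          end: loadDelayEnd,
        });
        performance.measure('[Web Vitals] LCP.resourceLoadTime', {
          start: loadDelayEnd,
          end: loadTimeEnd,
        });
        performance.measure('[Web Vitals] LCP.elementRenderDelay', {
          start: loadTimeEnd,
          end: metric.value,
        });
      }
      break;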

However, the extension currently uses the normal build, so ideally we'd move to the attribution build to get this. Does anyone have any concerns with that? I think it adds a small bit of extra processing, but it will give us better data.
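
For reference, moving over is mostly an import change, since the attribution build exposes the same callbacks plus a metric.attribution object:

    // Normal build (what the extension uses today):
    // import {onLCP, onINP, onCLS} from 'web-vitals';

    // Attribution build (same callbacks, plus metric.attribution with the breakdown data):
    import {onLCP, onINP, onCLS} from 'web-vitals/attribution';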
