
feat: byte efficiency estimates based on average throughput #1536

Merged: paulirish merged 7 commits from bytes_response_time into master on Jan 27, 2017

Conversation

@patrickhulce (Collaborator) commented Jan 25, 2017

addresses #1469 and several tasks of #1517

Switches from the previous approach of adding up the potential ms savings of each individual resource to instead estimating the time savings from the average throughput across all requests and the total bytes saved. The previous approach had shortcomings similar to our current links-blocking-first-paint audit, where we would report savings longer than the load of the page (because in reality these assets are loaded in parallel, not serially). This should give a more robust and realistic estimate of how many total milliseconds can be shaved off of the load time.

Before:
[screenshot]

After:
[screenshot]
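To make the arithmetic concrete, here is a minimal sketch of the estimate described above (function and parameter names are illustrative, not the actual Lighthouse code): the wasted bytes divided by the observed average throughput give the estimated milliseconds saved.

```js
/**
 * Minimal sketch, assuming throughput is expressed in bytes per second.
 * @param {number} totalWastedBytes bytes that could be trimmed from the load
 * @param {number} averageThroughput observed average throughput in bytes/sec
 * @return {number} estimated savings in milliseconds
 */
function estimateWastedMs(totalWastedBytes, averageThroughput) {
  return totalWastedBytes / averageThroughput * 1000;
}

// e.g. 150 KB of wasted bytes on a connection averaging 400 KB/s
estimateWastedMs(150 * 1024, 400 * 1024); // => 375 (ms)
```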

@brendankenny (Member) commented Jan 26, 2017

This is very cool.

Instead of living on NetworkRecords, does computeAverageThroughput make sense as a computed gatherer?

Would it also make sense to have an average time per domain? That makes certain questions slightly harder to answer (was the connection to a domain in some interval server-limited, or was it because there were a bunch of connections open to some other domain?), but it clears up other things, like recognizing the use of a fast CDN for static assets while ads on the same page are loaded from a much slower server.

@patrickhulce (Collaborator, Author):

> Instead of living on NetworkRecords, does computeAverageThroughput make sense as a computed gatherer?

Good point, I totally forgot about those :) Yes, I'll convert it.

> Would it also make sense to have an average time per domain?

I actually started with this, but ended up just going back to all requests for a couple reasons:

  • Until we have network request attribution for all data URI images and stylesheets, we'll still need to compute the total average throughput, so start there to keep it simple.
  • If we accept my wild assumption that the client is typically the throughput bottleneck, the differences between origins wouldn't matter much.
  • The way this metric gets used in the byte efficiency audits for the time being completely ignores response time and the fact that throughput changes over the duration of the request, so the benefit you get from a CDN having low latency and quickly saturating the network wouldn't really be surfaced anyway. (Although that's a good idea for another audit: warning about long server response times!)

Ideally, we'd have some fancy performance experimentation where we could cut off downloading after X bytes have been received and really tell you what the savings would be.

@ebidel previously requested changes Jan 26, 2017
@@ -26,6 +26,12 @@
.table_list tr:hover {
background-color: #fafafa;
}
.table_list td * {
Contributor:
Perf isn't a huge deal on the report, but * is always nasty. Is there something more specific we can use? Right now it's just br and code blocks as child elements, right?

Collaborator (Author):
Interesting, what makes it so expensive? Do reflows become a lot slower?

Contributor:
Yea. Selectors match right -> left. So * first matches every DOM node, then filters on tds, etc.

Collaborator (Author):
TIL :)

@@ -78,13 +78,18 @@ class Table extends Formatter {

    const rows = results.map(result => {
      const cols = headingKeys.map(key => {
        let value = result[key];
        if (typeof value === 'undefined') {
Contributor:
nit: !(key in value)

Collaborator (Author):
doesn't let foo; 'foo' in {foo} === true?
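For anyone following along, a quick hypothetical snippet (not from the PR) showing the difference patrickhulce is pointing at: the in operator reports a key as present even when its value is undefined, which is exactly the case the typeof check handles.

```js
let foo; // declared but never assigned, so foo === undefined

'foo' in {foo};     // true: the key exists...
typeof ({foo}).foo; // 'undefined': ...but its value is undefined

// So a guard like `!(key in result)` would miss cells whose value is
// explicitly undefined, while `typeof result[key] === 'undefined'` catches
// both missing keys and undefined values.
```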

Contributor:
not sure how many undefined vals we get, but will a bunch of '--' cells be too noisy?

Collaborator (Author):
I found it super helpful in the CLI for differentiating missing cells, and (totally subjectively) more pleasing in the HTML, but I'm open to other ideas.

@ebidel (Contributor), Jan 26, 2017:
Sounds like we might need some shared helpers for the code and empty cell formatting. Should we do that here or consolidate later?

Collaborator (Author):
Not sure what you mean, which code needs to be shared?

Contributor:
https://github.com/GoogleChrome/lighthouse/pull/1536/files#diff-508b9167df0e380cb39ba29ab7deafa2R98 for example. I could use that over in the flexbox audit to fix the whitespace formatting of the code snippet.

@patrickhulce (Collaborator, Author), Jan 26, 2017:
Oh, the fix for that is most likely the ~ added to the report Handlebars template; those lines were some more aggressive edge-case hunting, but yes, we totally should. I'm in favor of getting this PR through for consistency between the byte efficiency audits and then doing a bit of refactoring cleanup of the stuff that's starting to be repeated everywhere, if that makes sense to you too?

Contributor:
Makes sense. I couldn't get ~ to work but I might be doing Handlebad.

        case 'lineCol':
          return `${result.line}:${result.col}`;
        default:
          return result[key];
          return String(value);
Contributor:
doesn't Handlebars do this for us?

@patrickhulce (Collaborator, Author), Jan 26, 2017:
Sure, but the results also get passed directly to the pretty formatter above, which is where the startsWith handler fails with numbers.
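Roughly what is being described, as an illustrative snippet (not the PR's code): calling startsWith on a numeric cell value throws, hence the String(value) coercion before the prefix check.

```js
const value = 4096; // a numeric cell value, e.g. a byte count

// value.startsWith('data:');      // TypeError: value.startsWith is not a function
String(value).startsWith('data:'); // false, and no crash
```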

Contributor:
ack

@@ -132,6 +132,7 @@ class ReportGenerator {
// XSS, define a renderer that only transforms links and code snippets.
// All other markdown and HTML is ignored.
const renderer = new marked.Renderer();
renderer.br = () => '<br>';
Contributor:
are you using <br> anywhere?

@patrickhulce (Collaborator, Author), Jan 26, 2017:
Yes, <space><space>\n in between the URL and the content preview for stylesheets.
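For context, a small sketch of that hard-break path, assuming marked's classic callable API (the report's full renderer setup is omitted): two trailing spaces before a newline produce a br token, which the overridden renderer.br turns into a literal <br>.

```js
const marked = require('marked');

// Only the br hook is shown; the real report renderer also restricts other
// markdown constructs for XSS reasons.
const renderer = new marked.Renderer();
renderer.br = () => '<br>';

// Two trailing spaces force a markdown hard line break.
const text = 'first line  \nsecond line';
marked(text, {renderer}); // -> roughly '<p>first line<br>second line</p>'
```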

Contributor:
Ah ok. I have a PR coming that introduces <pre> to the markdown renderer. We could also use that for block-level code snippets, removing the need for <br> and the CSS you have. WDYT?

Collaborator (Author):
Yeah, just not sure how to appropriately target the pre tag then for margin only when it's following other text. Also I can't get the pre to generate with triple ticks; it still does code for me after cherry-picking :/

Collaborator (Author):
Alright, I kinda like this: hacky but with a look of intention :)
[screenshot]

@ebidel dismissed their stale review January 26, 2017 18:32

just comments

displayValue = `${Math.round(totalWastedBytes / KB_IN_BYTES)}KB potential savings`;
const totalWastedKb = Math.round(totalWastedBytes / KB_IN_BYTES);
const totalWastedMs = Math.round(totalWastedBytes / networkThroughput * 1000);
displayValue = `${totalWastedKb}KB (~${totalWastedMs}ms) potential savings`;
Member:
Because this number is super hand-wavy, let's reduce the precision some more. Shall we round to the nearest ten? e.g. 497 => 500 and 432 => 430.
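If it helps, the reduced precision being suggested is just rounding to the nearest multiple of ten; the helper name below is hypothetical, not from the PR.

```js
// Round a hand-wavy ms estimate to the nearest 10 so the display doesn't
// imply more precision than the estimate actually has.
const roundToNearestTen = ms => Math.round(ms / 10) * 10;

roundToNearestTen(497); // => 500
roundToNearestTen(432); // => 430
```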

Collaborator (Author):
sure sg

@brendankenny changed the title from "feat: byte efficiency time savings" to "feat: byte efficiency estimates based on average throughput" on Jan 26, 2017
@patrickhulce (Collaborator, Author):
I'm not getting off this easy, am I? PTAL :)

static indexStylesheetsById(styles) {
static indexStylesheetsById(styles, networkRecords) {
const indexedNetworkRecords = networkRecords
.filter(record => /css/.test(record.mimeType))
Member:
Collaborator (Author):
done

const firstRuleStart = preview.indexOf('{');
const firstRuleEnd = preview.indexOf('}');

if (firstRuleStart === -1 || firstRuleEnd === -1
Member:
Is there anything from https://github.com/ChromeDevTools/devtools-frontend/blob/b84fc38d5618d27b6a5b50d3948cb31a38488bfa/front_end/gonzales/SCSSParser.js we can reuse?

Not sure, just curious if we can leave some of the parsing/traversal to another project.

Collaborator (Author):
Yeah, I tried to use these, but because everything is expressed in terms of lines and columns, translating to absolute length ended up being even more complicated than this (and it didn't really handle leading comments such as those used in every Lighthouse file :) ). It also focuses a lot more on rules than rule-sets, which made for tricky and generally confusing logic to show the first rule-set with its selector.

Collaborator (Author):
Also, I would love to just totally kill this at some point in the future by attributing where the inline sheet came from and/or maybe showing a few of the selectors.

Member:
k sgtm


let totalBytes = 0;
const timeBoundaries = networkRecords.reduce((boundaries, record) => {
if (/^data:/.test(record.url) || record.failed || !record.finished ||
Member:
could use record.parsedURL().scheme here if you want.

Collaborator (Author):
yessss, this is what I've been looking for.
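For readers skimming the diff above, here is a simplified sketch of the average-throughput idea it implements. It is illustrative only: the merged code walks time boundaries so idle gaps between requests are excluded, and the record fields used here (url, failed, finished, transferSize, startTime, endTime) are assumed names rather than a guaranteed API.

```js
// Simplified sketch: total transferred bytes over the span of observed
// network activity, skipping data URIs and failed/unfinished requests.
function computeAverageThroughput(networkRecords) {
  let totalBytes = 0;
  let startTime = Infinity;
  let endTime = -Infinity;

  for (const record of networkRecords) {
    if (/^data:/.test(record.url) || record.failed || !record.finished) {
      continue;
    }

    totalBytes += record.transferSize;
    startTime = Math.min(startTime, record.startTime);
    endTime = Math.max(endTime, record.endTime);
  }

  // bytes per second
  return totalBytes / (endTime - startTime);
}
```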

@paulirish merged commit aa4e711 into master Jan 27, 2017
@paulirish deleted the bytes_response_time branch January 27, 2017 22:53