core: handle invalid network timing data #6780

Merged 2 commits on Dec 13, 2018
11 changes: 10 additions & 1 deletion lighthouse-core/computed/page-dependency-graph.js
@@ -269,8 +269,11 @@ class PageDependencyGraph
const networkNodeOutput = PageDependencyGraph.getNetworkNodeOutput(networkRecords);
const cpuNodes = PageDependencyGraph.getCPUNodes(traceOfTab);

const rootRequest = networkRecords.reduce((min, r) => (min.startTime < r.startTime ? min : r));
// The root request is the earliest network request, using position in networkRecords array to break ties.
Member commented: best comment

const rootRequest = networkRecords.reduce((min, r) => (r.startTime < min.startTime ? r : min));
Member commented: might be overkill, but should we pull this out into another method on NetworkAnalyzer like findMainDocument? I can see it usable by others (if they remember it's there).

Collaborator (author) replied: I'd want to shift consumers to be using the graph as the source of truth, so hopefully no one else needs to compute this too? :)

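To illustrate the tie-breaking behavior discussed above (a standalone sketch with hypothetical records, not part of the diff): because the comparison uses a strict less-than, a later record only replaces the accumulator when it starts strictly earlier, so ties resolve to whichever record appears first in the array.

// Hypothetical records; only the fields relevant to the tie-break are shown.
const records = [
  {requestId: 'A', startTime: 0},
  {requestId: 'B', startTime: 0}, // ties with 'A'
  {requestId: 'C', startTime: 5},
];
// Strict less-than means 'B' never displaces 'A', so the earlier array entry wins the tie.
const root = records.reduce((min, r) => (r.startTime < min.startTime ? r : min));
console.log(root.requestId); // 'A'
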
const rootNode = networkNodeOutput.idToNodeMap.get(rootRequest.requestId);
// The main document request is the earliest network request *of type document*.
// This will be different from the root request when there are server redirects.
const mainDocumentRequest = NetworkAnalyzer.findMainDocument(networkRecords);
const mainDocumentNode = networkNodeOutput.idToNodeMap.get(mainDocumentRequest.requestId);

@@ -279,6 +282,12 @@ class PageDependencyGraph
throw new Error(`${rootNode ? 'mainDocument' : 'root'}Node not found.`);
}

if (mainDocumentNode !== rootNode &&
(!mainDocumentNode.record.redirects ||
!mainDocumentNode.record.redirects.includes(rootNode.record))) {
throw new Error('Root node was not in redirect chain of mainDocument');
}

PageDependencyGraph.linkNetworkNodes(rootNode, networkNodeOutput);
PageDependencyGraph.linkCPUNodes(rootNode, networkNodeOutput, cpuNodes);
mainDocumentNode.setIsMainDocument(true);
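To illustrate the new redirect-chain guard (a minimal sketch using plain objects in place of NetworkRequest records; not part of the diff): with a server redirect, the earliest request becomes the root and should appear in the main document's redirects array; otherwise the graph build now throws.

// Hypothetical records: a server redirect followed by the final document request.
const rootRecord = {url: 'http://example.com', startTime: 0};
const mainDocRecord = {url: 'https://example.com', startTime: 5, redirects: [rootRecord]};

// Sketch of the check added above, written against records rather than graph nodes.
function assertRootInRedirectChain(root, mainDoc) {
  if (mainDoc !== root &&
      (!mainDoc.redirects || !mainDoc.redirects.includes(root))) {
    throw new Error('Root node was not in redirect chain of mainDocument');
  }
}

assertRootInRedirectChain(rootRecord, mainDocRecord); // passes: the root is in the redirect chain
try {
  assertRootInRedirectChain({url: 'https://other.com', startTime: 0}, mainDocRecord); // unrelated root
} catch (err) {
  console.log(err.message); // 'Root node was not in redirect chain of mainDocument'
}
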
lighthouse-core/lib/dependency-graph/simulator/network-analyzer.js
@@ -451,7 +451,8 @@ class NetworkAnalyzer
// TODO(phulce): handle more edge cases like client redirects, or plumb through finalUrl
const documentRequests = records.filter(record => record.resourceType ===
NetworkRequest.TYPES.Document);
return documentRequests.sort((a, b) => a.startTime - b.startTime)[0];
// The main document is the earliest document request, using position in networkRecords array to break ties.
return documentRequests.reduce((min, r) => (r.startTime < min.startTime ? r : min));
}
}

3 changes: 3 additions & 0 deletions lighthouse-core/lib/network-request.js
@@ -272,6 +272,9 @@ module.exports = class NetworkRequest {
* @param {LH.Crdp.Network.ResourceTiming} timing
*/
_recomputeTimesWithResourceTiming(timing) {
// Don't recompute times if the data is invalid. RequestTime should always be a thread timestamp.
// If we don't have receiveHeadersEnd, we really don't have more accurate data.
if (timing.requestTime === 0 || timing.receiveHeadersEnd === -1) return;
Member commented: ha, I believe this would have saved me a lot of time figuring out how to generate the right network request events in #6171

// Take startTime and responseReceivedTime from timing data for better accuracy.
// Timing's requestTime is a baseline in seconds, rest of the numbers there are ticks in millis.
this.startTime = timing.requestTime;
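For illustration (a standalone sketch, not part of the diff), these are the shapes the guard above treats as unusable: a requestTime of 0 is not a real thread timestamp, and a receiveHeadersEnd of -1 means the response timing is missing, so the request's existing startTime/endTime are kept.

// Hypothetical timing objects demonstrating the early return above.
const invalidTiming = {requestTime: 0, receiveHeadersEnd: -1};        // bail out, keep existing times
const validTiming = {requestTime: 1234.5678, receiveHeadersEnd: 150}; // recompute from timing data

function hasUsableTiming(timing) {
  // Mirrors the guard: a non-zero requestTime and a real receiveHeadersEnd are both required.
  return timing.requestTime !== 0 && timing.receiveHeadersEnd !== -1;
}

console.log(hasUsableTiming(invalidTiming)); // false
console.log(hasUsableTiming(validTiming));   // true
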
24 changes: 20 additions & 4 deletions lighthouse-core/test/computed/page-dependency-graph-test.js
@@ -261,19 +261,35 @@ describe('PageDependencyGraph computed artifact:', () => {
});

it('should set isMainDocument on first document request', () => {
const request1 = createRequest(1, '1', 0, null, NetworkRequest.TYPES.Image);
const request2 = createRequest(2, '2', 5);
const networkRecords = [request1, request2];
const request1 = createRequest(1, '1', 0, null, NetworkRequest.TYPES.Other);
const request2 = createRequest(2, '2', 5, null, NetworkRequest.TYPES.Document);
// Add in another unrelated + early request to make sure we pick the correct chain
const request3 = createRequest(3, '3', 0, null, NetworkRequest.TYPES.Other);
request2.redirects = [request1];
const networkRecords = [request1, request2, request3];

addTaskEvents(0, 0, []);

const graph = PageDependencyGraph.createGraph(traceOfTab, networkRecords);
const nodes = [];
graph.traverse(node => nodes.push(node));

assert.equal(nodes.length, 2);
assert.equal(nodes.length, 3);
assert.equal(nodes[0].id, 1);
assert.equal(nodes[0].isMainDocument(), false);
assert.equal(nodes[1].isMainDocument(), true);
assert.equal(nodes[2].isMainDocument(), false);
});

it('should throw when root node is not related to main document', () => {
const request1 = createRequest(1, '1', 0, null, NetworkRequest.TYPES.Other);
const request2 = createRequest(2, '2', 5, null, NetworkRequest.TYPES.Document);
const networkRecords = [request1, request2];

addTaskEvents(0, 0, []);

const fn = () => PageDependencyGraph.createGraph(traceOfTab, networkRecords);
expect(fn).toThrow(/root node.*document/i);
});
});
});
lighthouse-core/test/lib/dependency-graph/simulator/network-analyzer-test.js
@@ -374,5 +374,16 @@ describe('DependencyGraph/Simulator/NetworkAnalyzer', () => {
const mainDocument = NetworkAnalyzer.findMainDocument(records);
assert.equal(mainDocument.url, 'https://pwa.rocks/');
});

it('should break ties using position in array', async () => {
const records = [
{url: 'http://example.com', resourceType: 'Other'},
{url: 'https://example.com', resourceType: 'Other'},
{url: 'https://www.example.com', resourceType: 'Document', startTime: 0},
{url: 'https://www.iframe.com', resourceType: 'Document', startTime: 0},
];
const mainDocument = NetworkAnalyzer.findMainDocument(records);
assert.equal(mainDocument.url, 'https://www.example.com');
});
});
});
14 changes: 13 additions & 1 deletion lighthouse-core/test/lib/network-recorder-test.js
@@ -6,6 +6,7 @@
'use strict';

const NetworkRecorder = require('../../lib/network-recorder');
const networkRecordsToDevtoolsLog = require('../network-records-to-devtools-log.js');
const assert = require('assert');
const devtoolsLogItems = require('../fixtures/artifacts/perflog/defaultPass.devtoolslog.json');
const redirectsDevtoolsLog = require('../fixtures/wikipedia-redirect.devtoolslog.json');
@@ -45,7 +46,7 @@ describe('network recorder', function() {
assert.equal(mainDocument.resourceType, 'Document');
});

it('recordsFromLogs ignores invalid records', function() {
it('recordsFromLogs ignores records with an invalid URL', function() {
const logs = [
{ // valid request
'method': 'Network.requestWillBeSent',
@@ -55,6 +56,7 @@
'loaderId': '1',
'documentURL': 'https://www.example.com',
'request': {
// This URL is valid
'url': 'https://www.example.com',
'method': 'GET',
'headers': {
@@ -80,6 +82,7 @@
'loaderId': '2',
'documentURL': 'https://www.example.com',
'request': {
// This URL is invalid
'url': 'https:',
'method': 'GET',
'headers': {
@@ -105,6 +108,15 @@
assert.equal(records.length, 1);
});

it('should ignore invalid `timing` data', () => {
const inputRecords = [{url: 'http://example.com', startTime: 1, endTime: 2}];
const devtoolsLogs = networkRecordsToDevtoolsLog(inputRecords);
const responseReceived = devtoolsLogs.find(item => item.method === 'Network.responseReceived');
responseReceived.params.response.timing = {requestTime: 0, receiveHeadersEnd: -1};
const records = NetworkRecorder.recordsFromLogs(devtoolsLogs);
expect(records).toMatchObject([{url: 'http://example.com', startTime: 1, endTime: 2}]);
});

describe('#findNetworkQuietPeriods', () => {
function record(data) {
const url = data.url || 'https://example.com';