lighthouse

Stops you crashing into the rocks; lights the way


Status: ready for use! Please report any issues or questions you have.

Lighthouse requires Chrome 52 or later

Install Chrome extension

chrome.google.com/webstore/detail/lighthouse/blipmdconlkpinefehnmjammfjpmpbjk

Quick-start guide on using the Lighthouse extension: http://bit.ly/lighthouse-quickstart

Install CLI

Requires Node v5+ or Node v4 w/ --harmony

npm install -g GoogleChrome/lighthouse

Run

# Kick off a lighthouse run
lighthouse https://airhorner.com/

# see flags and options
lighthouse --help

Develop

Setup

git clone https://github.com/GoogleChrome/lighthouse
cd lighthouse
npm install

Run

node lighthouse-cli http://example.com

Custom run configuration

You can supply your own run configuration to customize which audits you want details on. Copy the default.json and start customizing, then provide it to the CLI with lighthouse --config-path=myconfig.json <url>.

Trace processing

Lighthouse can be used to analyze trace and performance data collected from other tools (like WebPageTest and ChromeDriver). The traces and performanceLog artifact items can be provided as a string containing the absolute path on disk. The perf log is captured from the Network domain (a la ChromeDriver's enableNetwork option) and reformatted slightly. As an example, here's a trace-only run that reports on user timings and critical request chains:

config.json
{
  "audits": [
    "user-timings",
    "critical-request-chains"
  ],

  "artifacts": {
    "traces": {
      "defaultPass": "/User/me/lighthouse/lighthouse-core/test/fixtures/traces/trace-user-timings.json"
    },
    "performanceLog": "/User/me/lighthouse/lighthouse-core/test/fixtures/traces/perflog.json"
  },

  "aggregations": [{
    "name": "Performance Metrics",
    "description": "These encapsulate your app's performance.",
    "scored": false,
    "categorizable": false,
    "items": [{
      "audits": {
        "user-timings": { "expectedValue": 0, "weight": 1 },
        "critical-request-chains": { "expectedValue": 0, "weight": 1}
      }
    }]
  }]
}

Then, run with: lighthouse --config-path=config.json http://www.random.url

Lighthouse CLI options

$ lighthouse --help

lighthouse <url>

Logging:
  --verbose  Displays verbose logging                                                 [boolean]
  --quiet    Displays no progress or debug logs                                       [boolean]

Configuration:
  --mobile                 Emulates a Nexus 5X                                  [default: true]
  --save-assets            Save the trace contents & screenshots to disk              [boolean]
  --save-artifacts         Save all gathered artifacts to disk                        [boolean]
  --list-all-audits        Prints a list of all available audits and exits            [boolean]
  --list-trace-categories  Prints a list of all required trace categories and exits   [boolean]
  --config-path            The path to the config JSON.
  --perf                   Use a performance-test-only configuration                  [boolean]

Output:
  --output       Reporter for the results
                         [choices: "pretty", "json", "html"]                [default: "pretty"]
  --output-path  The file path to output the results
                 Example: --output-path=./lighthouse-results.html           [default: "stdout"]

Options:
  --help             Show help                                                        [boolean]
  --version          Show version number                                              [boolean]
  --skip-autolaunch  Skip autolaunch of Chrome when accessing port 9222 fails         [boolean]
  --select-chrome    Choose Chrome location when multiple installations are found     [boolean]

Lighthouse w/ mobile devices

Lighthouse can run against a real mobile device. You can follow the Remote Debugging on Android (Legacy Workflow) guide up through step 3.3, but the TL;DR is: install & run adb, enable USB debugging, then port forward 9222 from the device to the machine running Lighthouse.

$ adb kill-server

$ adb devices -l
* daemon not running. starting it now on port 5037 *
* daemon started successfully *
00a2fd8b1e631fcb       device usb:335682009X product:bullhead model:Nexus_5X device:bullhead

$ adb forward tcp:9222 localabstract:chrome_devtools_remote

$ lighthouse --mobile false https://mysite.com

Tests

Some basic unit tests are in /test and are run via mocha. eslint also checks for style violations.

# lint and test all files
npm test

# watch for file changes and run tests
#   Requires http://entrproject.org : brew install entr
npm run watch

# run linting and unit tests separately
npm run lint
npm run unit

Chrome Extension

The same audits can also be run from a Chrome extension. See ./extension.

Architecture

Some incomplete notes

Components

  • Driver - Interfaces with Chrome Debugging Protocol (API viewer)
  • Gatherers - Requesting data from the browser (and maybe post-processing)
  • Artifacts - The output of gatherers
  • Audits - Non-performance evaluations of capabilities and issues. Includes a raw value and score of that value.
  • Metrics - Performance metrics summarizing the UX
  • Diagnoses - The perf problems that affect those metrics
  • Aggregators - Pulling audit results, grouping them into user-facing components (e.g. install_to_homescreen) and applying weighting and overall scoring; see the sketch below.
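
A rough sketch of how these pieces fit together (all names below are hypothetical, for illustration only; see lighthouse-core for the real modules):

// Hypothetical names, for illustration only -- not the real lighthouse-core API.
// Gatherers use the driver to pull data out of the page; their outputs are the artifacts.
gatherArtifacts(driver, ['manifest', 'url'])
  // Each audit consumes the artifacts and resolves to a result with a value/score.
  .then(artifacts => Promise.all(audits.map(audit => audit(artifacts))))
  // Aggregators group audit results into user-facing buckets and apply weighting/scoring.
  .then(auditResults => aggregate(auditResults, aggregations))
  .then(report => console.log(report));
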
Internal module graph

To generate a graph of lighthouse-core module dependencies:

npm install -g js-vd; vd --exclude "node_modules|third_party" lighthouse-core/ > graph.html

Protocol

  • Interacting with Chrome: The Chrome protocol connection is maintained via chrome-remote-interface for the CLI and the chrome.debugger API when in the Chrome extension.
  • Event binding & domains: Some domains must be enable()d so they issue events. Once enabled, they flush any events that represent state. As such, network events will only issue after the domain is enabled. All the protocol agents resolve their Domain.enable() callback after they have flushed any pending events. See example:
// will NOT work
driver.sendCommand('Security.enable').then(_ => {
  driver.on('Security.securityStateChanged', state => { /* ... */ });
});

// WILL work! happy happy. :)
driver.on('Security.securityStateChanged', state => { /* ... */ }); // event binding is synchronous
driver.sendCommand('Security.enable');

Gatherers

  • Reading the DOM: We prefer reading the DOM right from the browser (see #77). The driver exposes a querySelector method that can be used along with a getAttribute method to read values; see the sketch below.
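
For example, a gatherer might read an attribute off the page roughly like this (a sketch only: it assumes driver.querySelector resolves to an element handle exposing getAttribute, and the selector/attribute names are just examples):

// Sketch only -- the selector and attribute here are examples, not a real gatherer.
driver.querySelector('meta[name="theme-color"]')
  .then(node => node ? node.getAttribute('content') : null)
  .then(themeColor => {
    // themeColor would become part of this gatherer's artifact
  });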

Audits

The return value of each audit takes this shape:

Promise.resolve({
  name: 'audit-name',
  tags: ['what have you'],
  description: 'whatnot',
  // value: The score. Typically a boolean, but can be number 0-100
  value: 0,
  // rawValue: Could be anything, as long as it can easily be stringified and displayed,
  //   e.g. 'your score is bad because you wrote ${rawValue}'
  rawValue: {},
  // debugString: Some *specific* error string for helping the user figure out why they failed here.
  //   The reporter can handle *general* feedback on how to fix, e.g. links to the docs
  debugString: 'Your manifest 404ed',
  // fault:  Optional argument when the audit doesn't cover whatever it is you're doing,
  //   e.g. we can't parse your particular corner case out of a trace yet.
  //   Whatever is in `rawValue` and `score` would be N/A in these cases
  fault: 'some reason the audit has failed you, Anakin'
});
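
As a (hypothetical) illustration, an audit module producing that shape could look like the following; the 'manifest' artifact and 'manifest-exists' name are made up for the example:

// Hypothetical audit, for illustration only.
module.exports = {
  audit(artifacts) {
    const hasManifest = !!artifacts.manifest;
    return Promise.resolve({
      name: 'manifest-exists',
      tags: ['manifest'],
      description: 'Site has a web app manifest',
      value: hasManifest,
      rawValue: artifacts.manifest,
      debugString: hasManifest ? undefined : 'No manifest was found'
    });
  }
};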

Code Style

The .eslintrc defines all.

We're using JSDoc along with Closure annotations. Annotations are encouraged for all contributions.
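
For example (an illustrative snippet, not from the codebase):

/**
 * Builds a minimal result object. Illustrative only.
 * @param {string} debugString
 * @param {number} value A score from 0-100.
 * @return {{value: number, debugString: string}}
 */
function makeResult(debugString, value) {
  return {value: value, debugString: debugString};
}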

const > let > var. Use const wherever possible. Save var for emergencies only.

Trace processing

The traceviewer-based trace processor from node-big-rig was forked into Lighthouse. Additionally, the DevTools' Timeline Model is available. There may be advantages to using one model over the other.

To update traceviewer source:

cd lighthouse-core
# if not already there, clone catapult and copy license over
git clone --depth=1 https://github.com/catapult-project/catapult.git third_party/src/catapult
cp third_party/src/catapult/LICENSE third_party/traceviewer-js/
# pull for latest
git -C "./third_party/src/catapult/" pull
# run our conversion script
node scripts/build-traceviewer-module.js
