# Lighthouse CI Usage

To monitor Lighthouse scores and prevent unexpected ad performance regressions, you can integrate Lighthouse CI (LHCI) into your existing Continuous Integration (CI) workflow.

Lighthouse CI is a suite of tools that make continuously running, saving, retrieving, and asserting against Lighthouse results as easy as possible.

Today, this tool is compatible with GitHub Actions, Travis CI, CircleCI, GitLab CI, Jenkins, and Google Cloud Build.

If you haven't used LHCI before, please familiarize yourself with its Getting Started documentation before continuing.

## Config

LHCI natively supports plugins, including Publisher Ads Audits. To run Publisher Ads Audits in your LHCI reports, set the `settings.plugins` attribute in the `collect` section of your `lighthouserc.js` config.

Example:

```js
module.exports = {
  ci: {
    // ...
    collect: {
      settings: {
        plugins: ['lighthouse-plugin-publisher-ads'],
      },
    },
    // ...
  },
};
```

Additionally, you must ensure that `lighthouse-plugin-publisher-ads` is installed in your CI environment. This can be done by adding the line `npm install -g lighthouse-plugin-publisher-ads@1.3.x` to your CI build rule.

We recommend pinning a specific patch version (for example, `1.3.1`) so that unexpected regressions cannot be caused by changes in the plugin itself.
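
For instance, if you use GitHub Actions, the install step could be added to a workflow like the sketch below. The file name, Node version, and pinned plugin version are illustrative assumptions, not part of this repository; other CI providers follow the same pattern of installing the plugin before running LHCI.

```yaml
# .github/workflows/lighthouse-ci.yml -- illustrative sketch only.
name: Lighthouse CI
on: push
jobs:
  lhci:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v1
        with:
          node-version: 12 # Assumed Node version; use the version your project targets.
      # Install the LHCI CLI and the plugin, pinned to a specific patch version.
      - run: npm install -g @lhci/cli lighthouse-plugin-publisher-ads@1.3.1
      # Runs collect, assert, and upload according to lighthouserc.js.
      - run: lhci autorun
```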

## Assertions

Using the assertion framework, you can ensure that scores and metrics remain within a desired threshold.

See the official documentation to learn more.

A simple setup could assert that the Publisher Ads Audits category passes with a 100% score.

Example:

```js
module.exports = {
  ci: {
    // ...
    assert: {
      assertions: {
        'categories:lighthouse-plugin-publisher-ads': [
          'error',
          {'minScore': 1}, // No failing plugin audits.
        ],
      },
    },
    // ...
  },
};
```

The above configuration ensures that all plugin audits pass with a final score of 100%. If a code change reduces the score, this setup will catch it and block the change from being merged.

While the above example may be sufficient for your use case, lower-level assertions are useful for ensuring consistent performance across the board, especially in cases where a perfect score is not expected.

Example:

```js
module.exports = {
  ci: {
    // ...
    assert: {
      assertions: {
        'categories:lighthouse-plugin-publisher-ads': [
          'error',
          {'minScore': .8}, // Category score >= 80%.
        ],
        'first-ad-render': [
          'error',
          {
            'maxNumericValue': 10000, // <= 10s
            'minScore': 1, // Always passes audit.
          },
        ],
        'tag-load-time': [
          'error',
          {'minScore': .9}, // Audit score >= 90%.
        ],
        'viewport-ad-density': [
          'error',
          {'maxNumericValue': .3}, // Ad density <= 30%.
        ],
      },
    },
    // ...
  },
};
```

## Example

There is an example LHCI configuration in this repository that checks two mock pages. It uses GitHub Actions and the Lighthouse CI GitHub App to demonstrate a simple setup, but the same assertions can be integrated with many other popular CI providers.
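
For reference, the sketch below combines the collection and assertion pieces from this page into one hypothetical `lighthouserc.js`. The URL, run count, and thresholds are placeholders rather than recommendations, and the `upload` target is optional.

```js
// lighthouserc.js -- a minimal end-to-end sketch; values below are placeholders.
module.exports = {
  ci: {
    collect: {
      url: ['https://example.com/page-with-ads.html'], // Hypothetical page to audit.
      numberOfRuns: 3, // Running Lighthouse multiple times reduces variance.
      settings: {
        plugins: ['lighthouse-plugin-publisher-ads'],
      },
    },
    assert: {
      assertions: {
        'categories:lighthouse-plugin-publisher-ads': ['error', {'minScore': .8}],
        'tag-load-time': ['error', {'minScore': .9}],
      },
    },
    upload: {
      target: 'temporary-public-storage', // Optional: keep reports for later inspection.
    },
  },
};
```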