feat!: Run plugin on live deploy URL by default #588

Merged: 9 commits, Jul 13, 2023
80 changes: 42 additions & 38 deletions README.md
@@ -73,32 +73,25 @@ The lighthouse scores are automatically printed to the **Deploy log** in the Net

To customize how Lighthouse runs audits, you can make changes to the `netlify.toml` file.

By default, the plugin will serve and audit the build directory of the site, inspecting the `index.html`.
You can customize the behavior via the `audits` input:
By default, the plugin will run after your build is deployed on the live deploy permalink, inspecting the home path `/`.
You can add further configuration, inspect a different path, or audit multiple additional paths by adding configuration in the `netlify.toml` file:

```toml
[[plugins]]
package = "@netlify/plugin-lighthouse"

# Set minimum thresholds for each report area
[plugins.inputs.thresholds]
performance = 0.9

# to audit a sub path of the build directory
# to audit a path other than /
# route1 audit will use the top level thresholds
[[plugins.inputs.audits]]
path = "route1"

# you can specify output_path per audit, relative to the path
# you can optionally specify an output_path per audit, relative to the path, where HTML report output will be saved
output_path = "reports/route1.html"

# to audit an HTML file other than index.html in the build directory
Review comment (author): A couple of options are specific only to the onPostBuild behaviour (e.g. which directory to serve, file names other than index.html) so have been moved to the new/separate section

[[plugins.inputs.audits]]
path = "contact.html"

# to audit an HTML file other than index.html in a sub path of the build directory
[[plugins.inputs.audits]]
path = "pages/contact.html"

# to audit a specific absolute url
[[plugins.inputs.audits]]
url = "https://www.example.com"
@@ -107,11 +100,42 @@ You can customize the behavior via the `audits` input:
[plugins.inputs.audits.thresholds]
performance = 0.8

```
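As a rough illustration of how an `audits` entry maps to an audited URL on the live deploy, here is a hypothetical helper. This is a sketch for intuition only, not the plugin's actual implementation; `resolveAuditUrl` and its exact joining rules are assumptions.

```javascript
// Hypothetical sketch: how an audits entry might resolve to a target URL.
// Not the plugin's actual code.
function resolveAuditUrl(audit, deployUrl) {
  // Absolute URLs are audited as-is.
  if (audit.url) return audit.url;
  // Otherwise, join the audit path onto the deploy permalink,
  // defaulting to the home path "/".
  const base = deployUrl.replace(/\/+$/, '');
  const path = (audit.path || '/').replace(/^\/+/, '');
  return path ? `${base}/${path}` : `${base}/`;
}

console.log(resolveAuditUrl({ path: 'route1' }, 'https://deploy--site.netlify.app'));
// → https://deploy--site.netlify.app/route1
console.log(resolveAuditUrl({ url: 'https://www.example.com' }, 'ignored'));
// → https://www.example.com
```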

#### Fail a deploy based on score thresholds

By default, the lighthouse plugin will run _after_ your deploy has been successful, auditing the live deploy content.

Review comment (Contributor): Is it worth adding a note about the limitations, e.g. the site needs to be pre-built (no SSR etc.)?

Review comment (author): Great callout! I've added a sentence that uses similar wording to what we currently have on the docs page 👍

To run the plugin _before_ the deploy is live, use the `fail_deploy_on_score_thresholds` input to instead run during the `onPostBuild` event.

Review comment: Earlier on the page, under "The lighthouse scores are automatically printed to the Deploy log in the Netlify UI. For example:", the example deploy log output shows the plugin running at the onPostBuild stage. Since this is no longer the default behavior, and this option isn't introduced until after the deploy log example, it could be helpful to update the deploy log example to be more representative of what folks get by default.

This will statically serve your build output folder, and audit the `index.html` (or other file if specified as below). Please note that sites or site paths using SSR/ISR (server-side rendering or Incremental Static Regeneration) cannot be served and audited in this way.

Using this configuration, if minimum threshold scores are supplied and not met, the deploy will fail. Set the threshold based on `performance`, `accessibility`, `best-practices`, `seo`, or `pwa`.

Review comment: Earlier on the page, the code sample under "Then add the plugin to your netlify.toml configuration file:" shows thresholds being configured without fail_deploy_on_score_thresholds = "true". Since the thresholds won't do anything without fail_deploy_on_score_thresholds = "true", I think it would be good to remove them from that initial toml sample.


```toml
[[plugins]]
package = "@netlify/plugin-lighthouse"

# Set the plugin to run prior to deploy, failing the build if minimum thresholds aren't met
[plugins.inputs]
fail_deploy_on_score_thresholds = "true"

# Set minimum thresholds for each report area
[plugins.inputs.thresholds]
performance = 0.9
  accessibility = 0.7

# to audit an HTML file other than index.html in the build directory
[[plugins.inputs.audits]]
path = "contact.html"

# to audit an HTML file other than index.html in a sub path of the build directory
[[plugins.inputs.audits]]
path = "pages/contact.html"

# to serve only a sub directory of the build directory for an audit
# pages/index.html will be audited, and files outside of this directory will not be served
[[plugins.inputs.audits]]
serveDir = "pages"

```
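The failure rule described above (the deploy fails when a supplied minimum score is not met) can be sketched as follows. This is an illustrative helper, not the plugin's actual code; the function name and shapes are assumptions.

```javascript
// Illustrative sketch of the threshold check; not the plugin's actual code.
// Scores and thresholds are values between 0 and 1, keyed by report area.
const AREAS = ['performance', 'accessibility', 'best-practices', 'seo', 'pwa'];

function failingAreas(scores, thresholds) {
  // An area fails only when a threshold was supplied and the score is below it.
  return AREAS.filter(
    (area) => thresholds[area] !== undefined && scores[area] < thresholds[area],
  );
}

const failures = failingAreas(
  { performance: 0.85, accessibility: 0.95 },
  { performance: 0.9 },
);
console.log(failures); // → [ 'performance' ]
```

If `failures` is non-empty, a real implementation would fail the build at the `onPostBuild` stage.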

### Run Lighthouse audits for desktop
@@ -148,18 +172,6 @@ Updates to `netlify.toml` will take effect for new builds.
locale = "es" # generates Lighthouse reports in Español
```

### Fail Builds Based on Score Thresholds

By default, the Lighthouse plugin will report the findings in the deploy logs. To fail a build based on a specific score, specify the inputs thresholds in your `netlify.toml` file. Set the threshold based on `performance`, `accessibility`, `best-practices`, `seo`, or `pwa`.

```toml
[[plugins]]
package = "@netlify/plugin-lighthouse"

[plugins.inputs.thresholds]
performance = 0.9
```

### Run Lighthouse Locally

Fork and clone this repo.
@@ -173,21 +185,13 @@ yarn local

## Preview Lighthouse results within the Netlify UI

Netlify offers an experimental feature through Netlify Labs that allows you to view Lighthouse scores for each of your builds on your site's Deploy Details page with a much richer format.
Review comment (author): This section was incorrectly referencing the feature being in Labs, seems it got missed in a previous cleanup. I've also added some new screenshots to bring them a bit more up to date


You'll need to install the [Lighthouse build plugin](https://app.netlify.com/plugins/@netlify/plugin-lighthouse/install) on your site and then enable this experimental feature through Netlify Labs.

<img width="1400" alt="Deploy view with Lighthouse visualizations" src="https://user-images.githubusercontent.com/79875905/160019039-c3e529de-f389-42bc-a3d4-458c90d59e6a.png">

If you have multiple audits (directories, paths, etc) defined in your build, we will display a roll-up of the average Lighthouse scores for all the current build's audits plus the results for each individual audit.
The Netlify UI allows you to view Lighthouse scores for each of your builds on your site's Deploy Details page with a much richer format.

<img width="1400" alt="Deploy details with multiple audit Lighthouse results" src="https://user-images.githubusercontent.com/79875905/160019057-d29dffab-49f3-4fbf-a1ac-1f314e0cd837.png">
You'll need to first install the [Lighthouse build plugin](https://app.netlify.com/plugins/@netlify/plugin-lighthouse/install) on your site.

Some items of note:
<img width="1400" alt="Deploy view with Lighthouse visualizations" src="https://github.com/netlify/netlify-plugin-lighthouse/assets/20773163/144d7bd3-5b7b-4a18-826e-c8d582f92fab">

- The [Lighthouse Build Plugin](https://app.netlify.com/plugins/@netlify/plugin-lighthouse/install) must be installed on your site(s) in order for these score visualizations to be displayed.
- This Labs feature is currently only enabled at the user-level, so it will need to be enabled for each individual team member that wishes to see the Lighthouse scores displayed.
If you have multiple audits (e.g. multiple paths) defined in your build, we will display a roll-up of the average Lighthouse scores for all the current build's audits plus the results for each individual audit.

Learn more in our official [Labs docs](https://docs.netlify.com/netlify-labs/experimental-features/lighthouse-visualization/).
<img width="1400" alt="Deploy details with multiple audit Lighthouse results" src="https://github.com/netlify/netlify-plugin-lighthouse/assets/20773163/b9887c64-db03-40c0-b7e9-5acba081f87b">

We have a lot planned for this feature and will be adding functionality regularly, but we'd also love to hear your thoughts. Please [share your feedback](https://netlify.qualtrics.com/jfe/form/SV_1NTbTSpvEi0UzWe) about this experimental feature and tell us what you think.
4 changes: 2 additions & 2 deletions manifest.yml
@@ -21,6 +21,6 @@ inputs:
required: false
description: Lighthouse-specific settings, used to modify reporting criteria

- name: run_on_success
- name: fail_deploy_on_score_thresholds
required: false
description: (Beta) Run Lighthouse against the deployed site
description: Fail deploy if minimum threshold scores are not met
3 changes: 3 additions & 0 deletions netlify.toml
@@ -11,6 +11,9 @@ package = "./src/index.js"
[plugins.inputs]
output_path = "reports/lighthouse.html"

# Note: Required for our Cypress smoke tests
fail_deploy_on_score_thresholds = "true"

[plugins.inputs.thresholds]
performance = 0.9

2 changes: 1 addition & 1 deletion package-lock.json

Some generated files are not rendered by default.

4 changes: 2 additions & 2 deletions package.json
@@ -4,8 +4,8 @@
"description": "Netlify Plugin to run Lighthouse on each build",
"main": "src/index.js",
"scripts": {
"local": "node -e 'import(\"./src/index.js\").then(index => index.default()).then(events => events.onPostBuild());'",
"local-onsuccess": "LIGHTHOUSE_RUN_ON_SUCCESS=true node -e 'import(\"./src/index.js\").then(index => index.default()).then(events => events.onSuccess());'",
"local": "node -e 'import(\"./src/index.js\").then(index => index.default()).then(events => events.onSuccess());'",
"local-onpostbuild": "node -e 'import(\"./src/index.js\").then(index => index.default({fail_deploy_on_score_thresholds: \"true\"})).then(events => events.onPostBuild());'",
"lint": "eslint 'src/**/*.js'",
"format": "prettier --write 'src/**/*.js'",
"test": "node --experimental-vm-modules node_modules/jest/bin/jest.js --collect-coverage --maxWorkers=1",
16 changes: 12 additions & 4 deletions src/e2e/fail-threshold-onpostbuild.test.js
@@ -45,12 +45,16 @@ describe('lighthousePlugin with failed threshold run (onPostBuild)', () => {
'- PWA: 30',
];

await lighthousePlugin().onPostBuild({ utils: mockUtils });
await lighthousePlugin({
fail_deploy_on_score_thresholds: 'true',
}).onPostBuild({ utils: mockUtils });
expect(formatMockLog(console.log.mock.calls)).toEqual(logs);
});

it('should not output expected success payload', async () => {
await lighthousePlugin().onPostBuild({ utils: mockUtils });
await lighthousePlugin({
fail_deploy_on_score_thresholds: 'true',
}).onPostBuild({ utils: mockUtils });
expect(mockUtils.status.show).not.toHaveBeenCalledWith();
});

@@ -67,7 +71,9 @@ describe('lighthousePlugin with failed threshold run (onPostBuild)', () => {
" 'Manifest doesn't have a maskable icon' received a score of 0",
];

await lighthousePlugin().onPostBuild({ utils: mockUtils });
await lighthousePlugin({
fail_deploy_on_score_thresholds: 'true',
}).onPostBuild({ utils: mockUtils });
const resultError = console.error.mock.calls[0][0];
expect(stripAnsi(resultError).split('\n').filter(Boolean)).toEqual(error);
});
@@ -99,7 +105,9 @@ describe('lighthousePlugin with failed threshold run (onPostBuild)', () => {
],
};

await lighthousePlugin().onPostBuild({ utils: mockUtils });
await lighthousePlugin({
fail_deploy_on_score_thresholds: 'true',
}).onPostBuild({ utils: mockUtils });
const [resultMessage, resultPayload] =
mockUtils.build.failBuild.mock.calls[0];

1 change: 0 additions & 1 deletion src/e2e/fail-threshold-onsuccess.test.js
@@ -22,7 +22,6 @@ describe('lighthousePlugin with failed threshold run (onSuccess)', () => {
beforeEach(() => {
resetEnv();
jest.clearAllMocks();
process.env.LIGHTHOUSE_RUN_ON_SUCCESS = 'true';
process.env.DEPLOY_URL = 'https://www.netlify.com';
process.env.THRESHOLDS = JSON.stringify({
performance: 1,
1 change: 0 additions & 1 deletion src/e2e/lib/reset-env.js
@@ -1,7 +1,6 @@
const resetEnv = () => {
delete process.env.OUTPUT_PATH;
delete process.env.PUBLISH_DIR;
delete process.env.RUN_ON_SUCCESS;
delete process.env.SETTINGS;
delete process.env.THRESHOLDS;
delete process.env.URL;
12 changes: 9 additions & 3 deletions src/e2e/not-found-onpostbuild.test.js
@@ -32,7 +32,9 @@ describe('lighthousePlugin with single not-found run (onPostBuild)', () => {
'Lighthouse was unable to reliably load the page you requested. Make sure you are testing the correct URL and that the server is properly responding to all requests. (Status code: 404)',
];

await lighthousePlugin().onPostBuild({ utils: mockUtils });
await lighthousePlugin({
fail_deploy_on_score_thresholds: 'true',
}).onPostBuild({ utils: mockUtils });
expect(formatMockLog(console.log.mock.calls)).toEqual(logs);
});

@@ -52,14 +54,18 @@
"Error testing 'example/this-page-does-not-exist': Lighthouse was unable to reliably load the page you requested. Make sure you are testing the correct URL and that the server is properly responding to all requests. (Status code: 404)",
};

await lighthousePlugin().onPostBuild({ utils: mockUtils });
await lighthousePlugin({
fail_deploy_on_score_thresholds: 'true',
}).onPostBuild({ utils: mockUtils });
expect(mockUtils.status.show).toHaveBeenCalledWith(payload);
});

it('should not output errors, or call fail events', async () => {
mockConsoleError();

await lighthousePlugin().onPostBuild({ utils: mockUtils });
await lighthousePlugin({
fail_deploy_on_score_thresholds: 'true',
}).onPostBuild({ utils: mockUtils });
expect(console.error).not.toHaveBeenCalled();
expect(mockUtils.build.failBuild).not.toHaveBeenCalled();
expect(mockUtils.build.failPlugin).not.toHaveBeenCalled();
1 change: 0 additions & 1 deletion src/e2e/not-found-onsuccess.test.js
@@ -19,7 +19,6 @@ describe('lighthousePlugin with single not-found run (onSuccess)', () => {
beforeEach(() => {
resetEnv();
jest.clearAllMocks();
process.env.LIGHTHOUSE_RUN_ON_SUCCESS = 'true';
process.env.DEPLOY_URL = 'https://www.netlify.com';
process.env.AUDITS = JSON.stringify([{ path: 'this-page-does-not-exist' }]);
});
16 changes: 8 additions & 8 deletions src/e2e/settings-locale.test.js
@@ -30,22 +30,22 @@ describe('lighthousePlugin with custom locale', () => {
jest.clearAllMocks();
process.env.PUBLISH_DIR = 'example';
process.env.SETTINGS = JSON.stringify({ locale: 'es' });
process.env.DEPLOY_URL = 'https://www.netlify.com';
});

it('should output expected log content', async () => {
const logs = [
'Generating Lighthouse report. This may take a minute…',
'Running Lighthouse on example/ using the “es” locale',
'Serving and scanning site from directory example',
'Lighthouse scores for example/',
'Running Lighthouse on / using the “es” locale',
'Lighthouse scores for /',
'- Rendimiento: 100',
'- Accesibilidad: 100',
'- Prácticas recomendadas: 100',
'- SEO: 91',
'- PWA: 30',
];

await lighthousePlugin().onPostBuild({ utils: mockUtils });
await lighthousePlugin().onSuccess({ utils: mockUtils });
expect(formatMockLog(console.log.mock.calls)).toEqual(logs);
});

@@ -58,7 +58,7 @@ describe('lighthousePlugin with custom locale', () => {
installable: false,
locale: 'es',
},
path: 'example/',
path: '/',
report: '<!DOCTYPE html><h1>Lighthouse Report (mock)</h1>',
summary: {
accessibility: 100,
@@ -70,17 +70,17 @@
},
],
summary:
"Summary for path 'example/': Rendimiento: 100, Accesibilidad: 100, Prácticas recomendadas: 100, SEO: 91, PWA: 30",
"Summary for path '/': Rendimiento: 100, Accesibilidad: 100, Prácticas recomendadas: 100, SEO: 91, PWA: 30",
};

await lighthousePlugin().onPostBuild({ utils: mockUtils });
await lighthousePlugin().onSuccess({ utils: mockUtils });
expect(mockUtils.status.show).toHaveBeenCalledWith(payload);
});

it('should not output errors, or call fail events', async () => {
mockConsoleError();

await lighthousePlugin().onPostBuild({ utils: mockUtils });
await lighthousePlugin().onSuccess({ utils: mockUtils });
expect(console.error).not.toHaveBeenCalled();
expect(mockUtils.build.failBuild).not.toHaveBeenCalled();
expect(mockUtils.build.failPlugin).not.toHaveBeenCalled();
16 changes: 8 additions & 8 deletions src/e2e/settings-preset.test.js
@@ -24,21 +24,21 @@ describe('lighthousePlugin with custom device preset', () => {
jest.clearAllMocks();
process.env.PUBLISH_DIR = 'example';
process.env.SETTINGS = JSON.stringify({ preset: 'desktop' });
process.env.DEPLOY_URL = 'https://www.netlify.com';
});

it('should output expected log content', async () => {
const logs = [
'Generating Lighthouse report. This may take a minute…',
'Running Lighthouse on example/ using the “desktop” preset',
'Serving and scanning site from directory example',
'Lighthouse scores for example/',
'Running Lighthouse on / using the “desktop” preset',
'Lighthouse scores for /',
'- Performance: 100',
'- Accessibility: 100',
'- Best Practices: 100',
'- SEO: 91',
'- PWA: 30',
];
await lighthousePlugin().onPostBuild({ utils: mockUtils });
await lighthousePlugin().onSuccess({ utils: mockUtils });
expect(formatMockLog(console.log.mock.calls)).toEqual(logs);
});

@@ -51,7 +51,7 @@ describe('lighthousePlugin with custom device preset', () => {
installable: false,
locale: 'en-US',
},
path: 'example/',
path: '/',
report: '<!DOCTYPE html><h1>Lighthouse Report (mock)</h1>',
summary: {
accessibility: 100,
@@ -63,17 +63,17 @@
},
],
summary:
"Summary for path 'example/': Performance: 100, Accessibility: 100, Best Practices: 100, SEO: 91, PWA: 30",
"Summary for path '/': Performance: 100, Accessibility: 100, Best Practices: 100, SEO: 91, PWA: 30",
};

await lighthousePlugin().onPostBuild({ utils: mockUtils });
await lighthousePlugin().onSuccess({ utils: mockUtils });
expect(mockUtils.status.show).toHaveBeenCalledWith(payload);
});

it('should not output errors, or call fail events', async () => {
mockConsoleError();

await lighthousePlugin().onPostBuild({ utils: mockUtils });
await lighthousePlugin().onSuccess({ utils: mockUtils });
expect(console.error).not.toHaveBeenCalled();
expect(mockUtils.build.failBuild).not.toHaveBeenCalled();
expect(mockUtils.build.failPlugin).not.toHaveBeenCalled();