
Lighthouse metrics report in CI #8630

Merged: 13 commits into develop from feature/time-monitoring-metrics-ci-8470, Jan 26, 2022

Conversation

ferblape
Contributor

@ferblape ferblape commented Dec 20, 2021

🎩 What? Why?

This PR sets up a new workflow in GitHub Actions CI to run a Lighthouse report against:

  • homepage
  • a participatory process page
  • a meeting page
  • a proposal page

To run the report in a controlled environment, it is executed against a Rails server running in the same GitHub Action. This server runs a development app with seeds.

Because seeds are random, a Rake task has been included to dynamically add the URLs to the Lighthouse configuration file.
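The idea behind that Rake task can be sketched as follows. This is an illustrative reconstruction, not the actual task from this PR: the URL shapes and the `lighthouserc.json` file name are assumptions.

```ruby
# Sketch of a Rake-task-style script that builds the Lighthouse URL list
# at runtime, because seeded slugs/IDs are random and can't be hard-coded.
# The URL paths and the "lighthouserc.json" file name are assumptions for
# the example, not the real implementation in this PR.
require "json"

# In the real task these URLs would be derived from seeded records
# (e.g. the first participatory process, meeting and proposal in the DB).
urls = [
  "http://localhost:3000/",
  "http://localhost:3000/processes/random-seeded-slug",
  "http://localhost:3000/processes/random-seeded-slug/f/1/meetings/1",
  "http://localhost:3000/processes/random-seeded-slug/f/2/proposals/1"
]

config = { "ci" => { "collect" => { "url" => urls } } }
File.write("lighthouserc.json", JSON.pretty_generate(config))
puts "Wrote #{urls.size} URLs to lighthouserc.json"
```

Regenerating the file on every CI run keeps the report pointed at valid pages no matter which slugs the seeds produced.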

About the reports

There are two ways to check the reports:

About the performance

The report numbers are worse than in a production application running on optimized hardware (such as Heroku or Linode), but I think it's useful to set a baseline.

Based on these numbers we could set expectations and make the GitHub Action check fail if performance has regressed.
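Lighthouse supports exactly this kind of expectation through a budget file, where each metric gets a maximum value and exceeding it fails the check. A minimal sketch; the thresholds here are illustrative, not necessarily the ones this PR ends up using:

```json
[
  {
    "path": "/*",
    "timings": [
      { "metric": "first-contentful-paint", "budget": 2000 },
      { "metric": "largest-contentful-paint", "budget": 2500 },
      { "metric": "speed-index", "budget": 4000 },
      { "metric": "interactive", "budget": 5000 }
    ]
  }
]
```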

📌 Related Issues

Acceptance criteria

  • Given that I'm a Decidim developer
    When I go to GitHub Actions tab in GH,
    Then I have a Lighthouse check.
  • Regarding the data I have access to in the check, I can see ...
    • ... First Contentful Paint of homepage
    • ... First Contentful Paint of a participatory process
    • ... First Contentful Paint of a proposal
    • ... First Contentful Paint of a meeting
    • ... Speed Index of homepage
    • ... Speed Index of a participatory process
    • ... Speed Index of a proposal
    • ... Speed Index of a meeting
    • ... Time to Interactive of homepage
    • ... Time to Interactive of a participatory process
    • ... Time to Interactive of a proposal
    • ... Time to Interactive of a meeting
    • ... Largest Contentful Paint of homepage
    • ... Largest Contentful Paint of a participatory process
    • ... Largest Contentful Paint of a proposal
    • ... Largest Contentful Paint of a meeting

Testing

Just review the results of the Github Action and check the reports.

📋 Checklist

🚨 Please review the guidelines for contributing to this repository.

  • CONSIDER adding a unit test if your PR resolves an issue.
  • ✔️ DO check open PR's to avoid duplicates.
  • ✔️ DO keep pull requests small so they can be easily reviewed.
  • ✔️ DO build locally before pushing.
  • ✔️ DO make sure tests pass.
  • ✔️ DO make sure any new changes are documented in docs/.
  • ✔️ DO add and modify seeds if necessary.
  • ✔️ DO add CHANGELOG upgrade notes if required.
  • ✔️ DO add to GraphQL API if there are new public fields.
  • ✔️ DO add link to MetaDecidim if it's a new feature.
  • AVOID breaking the continuous integration build.
  • AVOID making significant changes to the overall architecture.

📷 Screenshots


♥️ Thank you!

@ferblape ferblape force-pushed the feature/time-monitoring-metrics-ci-8470 branch 4 times, most recently from 1251f6c to c2f9216 on December 23, 2021 04:18
@ferblape ferblape force-pushed the feature/time-monitoring-metrics-ci-8470 branch 5 times, most recently from 7c6aa8b to 6e4da4b on January 11, 2022 05:48
@ferblape ferblape changed the title Lighthouse metrics WIP Lighthouse metrics report Jan 11, 2022
@ferblape
Contributor Author

@andreslucena this is a first implementation, I'd love your feedback here.

@ferblape ferblape marked this pull request as ready for review January 11, 2022 06:30
@andreslucena
Member

This is a great first step indeed!! 😍

The hard problem of "running Lighthouse against the current branch" is done, congratulations 😄

The only doubts I have are mostly about the Acceptance Criteria and the methodology for the rest of the Epic (#8495). It seems I obviously skipped some things when writing them. Just making a brain dump, so we can later decide on the best course of action:

  1. The ACs for this issue (Configure time monitoring metrics at CI #8470) treat GH Actions like a monitoring platform (Nagios, Zabbix, etc.), but it's a CI: the actions should only return green/red. We couldn't keep the historic data there, as that just doesn't make sense, and even if we could, it'd probably be a path of misery and pain.
  2. When we have the performance metrics done in this branch, they'd be red for sure, so if we merge this into develop, it will fail on every single PR until the whole Epic is done. I think that we should:
    1. Finish the work on this current PR (like extracting metrics for every page, red/green logic, etc.)
    2. Create a new branch, called chore/performance-improvements, and merge this PR against it.
    3. Start working on the rest of the performance improvements of the Epic, merging against chore/performance-improvements. This branch would be the one for validating all the developments.
    4. Once the "performance metric" action is green, we can merge into develop, so this action can catch regressions.

Maybe in step 3 we could also merge with develop, so we can test locally in more environments, or there's another, cleaner way of tackling this. What do you think @ferblape?

@ferblape
Contributor Author

Thanks @andreslucena

About (1), yes, it's a bit of a killer 🗡️ to have historical metrics for the last 90 days. I propose we skip that AC item.

About (2), I like your proposal; I think we can do it that way.

I have activated the assertions following the metrics you proposed in the other issues, and we have a red check.

@ferblape ferblape force-pushed the feature/time-monitoring-metrics-ci-8470 branch 2 times, most recently from d792c63 to adf1123 on January 12, 2022 04:37
@ferblape ferblape changed the base branch from develop to chore/performance-improvements January 12, 2022 04:38
@ferblape
Contributor Author

@andreslucena I've changed the base of the PR, in case you want to review and merge.

@ferblape ferblape force-pushed the feature/time-monitoring-metrics-ci-8470 branch from adf1123 to db346f7 on January 13, 2022 04:09
@andreslucena andreslucena added team: performance type: internal PRs that aren't necessary to add to the CHANGELOG for implementers labels Jan 14, 2022

@andreslucena andreslucena left a comment


I think in this case, we can drop some comments that are unnecessary. How do you see it?

Review comment threads on:
  • .github/workflows/ci_performance_metrics_monitoring.yml (2 threads)
  • decidim-dev/lib/tasks/lighthouse_report.rake (7 threads)
@andreslucena
Member

About (1), yes, it's a bit of a killer 🗡️ to have historical metrics for the last 90 days. I propose we skip that AC item.

While reading the docs for the action we're using, I found this: https://github.com/GoogleChrome/lighthouse-ci

But I agree that at the moment this is overkill; maybe in the future, with the Decidim Association, if we have time and see it necessary, we could install it on our servers.

About (2), I like your proposal; I think we can do it that way.

I have activated the assertions following the metrics you proposed in the other issues, and we have a red check.

Awesome 👏🏽 👏🏽

@andreslucena andreslucena added the contract: PWA Barcelona City Council contract label Jan 14, 2022
ferblape and others added 2 commits January 15, 2022 07:56
Co-authored-by: Andrés Pereira de Lucena <andreslucena@users.noreply.github.com>
Co-authored-by: Andrés Pereira de Lucena <andreslucena@users.noreply.github.com>
ferblape and others added 6 commits January 15, 2022 07:56
Co-authored-by: Andrés Pereira de Lucena <andreslucena@users.noreply.github.com>
Co-authored-by: Andrés Pereira de Lucena <andreslucena@users.noreply.github.com>
Co-authored-by: Andrés Pereira de Lucena <andreslucena@users.noreply.github.com>
Co-authored-by: Andrés Pereira de Lucena <andreslucena@users.noreply.github.com>
Co-authored-by: Andrés Pereira de Lucena <andreslucena@users.noreply.github.com>
Co-authored-by: Andrés Pereira de Lucena <andreslucena@users.noreply.github.com>
@andreslucena
Member

@ferblape I'm working on the ACs, as I've mentioned in my comments.

While reviewing that, I think I found a bug in the current implementation: the speed-index budget isn't working as expected. On the homepage it should be an error, but I don't see it in the CI.

From the budget:

{ "metric": "speed-index", "budget": 4000 },

From the CI log, only 2 failing checks:

Error: 2 results for http://localhost:3000/
Report: storage.googleapis.com/lighthouse-infrastructure.appspot.com/reports/1642231709347-38880.report.html

first-contentful-paint failure for maxNumericValue assertion (First Contentful Paint: web.dev/first-contentful-paint )
Expected <= 2000, but found 2916.763

largest-contentful-paint failure for maxNumericValue assertion (Largest Contentful Paint: web.dev/lighthouse-largest-contentful-paint )
Expected <= 2500, but found 4613.237

But going to the report:

[screenshot: Lighthouse report summary showing Speed Index at 4.2 s]

I see that "Speed Index" has a value of 4.2 s. It should be a failing check, right? The rest of the pages/budgets seem OK; this is the only one that looks like an error (in not showing the error).
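If the budget entry is indeed being ignored, one possible workaround (a sketch, assuming Lighthouse CI's lighthouserc-style assertion syntax; not necessarily how this PR resolves it) would be to assert the metric directly, which matches the "maxNumericValue assertion" wording in the log above:

```json
{
  "ci": {
    "assert": {
      "assertions": {
        "speed-index": ["error", { "maxNumericValue": 4000 }]
      }
    }
  }
}
```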

@andreslucena
Member

As discussed in the last meeting, this is what we'll finally do:

  1. Change the budget times to something that will pass for sure (like 10000), so we won't have the "always in red" problem until the rest of the Epic is finished
  2. Merge this branch against develop
  3. When we start working on each of the issues, we'll first change the budget to the one that issue's ACs specify
  4. Then we'll merge against develop with the performance improvement and the correct budget

@andreslucena
Member

I've changed the original issue (#8470) and the Epic (#8495) in accordance with my last comment.


@andreslucena andreslucena left a comment


A few changes to reflect what we've talked about today. Also, can you change the base branch to develop 🙏🏽?

Review comment threads on:
  • .github/workflows/lighthouse_budget.json (4 threads)
ferblape and others added 4 commits January 26, 2022 03:55
Co-authored-by: Andrés Pereira de Lucena <andreslucena@users.noreply.github.com>
Co-authored-by: Andrés Pereira de Lucena <andreslucena@users.noreply.github.com>
Co-authored-by: Andrés Pereira de Lucena <andreslucena@users.noreply.github.com>
Co-authored-by: Andrés Pereira de Lucena <andreslucena@users.noreply.github.com>
@ferblape ferblape changed the base branch from chore/performance-improvements to develop January 26, 2022 02:55
@ferblape
Contributor Author

@andreslucena suggestions committed; the performance check is now passing

@andreslucena andreslucena changed the title Lighthouse metrics report Lighthouse metrics report in CI Jan 26, 2022
@andreslucena andreslucena merged commit 353f39d into develop Jan 26, 2022
@andreslucena andreslucena deleted the feature/time-monitoring-metrics-ci-8470 branch January 26, 2022 08:26
@ferblape ferblape mentioned this pull request May 19, 2022
@ferblape ferblape mentioned this pull request May 31, 2022
@alecslupu alecslupu added this to the 0.27.0 milestone Jul 14, 2023
Successfully merging this pull request may close these issues.

Configure time monitoring metrics at CI