
Add HTML validator action #579

Merged (14 commits, merged into jupyter:master on Dec 30, 2021)

Conversation

@krassowski (Member) commented Dec 29, 2021

Adds four CI jobs (a rough sketch of the workflow layout follows below):

  • build the site using Jekyll
  • validate HTML
  • check links (excluding absolute `/` links and LinkedIn links for now, due to limitations of the checker)
  • run Lighthouse audits:
    • the reports get uploaded to temporary public storage on Google servers and can be accessed easily from the browser for 7 days
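For context, here is a minimal sketch of how a `validate.yml` along these lines might be laid out. The validator action name, its version tag, and the job structure are assumptions for illustration, not the exact contents of this PR:

```yaml
name: Validate

on: [push, pull_request]

jobs:
  validate:
    runs-on: ubuntu-latest
    name: Validate HTML
    steps:
      - uses: actions/checkout@v2
      - name: Build the site in the Jekyll/builder container
        run: |
          docker run \
            -v ${{ github.workspace }}:/srv/jekyll \
            -v ${{ github.workspace }}/_site:/srv/jekyll/_site \
            jekyll/builder:latest /bin/bash -c "chmod -R 777 /srv/jekyll && jekyll build"
      - name: Validate HTML
        # Assumed validator action and version; the PR may use a different one.
        uses: Cyb3r-Jak3/html5validator-action@v7
        with:
          root: _site
      # The link checker and Lighthouse audits would be additional steps/jobs along the same lines.
```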

* Add HTML validator action

* Update root to `_site`

* Update validate.yml

* Update validate.yml

* Update validate.yml

* Add check-links

* Update validate.yml

* Add lighthouse and ignores for URLs

* Add lighthouse config

* Update validate.yml
@krassowski changed the title from "Add HTML validator action (#1)" to "Add HTML validator action" on Dec 29, 2021
@krassowski (Member, Author):

The failures are intentional. I propose to merge and iterate by fixing the detected issues in subsequent PRs.

Should we set a threshold for lighthouse score?
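For reference, a score threshold could be expressed in the Lighthouse CI assertion config roughly like this. This is a minimal sketch only: the category keys and the 0.95 floor are illustrative assumptions, and the real config may be JSON or JS rather than YAML:

```yaml
# lighthouserc sketch (illustrative; not necessarily what this PR's config contains)
ci:
  assert:
    assertions:
      "categories:accessibility":
        - error
        - minScore: 0.95
      "categories:best-practices":
        - error
        - minScore: 0.95
      "categories:seo":
        - error
        - minScore: 0.95
```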

@choldgraf (Collaborator) left a comment:

This all sounds good to me; I think it's a great idea to add in some basic checks for reasonableness.

```yaml
      - name: Build the site in the Jekyll/builder container
        run: |
          docker run \
            -v ${{ github.workspace }}:/srv/jekyll -v ${{ github.workspace }}/_site:/srv/jekyll/_site \
```
Collaborator (review comment on the quoted step):
We have nox usable with this repository now, so we could just re-use that here via a verb like `nox -s build`. That might reduce some redundancy.

```python
@nox.session(venv_backend='conda')
def build(session):
    install_deps(session)
    session.run(*"bundle exec jekyll serve liveserve".split())
```

We'd need to update that so that there was a `build` verb that didn't also do liveserve, but that would be easy.
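A minimal sketch of what such a split could look like in the noxfile. The `install_deps` stub below stands in for the repository's existing helper, and the exact commands are assumptions for illustration, not the actual change:

```python
import nox


def install_deps(session):
    """Placeholder for the repository's existing dependency-install helper (assumed)."""
    session.conda_install("ruby")  # illustrative; the real helper likely installs more


@nox.session(venv_backend="conda")
def build(session):
    # One-shot build: same setup as the serve session, but no live server.
    install_deps(session)
    session.run(*"bundle exec jekyll build".split())


@nox.session(venv_backend="conda")
def serve(session):
    # Existing behaviour from the current noxfile.
    install_deps(session)
    session.run(*"bundle exec jekyll serve liveserve".split())
```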

@choldgraf (Collaborator) left a comment:

quick idea on naming!

(two review threads on noxfile.py, outdated and resolved)
@krassowski (Member, Author):

I was enthusiastic about using the common nox command here at first, but I am no longer convinced. The old Docker-based approach built the site in 51 seconds; the nox/conda approach takes 6 minutes 24 seconds. That is a substantial delay. I could go for mamba, caching, etc., but it would all increase complexity and maintenance burden - caching is hard.

@choldgraf (Collaborator):

Huh, that's surprising. But if it's that big a difference, I agree. We could probably optimize the nox build down, but that's not the best use of time in this PR in my opinion, unless you're enthusiastic to try it out.

@krassowski (Member, Author):

I guess it comes down to having the docker image already available on GitHub infrastructure (the docker image is the one from the action suggested by GitHub) vs having to resolve dependencies and install them from conda - I would not expect a big difference in the actual build step.

@choldgraf (Collaborator):

Yeah, for sure - maybe with mamba it'd be significantly faster? This also seems like it'd be a one-time cost if we cached it properly. Environments all get stored in a .nox folder, so maybe we could just cache that?
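A rough sketch of what caching the `.nox` folder in the workflow could look like (the step name, cache key, and the file passed to `hashFiles()` are assumptions for illustration):

```yaml
      # Sketch: cache nox's conda environments between CI runs.
      - name: Cache nox environments
        uses: actions/cache@v2
        with:
          path: .nox
          # The file used to key the cache is an assumption; use whatever pins the environment.
          key: nox-${{ runner.os }}-${{ hashFiles('noxfile.py') }}
```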

@krassowski (Member, Author):

Can we address the install via nox/mamba/caching in a follow-up PR? Automation is exhausting ;)

@choldgraf (Collaborator):

Totally, I'm just thinking out loud

@krassowski (Member, Author):

Currently the Lighthouse results show up as:

Total size of all the files uploaded is 1165820 bytes
Finished uploading artifact lighthouse-results. Reported size is 1165820 bytes. There were 0 items that failed to upload
Uploading median LHR of http://localhost:45821/about.html...success!
Open the report at https://storage.googleapis.com/lighthouse-infrastructure.appspot.com/reports/1640813497053-27107.report.html
Uploading median LHR of http://localhost:45821/binder.html...success!
Open the report at https://storage.googleapis.com/lighthouse-infrastructure.appspot.com/reports/1640813497456-16487.report.html
Uploading median LHR of http://localhost:45821/community.html...success!
Open the report at https://storage.googleapis.com/lighthouse-infrastructure.appspot.com/reports/1640813497852-78305.report.html
Uploading median LHR of http://localhost:45821/documentation.html...success!
Open the report at https://storage.googleapis.com/lighthouse-infrastructure.appspot.com/reports/1640813498306-7357.report.html
Uploading median LHR of http://localhost:45821/hub.html...success!
Open the report at https://storage.googleapis.com/lighthouse-infrastructure.appspot.com/reports/1640813498846-65386.report.html
No GitHub token set, skipping GitHub status check.
Dumping 5 reports to disk at /home/runner/work/jupyter.github.io/jupyter.github.io/.lighthouseci...
Done writing reports to disk.

Error: 2 results for http://localhost:45821/about.html
Report: https://storage.googleapis.com/lighthouse-infrastructure.appspot.com/reports/1640813497053-27107.report.html

categories.accessibility failure for minScore assertion
Expected >= 0.95, but found 0.94

categories.best-practices failure for minScore assertion
Expected >= 0.95, but found 0.93
Error: 2 results for http://localhost:45821/binder.html
Report: https://storage.googleapis.com/lighthouse-infrastructure.appspot.com/reports/1640813497456-16487.report.html

categories.accessibility failure for minScore assertion
Expected >= 0.95, but found 0.85

categories.seo failure for minScore assertion
Expected >= 0.95, but found 0.92
Error: 1 result for http://localhost:45821/documentation.html
Report: https://storage.googleapis.com/lighthouse-infrastructure.appspot.com/reports/1640813498306-7357.report.html

categories.accessibility failure for minScore assertion
Expected >= 0.95, but found 0.94

But we could go fancy and have:

[screenshot: Lighthouse CI results posted as GitHub status checks]

This would require authorizing https://github.com/apps/lighthouse-ci; it is tied to a maintainer's private account, but it is also mentioned in https://github.com/GoogleChrome/lighthouse-ci/blob/main/docs/getting-started.md#github-status-checks.

Just an option; I don't have an opinion on this one.

@choldgraf (Collaborator):

I'd be +1 on this personally, though also don't think we should block this PR on that if it'll require extra decision-making steps.

Is there a place where we can look at the lighthouse report itself, to quickly spot-check improvements we need to make? (e.g., similar to how the codecov reports can work?)

@krassowski (Member, Author) commented Dec 29, 2021:

> Is there a place where we can look at the lighthouse report itself, to quickly spot-check improvements we need to make? (e.g., similar to how the codecov reports can work?)

The links in the logs should work; for example, https://storage.googleapis.com/lighthouse-infrastructure.appspot.com/reports/1640813497053-27107.report.html for about.html highlights that changing the link colour to orange reduced accessibility. Edit: this is temporary storage and will only work for 7 days; I added a note to the top-level comment.

@krassowski mentioned this pull request on Dec 29, 2021 (12 tasks).
@choldgraf (Collaborator) left a comment:

This looks great to me - only two quick comments, and one suggestion below:

Could you add a short section to the README.md file to explain this workflow? Just a short explanation of these validation steps, and short instructions for how people can view a Lighthouse report. Whatever somebody might need to understand how to make use of this helpful CI/CD infrastructure.

(review thread on .github/workflows/validate.yml, resolved)
```yaml
validate:

  runs-on: ubuntu-latest
  name: Validate HTML
```
Collaborator (review comment on the quoted job):

Could we add a comment somewhere just saying what we mean by "validate"? e.g., are we just doing some kind of basic integrity check? And would this not be captured by lighthouse already?

@krassowski (Member, Author):

Some of them overlap (like `alt` tags for images), but others are not caught by Lighthouse.

By "validate" we mean running through a W3C-sanctioned HTML validator. In this case we use the "Nu" backend from https://github.com/validator (https://validator.github.io/validator/). It is featured on W3C: https://validator.w3.org/nu/, and https://validator.w3.org/docs/help.html#validation_basics describes what validation means to them.

Collaborator (review comment):

Then maybe we could just add a comment like `# See https://validator.w3.org/docs/help.html#validation_basics for more information`?

@krassowski (Member, Author):

I added a link to that to the README in e62e6b0.

@choldgraf (Collaborator) commented Dec 29, 2021:

Also just a note that I think one of the failing accessibility checks is for the new "jupyter orange" link colors that @palewire recently added in #558 :

https://storage.googleapis.com/lighthouse-infrastructure.appspot.com/reports/1640813497053-27107.report.html#accessibility

I wonder if we could darken those a bit to improve accessibility? I know that's technically no longer "Jupyter orange", but I also think we should prioritize accessibility best practices over strict adherence to design guidelines.

@palewire (Collaborator):

Yes, I can darken it with a separate PR. Hmm. The default brand color isn't accessible on a white background? That might be a separate, larger problem.

@palewire (Collaborator):

Also, the fact that it was surfaced is vindication of this excellent idea for a test!

@palewire (Collaborator):

Patch: #581
Bug report: jupyter/design#67

@palewire (Collaborator):

This is super, super cool, IMHO, and potentially a model for many other publishers. Do you want to merge it first, or try to sort out the many linty things it raises? I'm fine with a merge.

@krassowski (Member, Author):

I would say it is better to merge now and iterate. @sunnyjain15699 volunteered to address the validator failures (#526 (comment)) so I did not attempt to squash them and focused on automation work, but I did fix some link issues.

@choldgraf (Collaborator):

@krassowski you ready for a merge now? Or other stuff you wanna do?

@krassowski (Member, Author):

Ready to merge!

@krassowski (Member, Author):

Ok, I extracted the follow-up tasks into new issues and will merge this one in 10 minutes if there are no further comments.

@palewire (Collaborator) commented Dec 30, 2021:

Congratulations. This is well done. I'm already thinking about how our team at latimes.com could add something like this to our build tool, https://github.com/datadesk/baker-example-page-template.

@choldgraf merged commit 485005d into jupyter:master on Dec 30, 2021.
@choldgraf (Collaborator):

Then let's do it! Thanks @krassowski 🙂 and thanks @palewire for the reviews and ideas
