
Crawler to check for broken links (Feature Suggestion) #824

Closed
sijie123 opened this issue Apr 7, 2019 · 2 comments

Comments

@sijie123
Contributor

sijie123 commented Apr 7, 2019

  • MarkBind Version:
    MarkBind v2.1.0

While working on #823, I realised that I'm constantly making changes (to the docs and the parser) that change the behaviour of existing links. During this process, some links likely become broken but go unnoticed, because there are simply too many links to check manually.

In addition to having rigorous unit/system tests, one avenue where I often discover bugs is building known-good versions of large MarkBind projects, such as the CS2103T website or even our own MarkBind docs.

I wonder if it's feasible and/or worthwhile to write a small crawler to recursively check a site for any broken links. I would imagine that this crawler would have benefits for site authors too, since they can use it to quickly check if they have mis-referenced anything.
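A minimal sketch of what such a check could look like over an already-built site, assuming each output page's HTML is available in memory. The names `extractHrefs` and `findBrokenInternalLinks` are hypothetical, not MarkBind APIs; for simplicity it treats hrefs as site-root-relative paths and uses a regex where a real implementation would parse the HTML properly:

```javascript
// Hypothetical sketch of an internal-link check over already-built pages.
// `pages` maps each output path (e.g. "docs/index.html") to its HTML string.
// A real implementation would use an HTML parser instead of a regex.

function extractHrefs(html) {
  const hrefs = [];
  const re = /href="([^"]+)"/g;
  let match;
  while ((match = re.exec(html)) !== null) {
    hrefs.push(match[1]);
  }
  return hrefs;
}

function findBrokenInternalLinks(pages) {
  const known = new Set(pages.keys());
  const broken = [];
  for (const [page, html] of pages) {
    for (const href of extractHrefs(html)) {
      // Skip external links, same-page anchors, and mailto links.
      if (/^(https?:)?\/\//.test(href) || href.startsWith('#') || href.startsWith('mailto:')) {
        continue;
      }
      const target = href.split('#')[0]; // drop any fragment
      if (!known.has(target)) {
        broken.push({ page, href });
      }
    }
  }
  return broken;
}
```

Running this against the set of generated pages would give authors a list of `{ page, href }` pairs to fix, which is essentially what a crawler would report without the cost of re-fetching pages over HTTP.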

@damithc
Contributor

damithc commented Apr 7, 2019

Good idea. I sometimes use a tool called Xenu to check for broken links. I wonder if we can find a tool that can be integrated into CI so that the check runs every time. It could be useful for MarkBind itself as well as for sites using MarkBind.
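One way to run such a check on every push, sketched as a GitHub Actions job. The build command, output directory, port, and the use of the `broken-link-checker` npm package are assumptions for illustration, not anything MarkBind ships:

```yaml
# Hypothetical CI job; build command, output directory, and port
# are assumptions, not MarkBind defaults.
name: link-check
on: [push, pull_request]
jobs:
  link-check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
      - run: npm ci
      - run: npm run build              # assumed to emit the site into _site/
      - run: npx http-server _site -p 8080 & sleep 2
      - run: npx broken-link-checker --recursive --ordered http://localhost:8080
```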

@ang-zeyu ang-zeyu added p.Medium and removed p.Low labels Aug 10, 2020
@ang-zeyu
Contributor

Raising this to medium as it's quite a staple feature.

For performance, though, this can be done within the existing code; there's no need for a separate tool or crawler stage.
