While working on #823, I realised that I'm constantly making changes (to the docs and the parser) that alter the behaviour of existing links. During this process, some links are likely to break but go unnoticed, simply because there are too many links to check manually.
In addition to having rigorous unit/system tests, one avenue where I often discover bugs is building existing, known-good versions of large MarkBind projects, such as the CS2103T website or even our own MarkBind docs.
I wonder if it's feasible and/or worthwhile to write a small crawler that recursively checks a site for broken links. I imagine such a crawler would benefit site authors too, since they could use it to quickly check whether they have mis-referenced anything.
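The crawler idea above can be sketched in a few lines. This is a minimal illustration, not MarkBind code: pages are held in an in-memory dict keyed by path, whereas a real crawler would fetch over HTTP or read the generated site output. The `find_broken_links` function and the sample pages are hypothetical names for illustration.

```python
# Minimal sketch of a recursive broken-link checker.
# Assumption: the site is a dict mapping page paths to their HTML source;
# a real implementation would fetch pages over HTTP or read built files.
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def find_broken_links(pages, start):
    """Crawl internal links reachable from `start`; return (page, href) pairs
    whose target does not exist."""
    seen, broken, queue = set(), [], [start]
    while queue:
        page = queue.pop()
        if page in seen:
            continue
        seen.add(page)
        extractor = LinkExtractor()
        extractor.feed(pages[page])
        for href in extractor.links:
            # This sketch skips external links and fragments; a real tool
            # might verify external URLs with HTTP HEAD requests.
            if href.startswith(("http://", "https://", "#", "mailto:")):
                continue
            if href in pages:
                queue.append(href)
            else:
                broken.append((page, href))
    return broken


pages = {
    "index.html": '<a href="guide.html">Guide</a><a href="missing.html">Oops</a>',
    "guide.html": '<a href="index.html">Home</a>',
}
print(find_broken_links(pages, "index.html"))  # → [('index.html', 'missing.html')]
```

A CI job could run a checker like this against the built site and fail the build when the returned list is non-empty.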
Good idea. I sometimes use a tool called Xenu to check for broken links. I wonder if we can find a tool that can be integrated into CI so that the check is done on every build. It could be useful for MarkBind itself as well as for sites built with MarkBind.
MarkBind v2.1.0