avoid repeating invalid links #3
To give more context on this design decision: the output is currently geared toward writers. Writers want to see where the broken links are, and they want to see all of them, even when several point at the same destination. For dartlang.org, we're currently using linkcheck as 'webmasters': we want to see which pages on the site are broken, and we don't need much insight into where those links sit in the source pages. It's a different context, and it needs a different way of sorting. I plan a

This is just my thinking. I'm eager for input.
That makes sense. Given this new understanding, I'd say that #2 is of higher priority, since for the use case I'm targeting, most of the repeated links are ones that I'd want to exclude.
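The two perspectives discussed above can be illustrated with a small sketch. This is not how linkcheck is actually implemented; the data and helper names are hypothetical, and the point is only that the same crawl results support both a writer-oriented view (every occurrence, grouped by source page) and a webmaster-oriented view (each broken destination once, with a count):

```python
from collections import defaultdict

# Hypothetical crawl results: (source page, broken destination) pairs.
results = [
    ("/guides/a.html", "https://example.com/gone"),
    ("/guides/b.html", "https://example.com/gone"),
    ("/guides/b.html", "https://example.com/missing"),
]

# Writer view: list every occurrence under the page that contains the link,
# so an author can go fix each one.
by_source = defaultdict(list)
for source, dest in results:
    by_source[source].append(dest)

# Webmaster view: collapse duplicates, reporting each broken destination
# once together with how often it was seen across the site.
by_dest = defaultdict(int)
for _, dest in results:
    by_dest[dest] += 1
```

With this sample data, the writer view lists three broken links across two pages, while the webmaster view reports only two distinct broken destinations.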
Run the command `linkcheck https://webdev.dartlang.org`. Part of the generated output is shown below. Note that the two 404s are repeated 5 times; it would be nice to list the erroneous links only once.
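The deduplication requested here is straightforward to sketch. The helper below is hypothetical (not part of linkcheck): it keeps each broken destination the first time it appears and drops the repeats, so a 404 reported 5 times would be listed once:

```python
def unique_broken_links(destinations):
    """Yield each broken destination once, in first-seen order (sketch)."""
    seen = set()
    for dest in destinations:
        if dest not in seen:
            seen.add(dest)
            yield dest
```

A count of suppressed duplicates could still be shown next to each entry, which would preserve the signal that a link is broken in many places without repeating the full line.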