Search for URLs in files (optionally limited by extension) and validate their HTTP status codes
# linkcheck


linkcheck was created to ensure that none of your files contain broken links.

I use it personally to check my documentation, notes and recipes for valid URLs every once in a while. I am aware that Sphinx already bundles its own linkcheck tool; however, that tool does not work on `raw::` sections.
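For example, a URL placed inside a `raw::` directive is invisible to Sphinx's builtin checker, but it is plain text to a line-based scanner. The file and the grep pattern below are hypothetical illustrations, not part of linkcheck itself:

```shell
# A URL inside a Sphinx raw:: directive (hypothetical example file):
cat > example.rst <<'EOF'
.. raw:: html

   <a href="https://example.com/docs">docs</a>
EOF

# A plain text scan still finds the URL:
grep -Eo 'https?://[^"<> ]+' example.rst
# https://example.com/docs
```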

## Requirements

* bash
* curl
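Nothing beyond these two tools is needed: conceptually, grep extracts candidate URLs and curl probes each one. The sketch below is only an illustration of that idea, not the actual linkcheck code, and uses a simplified URL pattern:

```shell
# Minimal sketch of the grep + curl idea (illustration only):
printf 'Docs: https://example.com/docs\nDev: http://localhost:8080/app\n' > sample.txt

# 1) Extract unique URLs (simplified pattern)
grep -Eo 'https?://[^ )>"]+' sample.txt | sort -u

# 2) Probe each one for its status code (commented out to stay offline):
#    curl -s -o /dev/null --max-time 10 -w '%{http_code}' "$url"
```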

## Examples

### Scan all files in the current directory for HTTP 200

    # Ensure all URLs found in all files in the current directory return 200
    linkcheck

### Scan Markdown and text files

    # Ensure either 200, 301 or 302 is returned
    linkcheck -e 'md,txt' -c '200,301,302' path/to/my/docs

### Be strict on Sphinx files

    # Ensure only 200 is returned
    linkcheck -e 'rst' -c '200' path/to/my/docs

### Ignore specific URLs

    # Ignore localhost, 127.0.0.1 and *.loc domains
    linkcheck -i '^http(s)?:\/\/((localhost)|(127\.0\.0\.1)|(.+\.loc)).*$' path/to/my/docs
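An ignore pattern can be sanity-checked with plain `grep -E` before handing it to linkcheck. Here a grouped pattern matches localhost, 127.0.0.1 and *.loc URLs but leaves a normal domain alone (the sample URLs are made up for the demonstration):

```shell
# Dry-run the ignore pattern: grep prints the URLs that would be ignored
pattern='^http(s)?:\/\/((localhost)|(127\.0\.0\.1)|(.+\.loc)).*$'
printf '%s\n' \
  'http://localhost/api' \
  'https://127.0.0.1:8000/status' \
  'https://myapp.loc/page' \
  'https://example.com/docs' \
  | grep -E "$pattern"
# prints the first three URLs; example.com is not matched
```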

### Ignore invalid SSL certificates

    linkcheck -k -c '200' path/to/my/docs

### Follow redirects and evaluate only the final HTTP code

    # Ensure only 200 is returned from the last redirected page
    linkcheck -l -c '200' path/to/my/docs
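One way to picture what `-l` changes: with curl, following redirects is its `-L` flag, after which `--write-out '%{http_code}'` reports the final hop's code instead of the first 3xx. This mapping is an assumption about the implementation; the helper below only assembles the hypothetical command line without running it:

```shell
# Assemble the curl command a probe *might* use (assumption, illustration only)
probe_cmd() {
  # probe_cmd <url> <follow: yes|no> -> prints the command line, does not run it
  opts='-s -o /dev/null -w %{http_code}'
  [ "$2" = yes ] && opts="$opts -L"
  echo "curl $opts $1"
}

probe_cmd https://example.com yes
# curl -s -o /dev/null -w %{http_code} -L https://example.com
```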

## Usage

    Usage: linkcheck [-e -i -t -r -c -k -l] [<path>]
           linkcheck --version
           linkcheck --help


    Options:

    -e        Limit search to those file extensions.
              Defaults to limiting on non-binary files.
              Accepts a comma-separated string of extensions:
                -e txt
                -e txt,rst
                -e sh,py,c,h

    -i        Ignore all URLs matching the specified regex.
              Defaults to: ^http(s)?:\/\/(127\.0\.0\.1)|(localhost).*$
              Accepts a single regex string:
                -i '^http(s)?:\/\/my-company\.com.*$'

    -t        Specify the curl timeout in seconds, after which probing stops for one URL.
              Defaults to 10 seconds.
              Accepts a positive integer:
                -t 5
                -t 10

    -r        Specify how many times to retry probing a single URL before giving up.
              Defaults to 3 times.
              Accepts a positive integer:
                -r 5
                -r 10

    -c        Specify HTTP status codes that are valid for success.
              Any code not specified here will produce an error for the given URL.
              Defaults to '200'.
              Accepts a comma-separated string of HTTP status codes:
                -c '200'
                -c '200,301'
                -c '200,301,302'

    -k        Ignore invalid SSL certificates for HTTPS connections.
              Defaults to erroring on invalid SSL certificates.
              This is a single flag with no arguments.

    -l        Specify whether to follow redirect URLs or not.
              This argument does not accept parameters.
              Defaults to not following redirects.

    --version Show version and exit.
    --help    Show this help screen.


    Optional arguments:

    <path>    Specify which directory to scan for URLs.
              Defaults to the current directory.
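The `-c` membership test, checking a returned code against a comma-separated list, can be sketched in pure shell with a `case` pattern. This is an illustrative sketch under assumed names, not the actual linkcheck implementation:

```shell
# code_ok <code> <comma-list>: succeed if <code> appears in <comma-list>
# (illustrative helper, not taken from the linkcheck source)
code_ok() {
  case ",$2," in
    *",$1,"*) return 0 ;;
    *)        return 1 ;;
  esac
}

code_ok 301 '200,301,302' && echo valid    # prints "valid"
code_ok 404 '200,301,302' || echo invalid  # prints "invalid"
```

Wrapping the list in commas before matching avoids false positives such as `30` matching inside `301`.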

## License

MIT License

Copyright (c) 2018 cytopia