Static website testing through crawling. Point Lukki to a website and it'll find broken pages for you.
You can find all downloadable versions from the releases page.
Lukki accepts the following parameters:

- `-config` (string): File to read the configuration from (default `STDIN`)
- `-format` (string): Format of the report (default `ascii`)
- `-output` (string): File to write the report to (default `STDOUT`)
- `-version`: Print version information
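For example, assuming the `lukki` binary is on your `PATH`, an invocation could look like the sketch below; the file names `lukki.json` and `report.txt` are illustrative:

```shell
# Read the crawl configuration from lukki.json and write
# the ASCII-formatted report to report.txt
$ lukki -config lukki.json -format ascii -output report.txt

# Or rely on the defaults: configuration from STDIN, report to STDOUT
$ lukki < lukki.json
```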
The configuration is provided in JSON format. These are the accepted settings:

- `urls` (list of strings): List of URLs to start crawling from
- `homeHosts` (list of strings, optional): List of hosts that Lukki will crawl through. This can be used to prevent Lukki from crawling external sites. If not set, the hosts from the `urls` parameter are used.
- `userAgent` (string, optional): The user agent the crawler will use
- `ignoreRobotsTxt` (boolean, optional): Whether or not Lukki should ignore `robots.txt` directives it finds. Default: `false`
- `parallelism` (integer, optional): Number of parallel workers used for crawling. If `0` is provided, the limit is disabled. Default: no limit
- `elements` (list of maps, optional): Which HTML elements to find links from. Each element should contain a `name` for the HTML element name (e.g. `a`) and an `attribute` for the HTML element attribute (e.g. `href`). If not set, a default set of elements is used.
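Putting these settings together, a configuration could look like the sketch below. The URLs, host names, user agent string, and element entries are made-up examples, not Lukki's defaults:

```json
{
  "urls": ["https://example.com/"],
  "homeHosts": ["example.com"],
  "userAgent": "lukki-crawler/1.0",
  "ignoreRobotsTxt": false,
  "parallelism": 4,
  "elements": [
    { "name": "a", "attribute": "href" },
    { "name": "img", "attribute": "src" }
  ]
}
```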
Lukki is written in Go and uses Go modules, so Go 1.11 or newer is required. To get started, first fetch the dependencies:
$ go mod download
To run the tests:

$ go test ./...

To build the binary:

$ go build
GNU General Public License v3.0
See the LICENSE file for more information.