Static website testing through crawling. Point Lukki to a website and it'll find broken pages for you.
You can find downloadable versions on the releases page.
Lukki accepts the following parameters:
- `-config` (string): File to read configuration from (default "STDIN")
- `-format` (string): Format of the report (default "ascii")
- `-output` (string): File to write the report to (default "STDOUT")
- `-version`: Print version information
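Putting the flags together, an invocation could look like this (the file names are illustrative, not required):

```shell
$ lukki -config lukki.json -format ascii -output report.txt
```

Since `-config` defaults to STDIN and `-output` to STDOUT, the configuration can also be piped in and the report piped out without either flag.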
The configuration is provided in JSON format. The following options are accepted:
- `urls` (list of strings): List of URLs to start crawling from
- `homeHosts` (list of strings, optional): List of hosts that Lukki will crawl through. This can be used to prevent Lukki from crawling external sites. If not set, the hosts from the `urls` parameter will be used.
- `userAgent` (string, optional): The user agent the crawler will use
- `ignoreRobotsTxt` (boolean, optional): Whether or not Lukki should ignore `robots.txt` directives it finds. Default: `true`
- `parallelism` (integer, optional): Number of parallel workers used for crawling. If `0` is provided, the limit is disabled. Default: no limit
- `elements` (list of maps, optional): Which HTML elements to find links from. Each element should contain a `name` for the HTML element name (e.g. `a`) and an `attribute` for the HTML element attribute (e.g. `href`). Default elements: `a.href`, `link.href`, `img.src`, `script.src`.
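As an example, a configuration using these options might look like the following (the URL, hosts, and user agent string are placeholders):

```json
{
  "urls": ["https://example.com/"],
  "homeHosts": ["example.com"],
  "userAgent": "lukki-crawler",
  "ignoreRobotsTxt": false,
  "parallelism": 4,
  "elements": [
    { "name": "a", "attribute": "href" }
  ]
}
```

Only `urls` is required; the remaining keys fall back to the defaults listed above.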
Lukki is written in Go and uses Go modules; Go 1.11 or newer is required. To get started, first fetch the dependencies:

```shell
$ go mod download
```

To run tests:

```shell
$ go test ./...
```

To build:

```shell
$ go build
```
GNU General Public License v3.0
See LICENSE file for more information.