Crawl fetches, sorts, and outputs the unique URLs it discovers starting from the given one. It makes a site's available content easy to find.
Install the binary (see the go install documentation for more details). A hint if you run into trouble:
# navigate to directory and run:
go install .
# follow output
# on Linux, add to ~/.bashrc:
export GOPATH=$HOME/go
export GOBIN=$GOPATH/bin
export PATH=$PATH:$GOBIN
# source ~/.bashrc and run again
go install .
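Once go install succeeds and GOBIN is on your PATH, you can check that the binary is reachable:
# verify the binary is on your PATH
which crawl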
usage:
crawl -url http://go.dev
output:
...
https://go.dev
https://go.dev/blog
https://go.dev/brand
...
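Since the output is plain text with one URL per line, it composes with standard shell tools. For example, to keep only blog links:
crawl -url http://go.dev | grep /blog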
Additionally, you can wrap it with torify to route requests through Tor; follow the setup instructions from the Tor project.
usage with torify:
torify crawl -url http://go.dev
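For the curious, here is a minimal sketch of the core idea: fetch a page, collect the unique URLs it links to, sort them, and print one per line. This is not the repository's actual implementation; only the -url flag above comes from this README, and the real crawler's recursion, concurrency, and error handling may differ. The sketch assumes the golang.org/x/net/html parser.
// sketch.go — illustrative only; not the actual source of this repository.
package main

import (
	"flag"
	"fmt"
	"net/http"
	"net/url"
	"os"
	"sort"

	"golang.org/x/net/html"
)

func main() {
	start := flag.String("url", "", "page to fetch links from")
	flag.Parse()

	base, err := url.Parse(*start)
	if *start == "" || err != nil {
		fmt.Fprintln(os.Stderr, "usage: crawl -url http://go.dev")
		os.Exit(1)
	}

	resp, err := http.Get(base.String())
	if err != nil {
		fmt.Fprintln(os.Stderr, "fetch failed:", err)
		os.Exit(1)
	}
	defer resp.Body.Close()

	doc, err := html.Parse(resp.Body)
	if err != nil {
		fmt.Fprintln(os.Stderr, "parse failed:", err)
		os.Exit(1)
	}

	// Collect href attributes, resolving relative links against the
	// base URL and deduplicating with a set.
	seen := map[string]bool{}
	var walk func(*html.Node)
	walk = func(n *html.Node) {
		if n.Type == html.ElementNode && n.Data == "a" {
			for _, a := range n.Attr {
				if a.Key == "href" {
					if u, err := base.Parse(a.Val); err == nil {
						seen[u.String()] = true
					}
				}
			}
		}
		for c := n.FirstChild; c != nil; c = c.NextSibling {
			walk(c)
		}
	}
	walk(doc)

	// Sort for stable output, one URL per line.
	links := make([]string, 0, len(seen))
	for u := range seen {
		links = append(links, u)
	}
	sort.Strings(links)
	for _, u := range links {
		fmt.Println(u)
	}
}
A full crawler would repeat this step for each newly discovered URL, typically restricted to the same host, until no new links turn up; the sketch stops after a single page.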