crawlr

Find and fix broken links in your websites


Features

  • Extracts and checks links from a webpage.

  • Saves the results to urls.csv.

  • Uses concurrency for faster scanning (see the sketch below).
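
crawlr's own source isn't shown in this README, so the following is only a minimal sketch of the approach the feature list describes: fetch each extracted link in its own goroutine, record the HTTP status, and write every URL/status pair to urls.csv. The names and the CSV column layout are illustrative assumptions, not crawlr's actual code.

package main

import (
    "encoding/csv"
    "fmt"
    "net/http"
    "os"
    "sync"
    "time"
)

type result struct {
    url    string
    status string
}

func main() {
    // Links would normally come from parsing the target page;
    // hard-coded here to keep the sketch self-contained.
    links := []string{
        "https://example.com/",
        "https://example.com/missing-page",
    }

    client := &http.Client{Timeout: 10 * time.Second}
    results := make(chan result)
    var wg sync.WaitGroup

    // Check every link concurrently.
    for _, link := range links {
        wg.Add(1)
        go func(u string) {
            defer wg.Done()
            resp, err := client.Get(u)
            if err != nil {
                results <- result{u, "error: " + err.Error()}
                return
            }
            resp.Body.Close()
            results <- result{u, resp.Status}
        }(link)
    }

    // Close the results channel once all checks have finished.
    go func() {
        wg.Wait()
        close(results)
    }()

    // Write one row per URL to urls.csv.
    f, err := os.Create("urls.csv")
    if err != nil {
        fmt.Fprintln(os.Stderr, err)
        os.Exit(1)
    }
    defer f.Close()

    w := csv.NewWriter(f)
    defer w.Flush()
    w.Write([]string{"url", "status"})
    for r := range results {
        w.Write([]string{r.url, r.status})
    }
}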

Install

Install with Go:

go install github.com/4rkal/crawlr@latest

Or clone the repository and run it directly:

git clone https://github.com/4rkal/crawlr
cd crawlr
go run .

Usage

Run crawlr and enter the website URL when prompted (including the https:// prefix).

When the scan finishes, read urls.csv to see which links are broken.
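
The exact format of urls.csv isn't documented here, but assuming a simple layout with the URL in the first column and the HTTP status in the second, a short Go program like this could filter out just the broken links (anything without a 2xx status, including network errors):

package main

import (
    "encoding/csv"
    "fmt"
    "os"
    "strings"
)

func main() {
    f, err := os.Open("urls.csv")
    if err != nil {
        fmt.Fprintln(os.Stderr, err)
        os.Exit(1)
    }
    defer f.Close()

    rows, err := csv.NewReader(f).ReadAll()
    if err != nil {
        fmt.Fprintln(os.Stderr, err)
        os.Exit(1)
    }

    for _, row := range rows {
        // Skip malformed rows and a possible header row.
        if len(row) < 2 || row[0] == "url" {
            continue
        }
        // Assumed layout: row[0] = URL, row[1] = status.
        if !strings.HasPrefix(row[1], "2") {
            fmt.Println(row[0], row[1])
        }
    }
}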
