# Links-Crawler

A simple web crawler that crawls a website and organizes the discovered endpoint URLs by page extension.

Features:

  1. All crawled URLs are organized by page extension.
  2. All parameters of the same URL are grouped and displayed together (see the sketch after this list).

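Both behaviours build on the nyawc crawling library that the installation step below pulls in. As a rough illustration (not the actual `Links_Crawler.py` implementation), the sketch below shows how a nyawc callback could collect every crawled URL, bucket it by file extension, and merge the query parameters seen for the same URL; the start URL and grouping details are assumptions.

```python
# Illustrative sketch only -- not the real Links_Crawler.py.
# Groups crawled URLs by page extension and merges query parameters per URL.
import os
from collections import defaultdict
from urllib.parse import urlparse, parse_qs

from nyawc.Options import Options
from nyawc.Crawler import Crawler
from nyawc.CrawlerActions import CrawlerActions
from nyawc.http.Request import Request

urls_by_extension = defaultdict(set)   # ".php" -> {bare URLs}
params_by_url = defaultdict(set)       # bare URL -> {parameter names}

def cb_request_after_finish(queue, queue_item, new_queue_items):
    parsed = urlparse(queue_item.request.url)
    ext = os.path.splitext(parsed.path)[1] or "(no extension)"
    bare_url = "{}://{}{}".format(parsed.scheme, parsed.netloc, parsed.path)
    urls_by_extension[ext].add(bare_url)
    params_by_url[bare_url].update(parse_qs(parsed.query).keys())
    return CrawlerActions.DO_CONTINUE_CRAWLING

options = Options()
options.callbacks.request_after_finish = cb_request_after_finish

crawler = Crawler(options)
crawler.start_with(Request("https://example.com/"))  # assumed start URL

# Print endpoints grouped by extension, with all parameters of the same URL together.
for ext in sorted(urls_by_extension):
    print("\n[{}]".format(ext))
    for url in sorted(urls_by_extension[ext]):
        params = sorted(params_by_url[url])
        print("  {}{}".format(url, " ?" + "&".join(params) if params else ""))
```
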
# Running from terminal

(screenshot: the crawler running in a terminal)

# Accessing crawled links

(screenshots: crawled links organized by page extension)

# Installation

```bash
pip3 install nyawc
git clone https://github.com/rakeshmane/Links-Crawler.git
cd Links-Crawler
python3 Links_Crawler.py
```
