trixiahorner/Spidertrap

Spidertrap

Introduction

Trap web crawlers and spiders in an infinite set of dynamically generated webpages! The Spidertrap lab is designed to demonstrate how we can create a web of decoy pages that ensnare automated web crawlers and malicious actors.

Walkthrough

The following lab is from Antisyphon Training's Active Defense and Cyber Deception course. I will be using a text file containing a list of webpage names to serve, one per line. If no file is provided, random links will be generated.

Step 1: Get the IP address of the Linux system, then cd into the directory containing spidertrap.py


Step 2: Run the script

python3 spidertrap.py


Step 3: Visit the web server on port 8000

http://172.28.3.40:8000

I see a page containing randomly generated links. When I click on a link it takes me to a page with more randomly generated links. This is essentially the spidertrap.

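The behavior above can be sketched as a tiny HTTP server. This is a simplified illustration of what a spider trap does, not the actual source of spidertrap.py; the random_link helper and the 8-character path names are my own assumptions.

```python
# Simplified spider-trap sketch: every GET returns a page of fresh random
# links, so a crawler that follows them never runs out of pages to visit.
import random
import string
from http.server import BaseHTTPRequestHandler, HTTPServer

def random_link():
    # Random 8-character path segment, e.g. "/kqzpdmfe"
    return "/" + "".join(random.choices(string.ascii_lowercase, k=8))

def make_page(n_links=10):
    anchors = []
    for _ in range(n_links):
        link = random_link()
        anchors.append(f'<a href="{link}">{link}</a><br>')
    return "<html><body>" + "\n".join(anchors) + "</body></html>"

class TrapHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Every path -- including ones this server just invented -- gets
        # a fresh page of links, which is what makes the trap bottomless.
        body = make_page().encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def serve(port=8000):
    # Blocks forever, like the real tool, until interrupted.
    HTTPServer(("0.0.0.0", port), TrapHandler).serve_forever()
```

Calling serve() would start the trap on port 8000 and answer any requested path with a new page of links, matching the click-through behavior seen in this step.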

Step 4: Now I rerun Spidertrap, but this time with a file. Web crawlers look for specific directories, so let's make the trap look more like a legitimate directory listing.

Kill the previous Spidertrap session (Ctrl+C), then run:

python3 spidertrap.py directory-list-2.3-big.txt

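The wordlist mode can be sketched as follows, based on the lab's description that link names come from the file, one per line; the function names here are my own, not Spidertrap's.

```python
# Sketch of the wordlist mode: instead of random strings, link names are
# drawn from a file with one page name per line (e.g. the DirBuster-style
# list used above), so the trap resembles a real directory structure.
import random

def load_names(path):
    with open(path) as f:
        # One page name per line; skip blank lines.
        return [line.strip() for line in f if line.strip()]

def page_from_names(names, n_links=10):
    # Sample without replacement so each page shows distinct entries.
    picks = random.sample(names, min(n_links, len(names)))
    anchors = [f'<a href="/{name}">{name}</a><br>' for name in picks]
    return "<html><body>" + "\n".join(anchors) + "</body></html>"
```

To a crawler checking for well-known paths like /admin or /cgi-bin, pages built this way look far more like a genuine site than random strings do.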

Step 5: Finally, I run Spidertrap one last time, but this time I use wget to mirror the website.

  • (-m): This option stands for "mirror". It tells wget to download the entire website in a way that creates a local copy of the site. It will download not just the specified page but also any files linked from it, and so on. This option also allows infinite recursion, so it will follow links to any depth.
sudo wget -m http://127.0.0.1:8000


wget is a utility for downloading files from the web, while Spidertrap is a tool that keeps a crawler trapped once it starts following links. Here, wget is crawling the deception site, so it keeps fetching randomly generated pages indefinitely until it is stopped.
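Why the mirror never completes can be seen with a toy simulation; this is my own illustration rather than part of the lab, and fake_fetch merely stands in for an HTTP GET against the trap.

```python
# Toy illustration of a naive crawler against a spider trap: every fetched
# page yields fresh links, so the backlog of unvisited pages grows faster
# than the crawler can drain it.
import random
import string

def fake_fetch(path):
    # Stand-in for an HTTP GET: returns 10 "links" unique to this page.
    rng = random.Random(path)  # deterministic per path, like a real URL space
    return ["/" + "".join(rng.choices(string.ascii_lowercase, k=8))
            for _ in range(10)]

def crawl(start="/", max_pages=100):
    seen, frontier = set(), [start]
    while frontier and len(seen) < max_pages:
        page = frontier.pop()
        if page in seen:
            continue
        seen.add(page)
        frontier.extend(fake_fetch(page))
    # Return pages visited and links still waiting in the queue.
    return len(seen), len(frontier)

visited, backlog = crawl()
```

After visiting its 100-page budget, this crawler still has hundreds of unvisited links queued; a mirror with infinite recursion, like wget -m, has no such budget and would run until its disk or patience gave out.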

Conclusion

By generating a labyrinth of infinitely nested and randomly generated pages, a spidertrap can exhaust an attacker’s resources, disrupt their activities, and provide crucial insights into their tactics. The concept of a spidertrap offers an active approach to defending web resources.
