tyto-sec/HackerOneScraper
HackerOne Scraper

HackerOneScraper is a Python tool that collects program scopes from the HackerOne API and exports the structured items to CSV files.


Features

  • Collects programs and scopes via the HackerOne API
  • Automatic pagination for programs and scopes
  • Credential-based auth via .env
  • Rotating proxy support via proxies.txt
  • Deploy and run with Scrapyd + ScrapydWeb

Requirements

  • Python 3.11+
  • Scrapyd and Scrapyd Client
  • ScrapydWeb and Logparser (optional, recommended)

Installation

pip install -r requirements.txt

Configuration

.env

Create a .env file with your HackerOne credentials:

HACKERONE_USERNAME=your_username
HACKERONE_TOKEN=your_token
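The README does not show how these credentials are consumed; the HackerOne API uses HTTP Basic auth with the API username and token. A minimal stdlib-only sketch (function names are illustrative; in practice a library such as python-dotenv would load the file):

```python
import base64
import os

def load_env(path=".env"):
    """Minimal .env parser: put KEY=VALUE lines into os.environ."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip())

def basic_auth_header(username, token):
    """Build the HTTP Basic auth header the HackerOne API expects."""
    raw = f"{username}:{token}".encode()
    return {"Authorization": "Basic " + base64.b64encode(raw).decode()}
```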

proxies.txt

Add proxies in the following format:

username:password@ip:port
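How the spider consumes proxies.txt is not shown here; a minimal sketch of parsing that format and rotating through the entries round-robin (function names are illustrative):

```python
import itertools

def load_proxies(path="proxies.txt"):
    """Parse username:password@ip:port lines into http:// proxy URLs,
    the form Scrapy accepts in a request's meta['proxy']."""
    with open(path) as fh:
        return ["http://" + line.strip() for line in fh if line.strip()]

def proxy_pool(proxies):
    """Endless round-robin iterator over the proxy list."""
    return itertools.cycle(proxies)
```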

Deploy with Scrapyd

Install deploy dependencies:

pip install scrapyd scrapyd-client

Start Scrapyd:

scrapyd

Deploy the project:

scrapyd-client deploy

ScrapydWeb (optional)

Install the web panel:

pip install scrapydweb logparser

Start the services:

scrapyd &
scrapydweb &
logparser &

Scrapyd API

  • POST /schedule.json → Run spider
  • GET /listspiders.json → List spiders
  • GET /listprojects.json → Projects
  • GET /listjobs.json → Active jobs
  • POST /cancel.json → Stop job

Run

To schedule the spider via API:

curl http://localhost:6800/schedule.json -d "project=hackeronescraper" -d "spider=scopesspider"

Output

Items are exported according to Scrapy's FEEDS setting, or via parameters passed when scheduling the spider.
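For reference, a FEEDS setting that writes items to a CSV file looks like this (filename and options are illustrative; this project's own settings.py may differ):

```python
# settings.py -- export scraped items to a CSV file (illustrative values)
FEEDS = {
    "scopes.csv": {
        "format": "csv",
        "encoding": "utf-8",
        "overwrite": True,
    },
}
```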
