HackerOneScraper is a Python tool that collects program scopes from the HackerOne API and exports the structured items to CSV files.
- Collects programs and scopes via the HackerOne API
- Automatic pagination for programs and scopes
- Credential-based auth via .env
- Rotating proxy support via proxies.txt
- Deploy and run with Scrapyd + ScrapydWeb
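The HackerOne hacker API paginates listings via JSON:API-style `links.next` URLs, and authenticates with HTTP Basic auth using your username and API token. A minimal, stdlib-only sketch of that flow, assuming the `/v1/hackers/programs` endpoint (the spider itself may use Scrapy's own request machinery instead):

```python
import base64
import json
import urllib.request

# Assumed endpoint of the HackerOne hacker API
API_URL = "https://api.hackerone.com/v1/hackers/programs"

def auth_header(username: str, token: str) -> str:
    # HackerOne uses HTTP Basic auth with username:token as the credentials
    creds = base64.b64encode(f"{username}:{token}".encode()).decode()
    return f"Basic {creds}"

def fetch_json(url: str, headers: dict) -> dict:
    req = urllib.request.Request(url, headers=headers)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def iter_pages(first_url: str, fetch):
    """Yield items from every page, following `links.next` until exhausted."""
    url = first_url
    while url:
        page = fetch(url)
        yield from page.get("data", [])
        url = page.get("links", {}).get("next")
```

The same `iter_pages` helper works for both program and scope listings, since both paginate the same way.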
- Python 3.11+
- Scrapyd and Scrapyd Client
- ScrapydWeb and Logparser (optional, recommended)
Install the dependencies:

```
pip install -r requirements.txt
```

Create a `.env` file with your HackerOne credentials:
```
HACKERONE_USERNAME=your_username
HACKERONE_TOKEN=your_token
```
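Projects like this typically read `.env` with `python-dotenv`, but the format is simple enough to parse directly. A dependency-free sketch (the loader and its helper are illustrative, not the project's actual code):

```python
from pathlib import Path

def parse_env(text: str) -> dict:
    """Parse KEY=VALUE lines; blank lines and '#' comments are ignored."""
    values = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip()
    return values

def load_env(path: str = ".env") -> dict:
    return parse_env(Path(path).read_text())

# Usage sketch:
# env = load_env()
# auth = (env["HACKERONE_USERNAME"], env["HACKERONE_TOKEN"])
```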
Add your proxies to `proxies.txt`, one per line, in the following format:

```
username:password@ip:port
```
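Rotating proxies in Scrapy usually means picking an entry per request and setting it as `request.meta["proxy"]`, which expects a full URL. A sketch of loading and rotating entries in that format (the function names and the random-choice strategy are assumptions, not the project's actual middleware):

```python
import random
from pathlib import Path

def load_proxies(path: str = "proxies.txt") -> list:
    """Read username:password@ip:port entries, one per line, skipping blanks."""
    lines = Path(path).read_text().splitlines()
    return [line.strip() for line in lines if line.strip()]

def proxy_url(entry: str, scheme: str = "http") -> str:
    """Turn username:password@ip:port into a URL usable as meta['proxy']."""
    return f"{scheme}://{entry}"

def pick_proxy(proxies: list) -> str:
    """Naive rotation: choose a random proxy for each request."""
    return random.choice(proxies)
```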
Install the deployment dependencies:

```
pip install scrapyd scrapyd-client
```

Start Scrapyd:

```
scrapyd
```

Deploy the project:

```
scrapyd-client deploy
```

Install the web panel:

```
pip install scrapydweb logparser
```

Start the services:
```
scrapyd &
scrapydweb &
logparser &
```

Scrapyd exposes the following JSON API endpoints:

- `POST /schedule.json` → Run spider
- `GET /listspiders.json` → List spiders
- `GET /listprojects.json` → List projects
- `GET /listjobs.json` → Active jobs
- `POST /cancel.json` → Stop job
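The same endpoints can be driven from Python. A stdlib-only sketch against a local Scrapyd instance; the helper names are illustrative:

```python
import json
import urllib.parse
import urllib.request

SCRAPYD = "http://localhost:6800"  # default Scrapyd address

def schedule_body(project: str, spider: str, **kwargs) -> str:
    """Build the form body for POST /schedule.json."""
    return urllib.parse.urlencode({"project": project, "spider": spider, **kwargs})

def schedule(project: str, spider: str, **kwargs) -> dict:
    """POST /schedule.json — start a spider run; the response contains a job id."""
    data = schedule_body(project, spider, **kwargs).encode()
    with urllib.request.urlopen(f"{SCRAPYD}/schedule.json", data=data) as resp:
        return json.load(resp)

def list_jobs(project: str) -> dict:
    """GET /listjobs.json — pending, running, and finished jobs for a project."""
    query = urllib.parse.urlencode({"project": project})
    with urllib.request.urlopen(f"{SCRAPYD}/listjobs.json?{query}") as resp:
        return json.load(resp)
```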
To schedule the spider via API:
```
curl http://localhost:6800/schedule.json -d "project=hackeronescraper" -d "spider=scopesspider"
```

Items are exported according to the FEEDS setting in Scrapy or via command-line options when running the spider.
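A `FEEDS` entry in `settings.py` might look like the following; the output filename and commented field list are illustrative, not the project's actual configuration:

```python
# settings.py sketch: export scraped scope items to CSV via Scrapy's FEEDS setting
FEEDS = {
    "scopes.csv": {  # output path; placeholders like %(name)s and %(time)s also work
        "format": "csv",
        "overwrite": True,
        # "fields": ["program", "asset_identifier", "asset_type"],  # hypothetical item fields
    },
}
```

The equivalent one-off run without touching settings is `scrapy crawl scopesspider -O scopes.csv`, where `-O` overwrites the output file.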
