```
--> 🌐 Total
[+] New/ReNewed SSL Certs (ALL): +0
--> 🇳🇵 np_ccTLDs
[+] New/ReNewed SSL Certs (ALL): +0
[+] New/ReNewed SSL Certs (Edu): +0
[+] New/ReNewed SSL Certs (Gov|Mil): +0
[+] New/ReNewed SSL Certs (ISPs): +0
```

```
--> 🌐 Total
[+] New/ReNewed SSL Certs (ALL): +51053553
--> 🇳🇵 np_ccTLDs
[+] New/ReNewed SSL Certs (ALL): +23920
[+] New/ReNewed SSL Certs (Edu): +1915
[+] New/ReNewed SSL Certs (Gov|Mil): +461
[+] New/ReNewed SSL Certs (ISPs): +8
```
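A minimal sketch of how per-category counts like the ones above could be derived from a one-domain-per-line dump. The file `sample_domains.txt` and its three entries are fabricated purely for illustration; the real dumps are the `certstream_domains_*` files this repo publishes.

```bash
# Sketch: deriving per-category counts from a plain-text domain dump.
# "sample_domains.txt" and its contents are made up for this example.
printf '%s\n' "portal.gov.np" "college.edu.np" "shop.com.np" > "sample_domains.txt"

echo "ALL:     +$(wc -l < "sample_domains.txt")"
echo "Edu:     +$(grep -Eic '\.edu\.np$' "sample_domains.txt")"
echo "Gov|Mil: +$(grep -Eic '\.(gov|mil)\.np$' "sample_domains.txt")"
```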
- [Automated | UpToDate] Daily (@24 Hrs) Dumps of CertStream Certificate Logs Data
- All the Scripts & Tools used are OpenSource & Public; as such, all this comes with no Guarantees or Liabilities.
- Due to Github's File Size Limit, all data is Compressed using 7z.
- View Latest Data from the last 24 Hrs at: Raw/Latest
Download: `certstream_domains_latest.txt` (Warning: May Crash Browser)

```bash
# Download with wget
wget "https://r2-pub.prashansa.com.np/certstream_domains_latest.txt"

# View without Downloading (Spikes Memory Usage)
curl -qfsSL "https://r2-pub.prashansa.com.np/certstream_domains_latest.txt" | less
```
Parse (If for some reason, you want to do it manually)

```bash
# Create a Directory
mkdir "./certstream-latest" && cd "./certstream-latest"

# Download all .7z files
for url in $(curl -qfsSL "https://api.github.com/repos/Azathothas/CertStream-Domains/contents/Raw/Latest" -H "Accept: application/vnd.github.v3+json" | jq -r '.[].download_url'); do echo -e "\n[+] $url\n" && curl -qfLJO "$url"; done

# Extract all .7z files
# Install 7z: sudo curl -qfsSL "https://raw.githubusercontent.com/Azathothas/Toolpacks/main/x86_64/7z" -o "/usr/local/bin/7z" && sudo chmod +xwr "/usr/local/bin/7z"
find . -iname "*.7z" -exec sh -c '7z x "{}" -o"$(dirname "{}")/$(basename "{}" .7z)"' \;

# Cat all to a single text file
find . -type f -iname "certstream_domains.txt" -exec cat {} \; 2>/dev/null | sort -u -o "./certstream_domains_latest.txt" ; wc -l < "./certstream_domains_latest.txt"

# Del .7z files
find . -maxdepth 1 -type f -iname "certstream*.7z" -exec rm {} \; 2>/dev/null
```
- View Archival Data up to 7 Days at: Raw/Archive
Download: `certstream_domains_weekly.txt` (Warning: May Crash Browser)

```bash
# Download with wget
wget "https://r2-pub.prashansa.com.np/certstream_domains_weekly.txt"

# View without Downloading (DANGEROUS for your CPU/RAM)
curl -qfsSL "https://r2-pub.prashansa.com.np/certstream_domains_weekly.txt" | less
```
Parse (If for some reason, you want to do it manually)

```bash
# Create a Directory
mkdir "./certstream-7days" && cd "./certstream-7days"

# Download all .7z files
for url in $(curl -qfsSL "https://api.github.com/repos/Azathothas/CertStream-Domains/contents/Raw/Archive" -H "Accept: application/vnd.github.v3+json" | jq -r '.[].download_url'); do echo -e "\n[+] $url\n" && curl -qfLJO "$url"; done

# Extract all .7z files
# Install 7z: sudo curl -qfsSL "https://raw.githubusercontent.com/Azathothas/Toolpacks/main/x86_64/7z" -o "/usr/local/bin/7z" && sudo chmod +xwr "/usr/local/bin/7z"
find . -iname "*.7z" -exec sh -c '7z x "{}" -o"$(dirname "{}")/$(basename "{}" .7z)"' \;

# Cat all to a single text file
find . -type f -iname "certstream_domains.txt" -exec cat {} \; 2>/dev/null | sort -u -o "./certstream_domains_7days.txt" ; wc -l < "./certstream_domains_7days.txt"

# Del .7z files
find . -maxdepth 1 -type f -iname "certstream*.7z" -exec rm {} \; 2>/dev/null
```
- Do Whatever/However you want!
  - Blue Teamers: Monitor for Phishing Domains
  - Red Teamers || Bug Bounty Hunters: Monitor for new assets for your target
  - Statisticians || Chad Data Analysts: Have Fun
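For the Blue-Team use case, a minimal sketch of a keyword filter over a daily dump. The dump contents and the watched-keyword list here are made up for illustration; in practice you would run this against the real `certstream_domains_latest.txt`.

```bash
# Sketch: flagging possible phishing lookalikes in a daily dump.
# Both the file contents and the brand keywords are fabricated examples.
printf '%s\n' "paypa1-login.example" "secure-verify.example" "blog.example" > "certstream_domains_latest.txt"

# Any domain containing a watched keyword is worth a closer look
grep -Ei 'paypa(l|1)|verify|login' "certstream_domains_latest.txt" | sort -u
```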
- Info: Certificate Transparency Logs only list issuance of website certificates. This data may not necessarily indicate newly registered domains, as Certificates expire and are renewed frequently.
- Instead, use cemulus/crt to check their history:

```bash
# Install
sudo curl -qfsSL "https://raw.githubusercontent.com/Azathothas/Toolpacks/main/x86_64/crt" -o "/usr/local/bin/crt" && sudo chmod +xwr "/usr/local/bin/crt"

# Check
crt "$domain_name"
# Example: crt "rmb.info.np"

# Details
crt -json "$domain_name"
# Example: crt -json "rmb.info.np"
```
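Since renewals dominate the logs, another purely local approach is to diff each day's sorted dump against an accumulated history file to approximate first-seen domains. This is a sketch; the file names and contents are assumptions for illustration.

```bash
# Sketch: approximating "first seen" domains with comm(1).
# "history.txt" and "today.txt" (and their contents) are made up here.
printf '%s\n' "old.example" "stable.example" | sort > "history.txt"
printf '%s\n' "brandnew.example" "old.example" | sort > "today.txt"

# Lines only in today's file = never seen before (comm needs sorted input)
comm -13 "history.txt" "today.txt"

# Fold today's domains into the history for the next run
sort -u "history.txt" "today.txt" -o "history.txt"
```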
Note: This is just an example; the full data contains logs from every country (TLD), Worldwide.

```bash
# Ref: https://register.com.np/np-ccTLDs
# com.np | coop.np | edu.np | gov.np | info.np | mil.np | name.np | net.np | org.np
# ISPs: CG Net | ClassicTech | Ncell | NTC | Subisu | Vianet | WorldLink

# Parsed: (Main)
grep -Ei 'com\.np|coop\.np|edu\.np|gov\.np|info\.np|mil\.np|name\.np|net\.np|org\.np' "certstream_domains_latest.txt" | sort -u

# Parsed: (ISPs)
grep -i 'cgnet.com.np\|classic.com.np\|ncell.axiata.com\|ncell.com.np\|nettv.com.np\|ntc.net.np\|snpl.net.np\|subisu.net.np\|vianet.com.np\|via.net.np\|viatv.com.np\|wlink.com.np\|wlinktech.com.np\|worldlink.com.np' "certstream_domains_latest.txt" | sort -u

# Grep for something Particular
# Example: List only .gov
grep -Ei 'gov\.np' "certstream_domains_np_24h.txt" | sort -u

# DL:
# ALL
wget "https://raw.githubusercontent.com/Azathothas/CertStream-Domains/main/Data/np_ccTLDs/certstream_domains_np_all_24h.txt"
# View:
curl -qfsSL "https://raw.githubusercontent.com/Azathothas/CertStream-Domains/main/Data/np_ccTLDs/certstream_domains_np_all_24h.txt" | less

# Only edu.np
wget "https://raw.githubusercontent.com/Azathothas/CertStream-Domains/main/Data/np_ccTLDs/certstream_domains_np_edu_24h.txt"
# View:
curl -qfsSL "https://raw.githubusercontent.com/Azathothas/CertStream-Domains/main/Data/np_ccTLDs/certstream_domains_np_edu_24h.txt" | less

# Only gov.np | mil.np
wget "https://raw.githubusercontent.com/Azathothas/CertStream-Domains/main/Data/np_ccTLDs/certstream_domains_np_gov_mil_24h.txt"
# View:
curl -qfsSL "https://raw.githubusercontent.com/Azathothas/CertStream-Domains/main/Data/np_ccTLDs/certstream_domains_np_gov_mil_24h.txt" | less

# Only ISP
wget "https://raw.githubusercontent.com/Azathothas/CertStream-Domains/main/Data/np_ccTLDs/certstream_domains_np_isp_24h.txt"
# View:
curl -qfsSL "https://raw.githubusercontent.com/Azathothas/CertStream-Domains/main/Data/np_ccTLDs/certstream_domains_np_isp_24h.txt" | less
```
- Azathothas/CertStream-Domains is an Append-Only ([RFC 6962](https://datatracker.ietf.org/doc/html/rfc6962)) dump of logs.
- Azathothas/CertStream-Domains only extracts SAN & CN from ct-logs.
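To see what the extracted CN & SAN fields actually look like, here is a sketch inspecting a throwaway self-signed certificate with openssl, so no network access is needed (the `-addext` and `-ext` options assume OpenSSL 1.1.1 or newer).

```bash
# Sketch: generate a self-signed cert with a CN and two SANs,
# then print the two fields this repo extracts from ct-logs.
# "example.test" is a made-up domain for illustration.
openssl req -x509 -newkey rsa:2048 -nodes -keyout "key.pem" -out "cert.pem" \
  -days 1 -subj "/CN=example.test" \
  -addext "subjectAltName=DNS:example.test,DNS:www.example.test" 2>/dev/null

# CN (Common Name)
openssl x509 -in "cert.pem" -noout -subject
# SAN (Subject Alternative Names)
openssl x509 -in "cert.pem" -noout -ext subjectAltName
```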
- It is not a database for pre-existing ones.
- There exist a million projects that do the Collection/Database thing a million times better than this repo could ever do. So look elsewhere if you want a DB of certificates & all the data.
- There used to be internetwache/CT_subdomains, which was very similar to this repo. But it didn't list everything, and it also hasn't been updated since Oct 13, 2021. Read their Blog.
- crt.sh also monitors the same logs, but there's a delay (usually 24 Hrs) until it shows up in results. Furthermore, you will have to use additional filters to only list newly issued/renewed certs.
- Services like SSLMate, Report-Uri & SecurityTrails either monitor only your domains || do not provide all the data || put it behind Paywalls.
- SSLMate has open-sourced their own monitor, SSLMate/certspotter, but the data is behind a paywall.
- certstream.calidog.io uses its own Server to fetch all logs, exposing wss://certstream.calidog.io for libraries. Azathothas/certstream is a simple cli that uses the go library.
- List of logs monitored: https://www.gstatic.com/ct/log_list/v3/all_logs_list.json
- Use something like mouday/domain-admin if looking to monitor only specific domains.
- Use something like letsencrypt/ct-woodpecker if looking for detailed output with stats & monitors (Prometheus) for Production.
- Use something like drfabiocastro/certwatcher if looking for Automation. This is essentially nuclei for cert-logs.
- The Hacker's Choice for providing servers on segfault & being so generous.
- Telegram :
@thcorg
| Github : https://github.com/hackerschoice