555Russich/proxy_master
This module provides asynchronous scraping of HTTP, HTTPS, SOCKS4 and SOCKS5 proxies from websites such as:

  1. free-proxy-list.net
  2. geonode.com
  3. hidemy.name

At the time of writing, proxy-master scraped 23265 proxies from the websites above. 18383 of them are unique by ip:port, but some servers support several protocols.
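
As a rough illustration of the approach, the sketch below fetches one proxy-list page asynchronously with aiohttp and parses it with bs4/lxml. The URL, the table layout and the column order are assumptions for the example, not the module's actual scraping code.

import asyncio

import aiohttp
from bs4 import BeautifulSoup

# Hypothetical source URL; the real scrapers and their selectors live inside proxy_master.
URL = 'https://free-proxy-list.net/'

async def scrape_one_source() -> list[str]:
    async with aiohttp.ClientSession() as session:
        async with session.get(URL) as resp:
            html = await resp.text()
    soup = BeautifulSoup(html, 'lxml')
    proxies = []
    # Assumption: the page exposes a plain HTML table whose first two columns are ip and port.
    for row in soup.select('table tbody tr'):
        cells = [td.get_text(strip=True) for td in row.find_all('td')]
        if len(cells) >= 2:
            proxies.append(f'{cells[0]}:{cells[1]}')
    return proxies

proxies = asyncio.run(scrape_one_source())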

I add more websites to scrape from time to time, but I am looking for help and advice from people with experience.

Dependencies

Python 3.10 or higher is required, because structural pattern matching is used in the project.
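
For context, structural pattern matching (match/case) only exists since Python 3.10. A purely illustrative snippet of that syntax, not taken from the project's code:

def default_port(protocol: str) -> int:
    # Illustrative only: map a protocol name to a typical default port
    # using the match/case syntax that requires Python 3.10+.
    match protocol:
        case 'http' | 'https':
            return 8080
        case 'socks4' | 'socks5':
            return 1080
        case _:
            raise ValueError(f'unknown protocol: {protocol}')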

The following packages will be installed automatically with pip:

aiohttp bs4 lxml pycountry aiohttp_socks

Installation

pip install proxy-master

Usage

  1. This will create proxy_master.json in your home directory and return a list of proxies.
import proxy_master as pm
proxies = pm.get_proxies(
    protocol='http',
    do_prints=True
)

Then you can test proxies:

working_proxies = pm.test_proxies(
    proxies,
    proxy_protocol='http',
    website_protocol='http'
)

Features

  • Scrape different types of proxies, including https, socks4, socks5
  • Recursive scraping: use already collected proxies to scrape other websites
  • test_public_ip(...) supports socks4 and socks5 via aiohttp-proxy (see the sketch after this list)
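
The feature above mentions aiohttp-proxy, while the dependency list in this README installs aiohttp_socks; the sketch below uses aiohttp_socks' ProxyConnector to check the public IP seen through a SOCKS5 proxy. The proxy address and the IP echo service (api.ipify.org) are placeholders, and check_public_ip is not part of the package's API.

import asyncio

import aiohttp
from aiohttp_socks import ProxyConnector

async def check_public_ip(proxy_url: str) -> str:
    # Route the request through the given proxy and return the IP address the website sees.
    connector = ProxyConnector.from_url(proxy_url)
    async with aiohttp.ClientSession(connector=connector) as session:
        async with session.get('https://api.ipify.org') as resp:
            return await resp.text()

# Example: a SOCKS5 proxy taken from the list returned by pm.get_proxies(...)
print(asyncio.run(check_public_ip('socks5://1.2.3.4:1080')))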

Plans