Asynchronous proxy scraper written in Python. Supports various proxy lists.
- Proxy checker
- Add more proxy lists
- Replace Selenium with pyppeteer
- Create a Python file in `scrapers/`
- Create a class `Info` with the following variables:

```python
class Info:
    name = ""
    supported_types = ["http", "https", "socks4", "socks5"]

    @staticmethod
    async def geturl(ptype: str):
        return f"https://domain.com/{ptype}"
```

  - `supported_types`: a list of strings
  - `geturl()`: a function that returns the URL for the proxy list, where `ptype` is the proxy type
- Then create a function `scrape`, which should return `None`. It must take 2 arguments: `output` and `ptype`, both strings. Output should go into `await wtf(output, proxy)`
- Done!
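Putting the steps above together, a minimal scraper file might look like the sketch below. The list name, URL pattern, and the canned proxy line are placeholders, and `wtf()` is stubbed out here only so the sketch is self-contained; in the real repo you would call the project's own `wtf` helper instead.

```python
import asyncio

# Stub standing in for the project's wtf() helper, which receives the
# output target and one scraped proxy. Here it just records the calls.
collected = []

async def wtf(output: str, proxy: str) -> None:
    collected.append((output, proxy))

class Info:
    name = "example-list"  # hypothetical list name
    supported_types = ["http", "socks5"]

    @staticmethod
    async def geturl(ptype: str):
        # Hypothetical URL pattern; a real scraper points at an actual list.
        return f"https://example.com/proxies/{ptype}.txt"

async def scrape(output: str, ptype: str) -> None:
    url = await Info.geturl(ptype)
    # A real scraper would fetch `url` (e.g. with aiohttp) and parse
    # proxies out of the response body; a canned line is used here.
    fetched = ["127.0.0.1:8080"]
    for proxy in fetched:
        await wtf(output, proxy)

asyncio.run(scrape("out.txt", "http"))
```

The key contract is that `scrape` itself returns `None` and hands every proxy it finds to `wtf`, so the rest of the project stays in charge of where output actually goes.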