ProxyPool

Crawl and validate proxies from the Internet

Features

Randomized User-Agent headers via fake-useragent (https://github.com/hellysmile/fake-useragent)
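
A minimal sketch of what the fake-useragent library provides (the repository also includes a ua.json data file for it); the crawler uses it to vary the User-Agent on each request:

from fake_useragent import UserAgent

ua = UserAgent()   # loads a database of real browser User-Agent strings
print(ua.random)   # a different User-Agent string on each access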

Requirement

requests
gevent
lxml
beautifulsoup4
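
The dependencies can be installed with pip, for example:

pip install requests gevent lxml beautifulsoup4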

How to use

Just run in a terminal:

python proxypool.py

then open

http://localhost:8000

or with query parameters (the area value 北京 means Beijing):

http://localhost:8000/?num=1&port=80&type=3&protocol=http&minscore=0&area=北京
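
For programmatic access, here is a minimal sketch of querying the pool from Python with the requests library; the exact response format is defined in api.py, so adjust the parsing to what your instance actually returns:

import requests

# Ask the pool for one high-anonymity HTTP proxy (type=3, see "Other" below)
resp = requests.get(
    "http://localhost:8000/",
    params={"num": 1, "type": 3, "protocol": "http", "minscore": 0},
)
print(resp.text)  # inspect the raw response; parse according to api.py

# Hypothetical use of a returned "ip:port" entry with requests:
# proxies = {"http": "http://1.2.3.4:8080"}
# requests.get("http://example.com", proxies=proxies, timeout=10)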

Other

the parameter "type" means anonymous level
0: unknown
1: transparent
2: anonymous
3: high anonymous