# simplefetch

A simple HTTP client library. Initially forked from lyxint/urlfetch, it is now an independent project because of its different internal architecture.

Example of using the `get` alias for the GET method:

```python
>>> import simplefetch
>>> resp = simplefetch.get('http://devel.ownport.net')
>>> resp.headers
{'transfer-encoding': 'chunked', 'expires': 'Wed, 18 Jul 2012 04:58:49 GMT', 'server': 'GSE', 'last-modified': 'Wed, 11 Jul 2012 05:51:27 GMT', 'connection': 'Keep-Alive', 'etag': '"1fc3cfe5-7483-4765-8f67-eee40b813abc"', 'cache-control': 'private, max-age=0', 'date': 'Wed, 18 Jul 2012 04:58:49 GMT', 'content-type': 'text/html; charset=UTF-8'}
>>> len(resp.content)
86641
```

Making several requests to the same host without re-establishing the connection:

```python
>>> conn = simplefetch.Connection(scheme='http', host='devel.ownport.net')
>>> conn.request('GET', '/', None, {})
>>> resp = conn.response()
>>> resp.headers
{'transfer-encoding': 'chunked', 'expires': 'Wed, 18 Jul 2012 05:11:54 GMT', 'server': 'GSE', 'last-modified': 'Wed, 11 Jul 2012 05:51:27 GMT', 'connection': 'Keep-Alive', 'etag': '"1fc3cfe5-7483-4765-8f67-eee40b813abc"', 'cache-control': 'private, max-age=0', 'date': 'Wed, 18 Jul 2012 05:11:54 GMT', 'content-type': 'text/html; charset=UTF-8'}
>>> len(resp.content)
86641
>>> conn.request('GET', '/search/label/python', None, {})
>>> resp = conn.response()
>>> resp.headers
{'transfer-encoding': 'chunked', 'expires': 'Wed, 18 Jul 2012 05:21:02 GMT', 'server': 'GSE', 'last-modified': 'Wed, 11 Jul 2012 05:51:27 GMT', 'connection': 'Keep-Alive', 'etag': '"1fc3cfe5-7483-4765-8f67-eee40b813abc"', 'cache-control': 'private, max-age=0', 'date': 'Wed, 18 Jul 2012 05:21:02 GMT', 'content-type': 'text/html; charset=UTF-8'}
>>> len(resp.content)
127584
```

Using custom headers:

```python
>>> simplefetch.get('http://devel.ownport.net', headers={'User-Agent': 'simplefetch/0.3.2'})
```

Using custom headers (via the `Headers` class):

```python
>>> headers = simplefetch.Headers()
>>> headers.basic_auth('username', 'password')
>>> simplefetch.get('http://www.example.com', headers=headers.items())
```
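The `basic_auth` helper presumably emits a standard RFC 7617 `Authorization` header; building the same header by hand looks like this (a sketch of the convention, not simplefetch's actual code):

```python
import base64

def basic_auth_header(username, password):
    """Build an RFC 7617 Basic authentication header.

    Hypothetical helper for illustration; simplefetch's Headers.basic_auth
    presumably produces an equivalent header.
    """
    token = base64.b64encode(f'{username}:{password}'.encode('utf-8')).decode('ascii')
    return {'Authorization': 'Basic ' + token}

print(basic_auth_header('username', 'password'))
# → {'Authorization': 'Basic dXNlcm5hbWU6cGFzc3dvcmQ='}
```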

Automatic proxy support:

```python
>>> resp = simplefetch.get('http://devel.ownport.net')
>>> resp.headers
{'via': '1.1 PROXY', 'proxy-connection': 'Keep-Alive', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'expires': 'Wed, 18 Jul 2012 05:37:59 GMT', 'server': 'GSE', 'last-modified': 'Wed, 11 Jul 2012 05:51:27 GMT', 'connection': 'Keep-Alive', 'etag': '"1fc3cfe5-7483-4765-8f67-eee40b813abc"', 'cache-control': 'private, max-age=0', 'date': 'Wed, 18 Jul 2012 05:37:59 GMT', 'content-type': 'text/html; charset=UTF-8', 'x-xss-protection': '1; mode=block'}
>>> len(resp.content)
86641
```

or via the `Connection` class:

```python
>>> conn = simplefetch.Connection(scheme='http')
>>> conn.request('GET', 'http://devel.ownport.net', None, {})
>>> resp = conn.response()
>>> resp.headers
{'via': '1.1 PROXY', 'proxy-connection': 'Keep-Alive', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'expires': 'Wed, 18 Jul 2012 05:37:59 GMT', 'server': 'GSE', 'last-modified': 'Wed, 11 Jul 2012 05:51:27 GMT', 'connection': 'Keep-Alive', 'etag': '"1fc3cfe5-7483-4765-8f67-eee40b813abc"', 'cache-control': 'private, max-age=0', 'date': 'Wed, 18 Jul 2012 05:37:59 GMT', 'content-type': 'text/html; charset=UTF-8', 'x-xss-protection': '1; mode=block'}
>>> len(resp.content)
86641
```

## Specifications (doctests)

## TODO

- Sometimes the encoding reported by the server is wrong; add a mapping of known bad encodings.
- Handle exceptions for `h.request`/`h.response`.
- Non-blocking, thread-safe operation (is it really needed?). Instead of sharing an HTTP connection between threads, one connection per thread is used.
- Working in a pool (thread support).