Feedparser seems to occasionally hang and has no timeout #76

peterashwell opened this Issue Jul 10, 2016 · 2 comments


According to this, the default timeout in urllib2 is -1, i.e. None. So this is a problem for long-running programs: occasionally some connection will hang everything.

The solution is pretty simple: add a timeout to the 'open' call here

f = opener.open(request)
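For reference, a minimal sketch (Python 3, where the old urllib2 machinery lives in urllib.request; the 15-second value is just an illustration, not anything feedparser prescribes) showing that the opener already accepts a timeout keyword, so the call site only needs to pass one:

```python
import inspect
import urllib.request  # Python 3 home of the old urllib2 opener machinery

# OpenerDirector.open() already takes a timeout keyword, so the fix is
# simply to pass one at the feedparser call site, e.g.:
#   f = opener.open(request, timeout=15)
# Confirm the parameter exists on this Python version:
params = inspect.signature(urllib.request.OpenerDirector.open).parameters
print("timeout" in params)  # True
```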

I'll fork and try to make a fix.

rigid commented Feb 19, 2017

This issue seems like a real problem, as there seems to be no clean workaround. Can't wait for the next release because of this.

darklow commented Jul 4, 2017

If you want a quick workaround, you can monkey-patch feedparser to use the requests lib instead, with a proper timeout. It also fixes the HTTPS certificate issues I had with feedparser's default URL-opening implementation. This is how I do it:

import requests
import feedparser

def _fetch_with_requests(url, *args, **kwargs):
    # Ignore feedparser's extra arguments; fetch with a hard 15 s timeout
    # and hand the body back as the file-like object feedparser expects.
    return feedparser._StringIO(requests.get(url, timeout=15).content)

feedparser._open_resource = _fetch_with_requests
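An alternative stopgap that needs no monkey-patching is the stdlib's process-wide default socket timeout. It is a blunt instrument, since it affects every socket the program creates, and the 15-second value below is again just an illustration:

```python
import socket

# Any socket created without an explicit timeout -- including the ones
# urllib2/urllib.request opens inside feedparser -- will now raise
# socket.timeout instead of hanging forever.
socket.setdefaulttimeout(15)
print(socket.getdefaulttimeout())  # 15.0
```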