Feedparser seems to occasionally hang and has no timeout #76

Open
peterashwell opened this Issue Jul 10, 2016 · 2 comments


peterashwell commented Jul 10, 2016

According to this, the default timeout in urllib2 is -1, or None, so a socket can block indefinitely. This is a problem for long-running programs, where occasionally some connection will hang everything.

The solution is pretty simple: add a timeout to the 'open' call here:

f = opener.open(request)
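
For illustration, a minimal sketch of that fix, assuming Python 2's urllib2 (which feedparser uses internally) and a hypothetical feed URL; the 10-second value is an arbitrary example, not a feedparser default:

import socket
import urllib2

TIMEOUT = 10  # seconds; arbitrary example value, not a feedparser default

opener = urllib2.build_opener()
request = urllib2.Request('https://example.com/feed.xml')  # hypothetical URL

try:
    # urllib2's OpenerDirector.open accepts a timeout keyword argument
    f = opener.open(request, timeout=TIMEOUT)
except socket.timeout:
    # A hung connection now raises here instead of blocking the program forever
    raise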

I'll fork and try to make a fix.

rigid commented Feb 19, 2017

This issue seems like a real problem, since there appears to be no clean workaround. Can't wait for the next release because of that.
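
One blunt stopgap, added here as a sketch rather than anything from this thread, is a process-wide default socket timeout. It is not a clean workaround because it affects every socket in the program, but it does stop urllib2-based fetches from blocking forever:

import socket
import feedparser

# Process-wide default timeout: applies to ALL sockets, not just feedparser's
socket.setdefaulttimeout(15)

d = feedparser.parse('https://example.com/feed.xml')  # hypothetical feed URL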

darklow commented Jul 4, 2017

If you want a quick workaround, you can monkey-patch feedparser to use the requests library instead, with a proper timeout. It also fixes HTTPS certificate issues I had with feedparser's default URL-opening implementation. This is how I do it:

import requests
import feedparser

# Replace feedparser's internal URL opener with one backed by requests,
# which enforces a 15-second timeout and handled the HTTPS certificate
# issues for me. _open_resource and _StringIO are private feedparser internals.
feedparser._open_resource = lambda *args, **kwargs: feedparser._StringIO(requests.get(args[0], timeout=15).content)
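
For completeness, a usage sketch under the same assumptions (feedparser 5.x private internals, hypothetical URL): with the patch above in place, parse() traps fetch exceptions, so a timed-out request is reported through the bozo fields instead of hanging:

import requests
import feedparser

d = feedparser.parse('https://example.com/feed.xml')  # hypothetical feed URL
if d.bozo and isinstance(d.bozo_exception, requests.exceptions.Timeout):
    print('fetch timed out after 15 seconds instead of hanging')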