Feedparser seems to occasionally hang and has no timeout #76
This issue seems like a real problem, since there appears to be no clean workaround. Can't wait to see the next release because of that.
If you want a quick workaround, you can monkey-patch:

```python
import requests
import feedparser

feedparser._open_resource = lambda *args, **kwargs: feedparser._StringIO(
    requests.get(args[0], timeout=5).content
)
```

Update: on versions 6.x and above, use the following instead (note that `headers` must be defined by the caller):

```python
import requests
import feedparser

feedparser.api._open_resource = lambda *args, **kwargs: requests.get(
    args[0], headers=headers, timeout=5
).content
```
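An alternative to monkey-patching private internals is to fetch the feed yourself with an explicit timeout and hand the raw bytes to `feedparser.parse()`, which accepts a document body as well as a URL. A minimal stdlib-only sketch (the helper name `fetch_with_timeout` is mine, not from feedparser):

```python
import urllib.request

def fetch_with_timeout(url, timeout=5):
    """Fetch raw feed bytes with a hard network timeout.

    The returned bytes can then be passed to feedparser.parse(),
    bypassing feedparser's own (timeout-less) URL fetching entirely.
    """
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.read()

# Usage (sketch):
#   feed = feedparser.parse(fetch_with_timeout("https://example.com/feed.xml"))
```

This avoids reaching into `feedparser._open_resource`, which is private and has already changed location between major versions.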
The above did the job for my error: I have a very simple polling app, and once in a while feedparser does not return and needs two ^C presses to exit the script, after which it prints:

Not sure if this is related, or whether the above fixes it. I am using the latest pip-installed version.
Having a broken implementation leads devs to write workarounds like this, which then have their own issues, and other devs just copy-paste the wrong solutions. This one, for instance, will ignore …
feedparser has dropped all of its custom HTTP client code in favor of the … I'm closing this issue for that reason.
According to this, the default timeout in urllib2 is -1, i.e. None. So this is a problem for long-running programs, where occasionally a connection will hang everything.

The solution is pretty simple: add a timeout to the `open` call here:
`feedparser/feedparser/http.py`, line 175 in `39a7157`
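Until such a fix lands, a process-wide mitigation is possible from the calling program: `socket.setdefaulttimeout()` makes every socket created without an explicit timeout (including those opened deep inside urllib on feedparser's behalf) inherit a timeout instead of blocking forever. A minimal sketch:

```python
import socket

# urllib's default timeout is None, i.e. block indefinitely. Setting a
# global default means any subsequently created socket without its own
# timeout will raise socket.timeout instead of hanging a long-running
# poller forever.
socket.setdefaulttimeout(10)
```

The trade-off is that this is global: it affects every library in the process that opens sockets without an explicit timeout, so a per-request timeout (as proposed above) remains the cleaner fix.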
I'll fork and try to make a fix.