
"Connection pool is full, discarding connection" warning #49

Closed
jjsendor opened this issue Feb 3, 2016 · 4 comments

@jjsendor
Contributor

jjsendor commented Feb 3, 2016

This warning pops up sometimes when running the OSXCollector Analyze Filter:

requests.packages.urllib3.connectionpool: WARNING  Connection pool is full, discarding connection: investigate.api.opendns.com

I'm not sure whether this means that a particular connection is dropped and the results for that domain will never be obtained.

@jjsendor
Contributor Author

jjsendor commented Mar 13, 2017

As per the requests documentation, in the module requests.packages.urllib3.connectionpool:

If the pool is already full, the connection is closed and discarded because we exceeded maxsize. If connections are discarded frequently, then maxsize should be increased.

I don't think this is a big problem, but we may need to increase the pool size so that connections are not discarded so often.
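
For illustration, a minimal sketch of raising maxsize when using urllib3 directly; the host is the one from the warning above, and maxsize=50 is an arbitrary example value, not what our code actually uses:

import urllib3

# maxsize controls how many connections the pool keeps open per host;
# raising it prevents connections from being discarded when many
# requests to the same host are in flight at once.
pool = urllib3.HTTPSConnectionPool('investigate.api.opendns.com', maxsize=50)
response = pool.request('GET', '/')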

@blischalk

I am receiving the following warning many times when trying to get reports for ~2000 domains:

WARNING:urllib3.connectionpool:Connection pool is full, discarding connection: www.virustotal.com

Has this connection pool issue been resolved?

I get it whether I send all 2000 domains at once or iterate over them, making an API call per domain.

#!/usr/bin/python
import os

from threat_intel.virustotal import VirusTotalApi

# Look up one resource per request so each domain is queried individually.
vt = VirusTotalApi(os.environ["VIRUS_TOTAL"], resources_per_req=1)

# Read the domain list, stripping trailing newlines.
with open("questionable_domains.txt") as fo:
    domains = [line.strip() for line in fo]

# Sending all domains as a single batch triggers the same warning:
# print(vt.get_domain_reports(domains))

for domain in domains:
    print(vt.get_domain_reports(domain))

@jjsendor jjsendor self-assigned this Dec 5, 2017
@leeren leeren reopened this Dec 6, 2017
@leeren
Contributor

leeren commented Dec 6, 2017

The cause of this issue was that the customized SSL adapter mounted on each session used a maximum HTTPConnectionPool size of 10. This meant that each connection pool for a given host could only issue 10 requests asynchronously per batch before filling up and discarding connections. The fix was as simple as having the adapter take a maximum pool size equal to the larger of the grequests batch size and 10 (the default max pool size used by the HTTP adapter).
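
For reference, a minimal sketch of that kind of fix using requests' HTTPAdapter; batch_size here is a hypothetical value standing in for the grequests batch size, not the library's actual code:

import requests
from requests.adapters import HTTPAdapter

DEFAULT_POOL_SIZE = 10  # requests' default HTTPAdapter pool size
batch_size = 100        # hypothetical grequests batch size

# Size the per-host pool to the batch size so that async requests in a
# batch never exceed the pool and force connections to be discarded.
pool_size = max(batch_size, DEFAULT_POOL_SIZE)
adapter = HTTPAdapter(pool_connections=pool_size, pool_maxsize=pool_size)

session = requests.Session()
session.mount('https://', adapter)
session.mount('http://', adapter)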

@leeren
Contributor

leeren commented Dec 6, 2017

Fixed with #74.

@leeren leeren closed this as completed Dec 6, 2017