
Web API gets stuck when requesting a lot of torrent properties in a short time #10487

Closed
jerrymakesjelly opened this Issue Apr 13, 2019 · 6 comments

jerrymakesjelly commented Apr 13, 2019

qBittorrent version and Operating System

qBittorrent v4.1.5 on Ubuntu 18.04

Actually I installed qBittorrent from a Docker image: https://hub.docker.com/r/linuxserver/qbittorrent.

The build version is Linuxserver.io version:- 4.1.5.99201903251247-6693-8f6c305ubuntu18.04.1-ls14 Build-date:- 2019-04-03T04:02:19+00:00.

If on linux, libtorrent and Qt version

libtorrent 1.1.11 and Qt 5, I guess.

What is the problem

I have 500 small torrents (95 KiB to 36.5 MiB) in my qBittorrent.

I'm using a Python script to query the general properties and the trackers of all the torrents, but it randomly gets stuck on one of the queries. The Web API gives no response and won't close the HTTP connection until I kill the script manually.

What is the expected behavior

All the queries should be completed normally.

Steps to reproduce

  1. Add 500 torrents to qBittorrent.
  2. Run this script with Python 3.
    The script and qBittorrent must run on the same computer; running the script against qBittorrent on a remote host does not reproduce this problem.
import requests
import time

host = 'http://127.0.0.1:8080' # Your Host
username = 'admin'             # Your Username
password = 'adminadmin'        # Your Password

# Log in
cookies = requests.post(host+'/login', data={'username':username, 'password':password}, headers={'Referer': host}).cookies

# Get torrent list
for torrent in requests.get(host+'/query/torrents', cookies=cookies, headers={'Referer': host}).json():
    # Get the general properties
    requests.get(host+'/query/propertiesGeneral/'+torrent['hash'], cookies=cookies, headers={'Referer': host}).json()
    # Get the trackers
    requests.get(host+'/query/propertiesTrackers/'+torrent['hash'], cookies=cookies, headers={'Referer': host}).json()
    # Print the torrent name
    print(torrent['name'])
    # If you don't sleep for a while, the Web API will be very likely to get stuck
    # time.sleep(0.01)
  3. If the script doesn't get stuck, run it a few more times.

Extra info(if any)

If you uncomment the last line of the code above (the time.sleep(0.01) call), the script works well.
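As an aside not from the original report: while waiting for a server-side fix, the client can at least avoid hanging forever by passing Requests' timeout parameter, which turns a stuck query into an exception. A minimal sketch, using the same host and endpoint as the script above (the fetch_json helper is hypothetical):

```python
import requests

def fetch_json(url, **kwargs):
    """GET a Web API endpoint; return parsed JSON, or None on error/timeout."""
    try:
        # timeout=(connect, read) in seconds: a stuck query raises
        # requests.Timeout instead of blocking the script indefinitely.
        r = requests.get(url, timeout=(3.05, 10), **kwargs)
        r.raise_for_status()
        return r.json()
    except requests.RequestException:
        return None

host = 'http://127.0.0.1:8080'  # your host
torrents = fetch_json(host + '/query/torrents', headers={'Referer': host})
print(torrents is None or isinstance(torrents, list))
```

This doesn't fix the stall, but it lets the caller retry or back off instead of waiting forever on a dead connection.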


Kolcha commented Apr 15, 2019

I also faced a similar issue some time ago, also when the torrent count went above ~500.
I also use Python to query all torrents and some info about each torrent, and sometimes my script got stuck.
I ran some tests and found that the problem goes away when a requests.Session object is used. Now I have ~1500 torrents and the script querying their info works fine. So, as a workaround, try using a requests.Session object.


jerrymakesjelly commented Apr 15, 2019


Nice, thanks @Kolcha! I have solved this problem by using requests.Session too.
Here is my new code:

import requests
import time

host = 'http://127.0.0.1:8080' # Your Host
username = 'admin'             # Your Username
password = 'adminadmin'        # Your Password

sess = requests.Session()
sess.headers.update({'Referer': host})

# Log in
sess.post(host+'/login', data={'username':username, 'password':password})

# Get torrent list
for torrent in sess.get(host+'/query/torrents').json():
    # Get the general properties
    sess.get(host+'/query/propertiesGeneral/'+torrent['hash']).json()
    # Get the trackers
    sess.get(host+'/query/propertiesTrackers/'+torrent['hash']).json()
    # Print the torrent name
    print(torrent['name'])

As mentioned in the Requests documentation, requests.Session uses urllib3's connection pool to reuse TCP connections and improve performance, so I think this problem may be caused by too many connections being created in a short time.
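The reuse behavior can be demonstrated with the standard library alone: a single http.client.HTTPConnection keeps one TCP socket open across several keep-alive requests, which is essentially what requests.Session does through urllib3's pool. A self-contained sketch against a throwaway local server (not qBittorrent):

```python
import http.client
import http.server
import threading

class Handler(http.server.BaseHTTPRequestHandler):
    protocol_version = 'HTTP/1.1'  # enable keep-alive, like a real Web API server

    def do_GET(self):
        body = b'ok'
        self.send_response(200)
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

server = http.server.ThreadingHTTPServer(('127.0.0.1', 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# One HTTPConnection == one TCP socket, reused across requests --
# analogous to what requests.Session does via urllib3 pooling.
conn = http.client.HTTPConnection('127.0.0.1', server.server_address[1])
sockets = set()
for _ in range(5):
    conn.request('GET', '/')
    resp = conn.getresponse()
    resp.read()                 # drain the body so the connection can be reused
    sockets.add(conn.sock)      # same socket object every iteration
conn.close()
server.shutdown()
print(len(sockets))  # 1 -> a single TCP connection served all 5 requests
```

Without pooling, each requests.get() opens (and half-closes) a fresh TCP connection, which matches the "burst of connections" the server-side fix addresses.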

Chocobo1 added a commit to Chocobo1/qBittorrent that referenced this issue Apr 15, 2019

Remove closed connections immediately
Previously it relied on a timer to drop dead connections, but that proved to
be too slow when there is an incoming burst of connections.

Fixes qbittorrent#10487.

Chocobo1 commented Apr 15, 2019

@jerrymakesjelly
I've submitted a patch, #10492. Could you try it out if possible? I don't have that many torrents to test with locally.
That PR targets the master branch, but you should be able to cherry-pick it onto v4_1_x easily.

Kolcha commented Apr 15, 2019

@Chocobo1, confirmed: #10492 fixed this issue.
I can use my old script (without a requests.Session object); I ran it ~10 times, requesting info for ~1500 torrents. All I saw were a few very short (< 1 sec) freezes.
Without the fix, my script got stuck almost every time I ran it.


Chocobo1 commented Apr 16, 2019

@Kolcha
Thanks for testing it! Now I can merge it with confidence.

Chocobo1 added a commit to Chocobo1/qBittorrent that referenced this issue Apr 16, 2019

Remove closed connections immediately
Previously it relied on a timer to drop dead connections but that proved to
be too slow when there is an incoming burst of connections.

Fixes qbittorrent#10487.

jerrymakesjelly commented Apr 16, 2019

Sorry for the late reply. I have also tested it, and the results show that #10492 fixed this issue.

It seems that the previous API is deprecated in the v4.2.0 alpha, so I rewrote the script with API v2, without requests.Session. I ran the script about 5 times against two clients holding the same torrents (above 500). In my test, the script got stuck every time on v4.1.5, but there was no problem on the new version with your patch.
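For reference, a minimal sketch of what an API v2 version of the script could look like. This is a hypothetical reconstruction (the rewritten script was not posted); only the endpoint paths are taken from the v2 Web API.

```python
# Hypothetical reconstruction -- the actual rewritten script was not posted.
# Endpoint paths follow the qBittorrent Web API v2.

API = {
    'login':      '/api/v2/auth/login',
    'torrents':   '/api/v2/torrents/info',
    'properties': '/api/v2/torrents/properties',
    'trackers':   '/api/v2/torrents/trackers',
}

def endpoint(host, name, torrent_hash=None):
    """Build a full API v2 URL, appending ?hash=... for per-torrent queries."""
    url = host + API[name]
    if torrent_hash is not None:
        url += '?hash=' + torrent_hash
    return url

def main(host='http://127.0.0.1:8080',
         username='admin', password='adminadmin'):
    import requests  # third-party; the same library as the scripts above

    # API v2 keeps the session in a cookie returned by the login endpoint.
    cookies = requests.post(endpoint(host, 'login'),
                            data={'username': username, 'password': password},
                            headers={'Referer': host}).cookies
    for torrent in requests.get(endpoint(host, 'torrents'),
                                cookies=cookies).json():
        requests.get(endpoint(host, 'properties', torrent['hash']),
                     cookies=cookies).json()
        requests.get(endpoint(host, 'trackers', torrent['hash']),
                     cookies=cookies).json()
        print(torrent['name'])

# main()  # uncomment to run against a live client
```

Note that this version deliberately avoids requests.Session, matching the test described above: with the patch applied, the burst of short-lived connections no longer stalls the Web API.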

Thanks to @Chocobo1 !
