API: add scrape endpoint #725
Comments
josecelano added a commit to josecelano/torrust-tracker that referenced this issue on Mar 11, 2024:

The torrents endpoint allows getting a list of torrents by providing the infohashes:

http://127.0.0.1:1212/api/v1/torrents?token=MyAccessToken&info_hash=9c38422213e30bff212b30c360d26f9a02136422&info_hash=2b66980093bc11806fab50cb3cb41835b95a0362

It's like the tracker "scrape" request. The response JSON is the same as the normal torrent list:

```json
[
  {
    "info_hash": "9c38422213e30bff212b30c360d26f9a02136422",
    "seeders": 1,
    "completed": 0,
    "leechers": 0
  },
  {
    "info_hash": "2b66980093bc11806fab50cb3cb41835b95a0362",
    "seeders": 1,
    "completed": 0,
    "leechers": 0
  }
]
```
In the end, I decided to use the same JSON format as the normal torrent list:

```json
[
  {
    "info_hash": "9c38422213e30bff212b30c360d26f9a02136422",
    "seeders": 1,
    "completed": 0,
    "leechers": 0
  },
  {
    "info_hash": "2b66980093bc11806fab50cb3cb41835b95a0362",
    "seeders": 1,
    "completed": 0,
    "leechers": 0
  }
]
```

I've only added a new query parameter `info_hash`.
josecelano added a commit that referenced this issue on Mar 11, 2024:

…list endpoint

d39bfc2 feat: [#725] API. Add scrape filter to torrents endpoint (Jose Celano)
4b24256 chore(deps): add cargo dependency: axum-extra (Jose Celano)

Pull request description:

API endpoint:

http://127.0.0.1:1212/api/v1/torrents?token=MyAccessToken&info_hash=9c38422213e30bff212b30c360d26f9a02136422&info_hash=2b66980093bc11806fab50cb3cb41835b95a0362

Added a new query parameter `info_hash`, which is an array, so you can directly specify the list of torrents you want to get. The JSON result is the same:

```json
[
  {
    "info_hash": "9c38422213e30bff212b30c360d26f9a02136422",
    "seeders": 1,
    "completed": 0,
    "leechers": 0
  },
  {
    "info_hash": "2b66980093bc11806fab50cb3cb41835b95a0362",
    "seeders": 1,
    "completed": 0,
    "leechers": 0
  }
]
```

It contains the torrents' stats.

ACKs for top commit:
josecelano: ACK d39bfc2

Tree-SHA512: 74414200e4d22035cbffd77fa21437551806c671b06a5787211224242627c1f6e95e22d8709f4c7157b1dfeeb1614237eee0fdf9a0dbfb3a41a3876d924c5f96
Relates to: #708
I would like to improve the importation of statistics in the Index.
Currently, the Index gets statistics from a single torrent at a time, using the endpoint to get the torrent info:
/torrent/info-hash
I would like to add a new endpoint to import statistics in batches.
/tracker/scrape?info-hash=3b245504cf5f11bbdbe1201cea6a6bf45aee1bc0,info-hash=...
You can specify up to 74 torrents at the same time.
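Since the limit is 74 infohashes per request, the Index would need to split larger lists into batches and issue one scrape request per batch. A minimal sketch of that batching logic, assuming `&`-joined `info_hash` query parameters as in the merged PR (the function name, URL path, and helper are assumptions, not actual Index code):

```rust
// Sketch: split infohashes into scrape-sized batches and build one URL per
// batch. `build_scrape_urls` is a hypothetical helper, not actual Index code.
const MAX_INFOHASHES_PER_SCRAPE: usize = 74;

fn build_scrape_urls(base: &str, infohashes: &[&str]) -> Vec<String> {
    infohashes
        .chunks(MAX_INFOHASHES_PER_SCRAPE)
        .map(|batch| {
            // One `info_hash=<hex>` pair per torrent in this batch.
            let query: Vec<String> =
                batch.iter().map(|h| format!("info_hash={h}")).collect();
            format!("{base}/api/v1/scrape?{}", query.join("&"))
        })
        .collect()
}

fn main() {
    // 100 dummy 40-hex-char infohashes -> two batches (74 + 26).
    let hashes: Vec<String> = (0..100).map(|i| format!("{i:040x}")).collect();
    let refs: Vec<&str> = hashes.iter().map(String::as_str).collect();
    let urls = build_scrape_urls("http://127.0.0.1:1212", &refs);
    assert_eq!(urls.len(), 2);
}
```

Chunking on the client side keeps each request within the scrape limit while still importing a whole result page of statistics in one or two round trips.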
The normal HTTP tracker response for a scrape request is a bencoded dictionary (convertible to JSON format).

I think we can use a structure that is better suited to JSON, matching the Rust struct, so that it is easier to generate the in-memory representation from the JSON object.
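For reference, a conventional scrape response transcribed to JSON looks roughly like this, following the standard scrape convention of a `files` dictionary keyed by infohash (the exact keys this tracker would return are an assumption here):

```json
{
  "files": {
    "9c38422213e30bff212b30c360d26f9a02136422": {
      "complete": 1,
      "downloaded": 0,
      "incomplete": 0
    },
    "2b66980093bc11806fab50cb3cb41835b95a0362": {
      "complete": 1,
      "downloaded": 0,
      "incomplete": 0
    }
  }
}
```

Here `complete` is the number of seeders, `downloaded` the number of times the torrent has been completed, and `incomplete` the number of leechers.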
This is the proposed JSON format for the new scrape endpoint. It would follow the scrape specification, but in JSON. In Rust, the struct would be:
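A possible shape for that struct, assuming the response follows the scrape convention of a `files` map keyed by infohash (all type and field names here are assumptions, not the actual implementation):

```rust
use std::collections::HashMap;

// Sketch only: in practice these would also derive serde's
// Serialize/Deserialize so the JSON maps directly onto the structs.
// Field names follow the scrape convention: complete = seeders,
// downloaded = times completed, incomplete = leechers.
#[derive(Debug, Clone, PartialEq)]
pub struct ScrapeFile {
    pub complete: u32,
    pub downloaded: u32,
    pub incomplete: u32,
}

#[derive(Debug, Clone, PartialEq)]
pub struct ScrapeResponse {
    // Keyed by the torrent's infohash (40-char hex string).
    pub files: HashMap<String, ScrapeFile>,
}

fn main() {
    let mut files = HashMap::new();
    files.insert(
        "9c38422213e30bff212b30c360d26f9a02136422".to_string(),
        ScrapeFile { complete: 1, downloaded: 0, incomplete: 0 },
    );
    let response = ScrapeResponse { files };
    assert_eq!(response.files.len(), 1);
}
```

Because the struct mirrors the JSON one-to-one, deserializing the endpoint's response into the in-memory representation becomes a single `serde_json::from_str` call rather than a manual mapping step.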
Current implementation in the Index
The current struct used by the Index for the torrent details endpoint:
Notice that we are also importing the peer list, which the Index does not need.
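For context, the torrent details response contains per-torrent stats plus a `peers` array; the `peers` array is the part the Index imports but does not need. A rough sketch (field names beyond the stats shown elsewhere in this issue are assumptions):

```json
{
  "info_hash": "090c6d4fb3a03191c4ef1fda6236ef0efb2d5c10",
  "seeders": 1,
  "completed": 0,
  "leechers": 0,
  "peers": [
    {
      "peer_id": "-qB00000000000000001",
      "peer_addr": "198.51.100.1:17548"
    }
  ]
}
```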
Torrent endpoints
List of torrents:
http://127.0.0.1:1212/api/v1/torrents?token=MyAccessToken
Torrent info (the one currently used by the Index to import statistics):
http://127.0.0.1:1212/api/v1/torrent/090c6d4fb3a03191c4ef1fda6236ef0efb2d5c10?token=MyAccessToken
Extra considerations
This endpoint will be useful even if we decide to import statistics only for the torrents that are being loaded in the views in the Index. See torrust/torrust-index#469 (comment). Because the torrent list page in the Index also shows the seeders and leechers, we need to get statistics for more than one torrent at the same time (one batch for each result page).