manual import of hashes fails #10

Closed

emhl opened this issue Oct 5, 2023 · 8 comments

emhl commented Oct 5, 2023

Thanks for creating this awesome tool.
I've tried to run your example script to import the BitTorrent hashes from rarbg, but curl always fails with the following output:

$ sqlite3 -json -batch rarbg_db.sqlite "$(cat rarbg-import.sql)"   | jq -r --indent 0 '.[] | . * { source: "rarbg" } | . + if .imdb != null then { externalIds: { imdb: .imdb } } else {} end | del(.imdb) | del(..|nulls)'   | curl --verbose -H "Content-Type: application/json" --data-binary @- http://localhost:3333/import
*   Trying 127.0.0.1:3333...
* Connected to localhost (127.0.0.1) port 3333 (#0)
> POST /import HTTP/1.1
> Host: localhost:3333
> User-Agent: curl/7.81.0
> Accept: */*
> Content-Type: application/json
> Content-Length: 269567440
> Expect: 100-continue
>
* Mark bundle as not supporting multiuse
< HTTP/1.1 100 Continue
* Send failure: Broken pipe
* Closing connection 0
curl: (55) Send failure: Broken pipe

bitmagnet shows the following logs before restarting:

github.com/bitmagnet-io/bitmagnet/internal/importer.(*activeImport).flushLocked(0xc001923c80)
	/build/internal/importer/importer.go:187 +0x37
github.com/bitmagnet-io/bitmagnet/internal/importer.(*activeImport).buffer(_, {{0xc0014fe800, 0x5}, {0x0, 0x5, 0x7a, 0xef, 0x5b, 0x56, 0x3b, ...}, ...})
	/build/internal/importer/importer.go:173 +0x185
created by github.com/bitmagnet-io/bitmagnet/internal/importer.(*activeImport).run.func1 in goroutine 23879
	/build/internal/importer/importer.go:158 +0x245

The JSON data (with the hashes redacted) that gets generated before being piped into curl seems reasonable:

{"infoHash":"0000000000000000000000000000000000000000","name":"Daybreak.1993.1080p.AMZN.WEBRip.DDP2.0.x264-BTW","size":5681184768,"contentType":"movie","publishedAt":"2019-10-11T16:32:58.000Z","source":"rarbg","externalIds":{"imdb":"tt0106676"}}
{"infoHash":"0000000000000000000000000000000000000000","name":"Dances.with.Wolves.1990.DC.20th.Anniversary.Edition.2.Discs.1080p.BluRay.AVC.DTS-HD.MA.7.1-FGT","size":62604181504,"contentType":"movie","videoSource":"BluRay","videoModifier":"BRDISK","publishedAt":"2015-05-24T00:35:37.000Z","source":"rarbg"}
{"infoHash":"0000000000000000000000000000000000000000","name":"American.Dad.S19E22.The.Grounch.1080p.DSNP.WEBRip.DDP5.1.x264-NTb[rartv]","size":610271232,"contentType":"tv_show","publishedAt":"2022-12-27T16:33:11.000Z","source":"rarbg","externalIds":{"imdb":"tt0397306"}}
{"infoHash":"0000000000000000000000000000000000000000","name":"Baldurs.Gate.3.v58649-GOG","size":97609842688,"contentType":"game","publishedAt":"2022-11-06T17:46:49.000Z","source":"rarbg"}
{"infoHash":"0000000000000000000000000000000000000000","name":"Operation.Mekong.2016.CHINESE.BRRip.XviD.MP3-VXT","size":1669332992,"contentType":"movie","videoCodec":"XviD","publishedAt":"2019-09-15T01:29:23.000Z","source":"rarbg","externalIds":{"imdb":"tt6044910"}}

Manually sending a request with a single JSON object returns a success message, but the data doesn't seem to get added (it's unavailable in the web dashboard and there are no logs showing that anything is happening):

$ head -n1 rarbg_db.json | curl --verbose -H "Content-Type: application/json" --data-binary @- http://localhost:3333/import
*   Trying 127.0.0.1:3333...
* Connected to localhost (127.0.0.1) port 3333 (#0)
> POST /import HTTP/1.1
> Host: localhost:3333
> User-Agent: curl/7.81.0
> Accept: */*
> Content-Type: application/json
> Content-Length: 198
>
* Mark bundle as not supporting multiuse
< HTTP/1.1 200 OK
< Vary: Origin
< Date: Thu, 05 Oct 2023 14:09:58 GMT
< Content-Length: 17
< Content-Type: text/plain; charset=utf-8
<
1 items imported
* Connection #0 to host localhost left intact
@mgdigital (Collaborator)

Thanks, let me look into this and get back to you. I haven't tried the importer in a little while and it definitely needs more work anyway....

mgdigital mentioned this issue Oct 5, 2023
@mgdigital (Collaborator)

This should be fixed in the latest release, could you give it another try?

k4rli commented Oct 5, 2023

Perfect, thanks for the quick fix. I had the same issue, and it works with the latest update.

❯ sqlite3 -json -batch rarbg_db.sqlite "$(cat rarbg-import.sql)" \
  | jq -r --indent 0 '.[] | . * { source: "rarbg" } | . + if .imdb != null then { externalIds: { imdb: .imdb } } else {} end | del(.imdb) | del(..|nulls)' \
  | curl --verbose -H "Content-Type: application/json" --data-binary @- http://localhost:3333/import

*   Trying 127.0.0.1:3333...
* Connected to localhost (127.0.0.1) port 3333 (#0)
> POST /import HTTP/1.1
> Host: localhost:3333
> User-Agent: curl/7.88.1
> Accept: */*
> Content-Type: application/json
> Content-Length: 269567440
> Expect: 100-continue
> 
< HTTP/1.1 100 Continue
< HTTP/1.1 200 OK
< Vary: Origin
< Date: Thu, 05 Oct 2023 16:46:48 GMT
< Content-Type: text/plain; charset=utf-8
< Connection: close
< Transfer-Encoding: chunked
< 
1000 items imported
2000 items imported
3000 items imported
4000 items imported
5000 items imported
6000 items imported
7000 items imported
8000 items imported
....
497000 items imported
498000 items imported
499000 items imported
500000 items imported
....

@mgdigital (Collaborator)

Good to hear, thanks! Closing this now....

bonny1992 commented Nov 7, 2023

Hey, sorry to resurrect this issue, but I have a similar problem.
I followed the example as well, but I get this:

ERROR   httpserver/httpserver.go:52     error adding item       {"err": "encoding/hex: invalid byte: U+006D 'm'"}

I tried with multiple lines and with a single one.
Am I doing something wrong?

I am stupid, sorry.
I had edited the SQLite database when I got it and added the "magnet" part to the infohashes, and didn't remember that until I tried a single one without it.
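
For anyone hitting the same "encoding/hex: invalid byte" error: the importer expects infoHash to be the bare 40-character hex digest (as the zeroed examples above suggest), not a magnet URI. A minimal sketch of stripping the prefix with jq before posting, assuming the JSON lines carry the full magnet string in the infoHash field (the file name and the exact magnet format are assumptions; adjust to your own export):

$ jq -c '.infoHash |= (capture("btih:(?<h>[0-9A-Fa-f]{40})").h // .)' rarbg_db.json \
  | curl -H "Content-Type: application/json" --data-binary @- http://localhost:3333/import

The capture expression pulls the 40 hex digits out of a magnet:?xt=urn:btih:... string and leaves already-bare hashes untouched.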

@mgdigital (Collaborator)

Hi @bonny1992, this is working for me using 0.1.0. If you could redact any copyrighted material and post a line of JSON here, I'll see if anything looks obviously wrong - please do not post any complete info hashes; just removing 1 character is fine.

@bonny1992

> Hi @bonny1992, this is working for me using 0.1.0. If you could redact any copyrighted material and post a line of JSON here, I'll see if anything looks obviously wrong - please do not post any complete info hashes; just removing 1 character is fine.

No, I'm sorry, you're right. It's working fine.
I've edited my message (didn't expect you to reply this fast!): I had the magnet:... part inside the infohash column, so it was failing because bitmagnet wasn't expecting a string that long (and one that isn't valid hexadecimal).

BTW, sorry to derail this issue, but is there any place where I can find the expected fields for an import? There's a very old CSV I'd like to import (from an Italian tracker that closed some years ago), and it certainly doesn't have any IMDb IDs :)

@mgdigital (Collaborator)

> is there any place where I can find the expected fields for an import?

@bonny1992 I still need to add a proper schema for the import. You can see the data structure here: https://github.com/bitmagnet-io/bitmagnet/blob/main/internal/importer/importer.go#L44 - at a minimum you'll need source, infoHash, name, size, publishedAt - the rest are optional.
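
For illustration only, a minimal import line using just those required fields might look like the sketch below (all values are placeholders - the source name, hash, size, and timestamp are made up - and the format mirrors the rarbg examples earlier in this thread):

{"source":"examplesource","infoHash":"0000000000000000000000000000000000000000","name":"Example.Torrent.Name","size":123456789,"publishedAt":"2020-01-01T00:00:00.000Z"}

It can be POSTed the same way as above: one JSON object per line, sent with Content-Type: application/json to the /import endpoint.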
