Calibre Metadata Scan only finds 1 book (Kobo Clara HD) #9016
Comments
The issue doesn't happen with the recursive search for libraries. It happens while parsing the contents of a library. In any case, you can switch to the manual parser by editing this line: https://github.com/koreader/koreader/blob/master/plugins/calibre.koplugin/metadata.lua#L57 Just put a size (in bytes) lower than the size of your file and perform the search again. If you do so, please tell us whether the data you get with it is OK or whether some stuff is broken.
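For clarity, the suggested edit is a one-line change (a sketch; the threshold value here is just an example — anything below your metadata.calibre size will force the manual parser):

```lua
-- plugins/calibre.koplugin/metadata.lua
-- Files larger than this threshold are handled by the manual parser
-- instead of rapidjson. Setting it below the size of your
-- metadata.calibre file forces the manual parser for that file.
local MAX_JSON_FILESIZE = 10 * 1000 * 1000 -- example: 10 MB
```

Remember that KOReader must be restarted after editing the file for the change to take effect.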
https://github.com/grafi-tt/lunajson supports incremental parsing, so using it instead of the manual parser would make it 100% reliable without any OOM issues. Sadly, it is much slower than rapidjson.
Thanks for the quick response! The metadata.calibre file is 38.2MB.
There was no change; the same error occurs. How can I try https://github.com/grafi-tt/lunajson? If you can give me some basic instructions, I will give it a try.
Umm. Then the parser seems broken.
It isn't really a suggestion for you, as the SAX-like API is not implemented. I just pointed to it as a way to finally fix the issue in the future. But you can try the other way around.
Just to be sure: you need to restart the program after editing the file.
Success! For anyone else who comes across this situation, the workaround which worked to enable the calibre metadata search was changing this line from `local MAX_JSON_FILESIZE = 30 * 1000 * 1000` to `local MAX_JSON_FILESIZE = 40 * 1000 * 1000`. Hope this will be helpful to others. PS: This is obviously a "quick and dirty" workaround. Thank you Pazos for your help with this. All I can say is, "koreader.rocks".
This works because your device happens to have enough RAM to deal with this file w/ rapidjson ;o). |
That's hardly unusual though, at least on Kobo devices with 512 MB RAM, give or take however filled the cache is at any given moment. :-) |
Yup, just throwing it out there for poor Kindle users, where this would blow up in fun and hilarious ways ;o). |
@NiLuJe @Frenzie: I'm going to give it a try with https://github.com/grafi-tt/lunajson and replace the whole parser. Since we already have two JSON modules and there's no reason to add a third one, I'm going to add it directly into the plugin, unless you see a reason to make it available from elsewhere.
What concretely is the reason to use that one instead of one of the others? |
It makes it possible to write a set of Lua functions and pass them as callbacks that will be executed while traversing the JSON object. The other ones require the entire JSON object to be in memory (to be more precise, they traverse the whole JSON object, building a big Lua table). The SAX-like API is really useful if: a - we're dealing with huge JSON files. At the end of the day, we're going to use the new module just like we use rapidjson now.
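As a sketch of what the SAX-like approach looks like (the callback table below is hypothetical application code, not the plugin's implementation; the `newfileparser` entry point and callback names follow lunajson's SAX interface, but double-check them against the library's README):

```lua
local lunajson = require("lunajson")

-- Count top-level book records in metadata.calibre without ever
-- materializing the whole file as one big Lua table. Each field of
-- the sax table is a callback fired as the parser streams the input.
local books, depth = 0, 0
local sax = {
    startobject = function() depth = depth + 1 end,
    endobject = function()
        depth = depth - 1
        -- each outermost object in the top-level array is one book
        if depth == 0 then books = books + 1 end
    end,
    startarray = function() end,
    endarray   = function() end,
    key     = function(k) end,
    string  = function(s) end,
    number  = function(n) end,
    boolean = function(b) end,
    null    = function() end,
}

local parser = lunajson.newfileparser("metadata.calibre", sax)
parser.run()
print(("parsed %d book records"):format(books))
```

Memory use stays roughly constant regardless of file size, which is the whole point versus decoding the document into a single table.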
Alright, sounds fine to me as a plugin-specific lib. |
Issue
Have completely switched over to koreader from kobo/nickel and am enjoying the software very much.
One issue is with the search-calibre-metadata functionality (i.e. browse series), which persists despite deleting the driveinfo.calibre and metadata.calibre files and having calibre recreate them. The metadata in calibre exists and is in good order.
When I use the calibre -> search settings -> manage libraries -> rescan disk for calibre libraries option, the resulting message says, "Found 1 calibre library with 1 books: 1: /mnt/onboard".
The actual calibre library (and the device) holds 3261 books.
crash.log (if applicable)
The following crash log is of an actual crash, but repeating the attempt to scan the calibre libraries does not always generate the crash.
crash.log
Thanks for any insight into this issue.