Commit
commit every 200 scan results to the database; an error would make the scan of big libraries impossible
7d2a204
#419
7d2a204
If you commit, you need to begin a transaction again. As it stands, this just kills performance and throws an exception when the next commit is due.
Also, 200 seems way too careful: psy-q had a misconfigured server that used 128 MB instead of 512 MB RAM (the NC minimum) and still got into the high thousands of tracks. I'll do some tests with 128 MB and let you know a safe threshold.
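A minimal sketch of the pattern being discussed, using Python and sqlite3 as a stand-in for the app's actual PHP/Nextcloud database layer (the table name, `BATCH_SIZE`, and `scan_library` helper are illustrative assumptions, not the project's code):

```python
import sqlite3

BATCH_SIZE = 200  # threshold under discussion; the safe value was still being tested

# isolation_level=None puts sqlite3 in autocommit mode, so transactions
# are controlled explicitly with BEGIN/COMMIT.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE tracks (path TEXT)")

def scan_library(paths):
    conn.execute("BEGIN")
    for i, path in enumerate(paths, start=1):
        conn.execute("INSERT INTO tracks VALUES (?)", (path,))
        if i % BATCH_SIZE == 0:
            conn.execute("COMMIT")
            # Without this BEGIN, the next COMMIT raises
            # "cannot commit - no transaction is active".
            conn.execute("BEGIN")
    conn.execute("COMMIT")  # flush the final partial batch

scan_library(f"/music/track{n}.mp3" for n in range(500))
```

The trade-off raised in this thread: each intermediate COMMIT/BEGIN pair costs performance, but a crash mid-scan then loses at most one batch instead of the whole run.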
7d2a204
OK, I will fix my mistake. It worked for me, but that might be related to SQLite on the test NC.
I think it's not only a memory threshold issue. I had several cases where users had faulty files that could not be caught by getID3, or PHP timeouts, or issues with SMB. The workaround was "do it several times": with intermediate commits, a rescan picks up roughly where the failed one left off.
Why do you think e.g. 200 is bad?
For a 500-track library it reduces the DB commits to 3, compared to 1 with your suggestion and ~600 with my old code.
Or am I missing something?
Your suggestion with the commit was perfect...
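The commit-count arithmetic behind those numbers, for reference (figures from this thread; the helper name is an illustrative assumption):

```python
import math

def commits_needed(tracks, batch_size):
    # one commit per full batch, plus a final commit for any remainder
    return math.ceil(tracks / batch_size)

commits_needed(500, 200)  # -> 3 (after tracks 200, 400, and the final 100)
commits_needed(500, 500)  # -> 1, i.e. a single transaction for the whole scan
```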
7d2a204
2 more ideas:
Completely without intermediate commits I somehow don't feel comfortable