CPU is completely utilized #34

When I want to download a complete playlist, my server's CPU is completely maxed out. I have 4 cores assigned to my VM and they are almost always all at 100% utilization. So far I have only tested the Spotify playlist download. I also already turned down the parallel downloads in the config, and the result was the same.

Comments
Hi @MoroseCorpse, this is probably the fault of the encoder. You should probably turn down the number of concurrent tasks for the encoder. By default (as at 6d8a90b), the encoder runs up to 6 concurrent tasks (see the `concurrency` defaults below). You can override this with the `-z` flag:

$ freyr -z encoder=1 <QUERY>

Or modify the `concurrency` section of the config file:

```diff
{
  ...
  "concurrency": {
    "queries": 1,
    "tracks": 1,
    "trackStage": 6,
    "downloader": 4,
-   "encoder": 6,
+   "encoder": 1,
    "embedder": 10
  },
  ...
}
```
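If it helps, here is one way to apply that change from the shell, as a minimal sketch. The config path below is an assumption (point it at wherever your freyr `conf.json` actually lives), and it requires `jq` to be installed:

```sh
# Hypothetical location of freyr's config; adjust to your actual conf.json.
CONF="$HOME/freyr-js/conf.json"
# Set encoder concurrency to 1; jq can't edit in place, so write via a temp file.
jq '.concurrency.encoder = 1' "$CONF" > "$CONF.tmp" && mv "$CONF.tmp" "$CONF"
```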
I have changed the `encoder` setting.
Can you inspect this with `htop`?
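For reference, if `htop` isn't available, a plain `ps` one-liner can show much the same thing (assuming a GNU userland):

```sh
# List the ten processes using the most CPU, highest first.
ps -eo pid,pcpu,comm --sort=-pcpu | head -n 10
```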
Clearly, the culprit seems to be the track-processing stage. You can turn concurrency down on that with the `trackStage` spec.
If you want to combine these concurrency specs in the CLI flags, you can merge them like this:

$ freyr -z "encoder=1,trackStage=1" <QUERY>

or like this (as documented here):

$ freyr -z encoder=1 -z trackStage=1 <QUERY>
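For a persistent equivalent, you could pin both keys in the config instead of passing flags on every run. This is a sketch assuming the same `concurrency` block shown earlier, with only `trackStage` and `encoder` lowered to 1:

```json
"concurrency": {
  "queries": 1,
  "tracks": 1,
  "trackStage": 1,
  "downloader": 4,
  "encoder": 1,
  "embedder": 10
}
```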
Okay, thanks, that helped a lot. Utilization is now "only" at 50%, but individual cores still spike to 100%. If I give my VM only one core, it is completely overloaded. For my server this may not matter much, but I would like to integrate your tool into a project of mine that should also run on weaker servers. I have now set everything to 1, and one core is still at 100% utilization:
That seems quite problematic. Please include a screenshot of `htop`.
Well, from that, it's clear that the brunt of the work is coming solely from the encoder (`ffmpeg`). That seems justified, although heavy. Maybe #21 can help with that.
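One mitigation worth trying on constrained servers (an assumption on my part, not advice from the thread): run freyr under `nice` so the encoder yields the CPU to other workloads. This won't lower peak usage on an otherwise idle core, but it keeps the machine responsive while encoding:

```sh
# Run at the lowest scheduling priority so other processes preempt the encoder.
nice -n 19 freyr -z "encoder=1,trackStage=1" <QUERY>
```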
Closing this for now, seeing as the issue isn't from freyr itself. |