When running locally, Crawlee treats "memory is overloaded" as "the memory owned by the current process exceeds some configured fraction of total system memory". If our calculation of used memory fails to take something into account, we could easily smother the system.
We should have a safety mechanism that declares the system overloaded whenever overall system memory utilization exceeds something like 95% (counting any processes, not just the current Crawlee process), regardless of CRAWLEE_AVAILABLE_MEMORY_RATIO... shouldn't we?
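A minimal sketch of what such a check could look like (function names and the 95% threshold are illustrative, not Crawlee API; fetching the actual total/available figures would come from a platform facility such as psutil's `virtual_memory()`):

```python
def is_system_memory_overloaded(
    total_bytes: int,
    available_bytes: int,
    threshold: float = 0.95,  # hypothetical system-wide safety limit
) -> bool:
    """Return True if overall system memory utilization (by any processes,
    not just the current one) is at or above the given threshold."""
    used_ratio = 1 - available_bytes / total_bytes
    return used_ratio >= threshold


# Illustrative values: 16 GiB total, 0.5 GiB available -> ~96.9% used
gib = 1024 ** 3
print(is_system_memory_overloaded(16 * gib, gib // 2))  # True
```

The key point is that this check looks at *available* system memory rather than summing memory attributed to our own processes, so it still trips even when our accounting misses something.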
Originally posted by @janbuchar in #1210 (comment)