Closed
Description
I have a question about RAM usage.
My Scrapy spider is no longer executing, but I can still see RAM being consumed by Playwright.
What should I do to prevent Playwright from holding on to RAM even though the scraping process is no longer running? Should I find and kill the Playwright processes in the spider_closed event?
I was also getting a lot of Playwright timeouts during scraping. Could I be experiencing a memory leak?
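For reference, here is a minimal sketch of the "kill on spider_closed" idea mentioned above. This is an assumption, not the library's recommended API: it tracks browser child processes you launch yourself (e.g. Chromium spawned by Playwright) and terminates any survivors when the spider closes. The names `track` and `kill_tracked` are hypothetical helpers, not part of scrapy-playwright.

```python
import subprocess

# Hypothetical helpers: remember browser subprocesses so that any
# stragglers can be terminated when the spider shuts down.
_tracked: list[subprocess.Popen] = []

def track(proc: subprocess.Popen) -> None:
    """Remember a child process so it can be reaped later."""
    _tracked.append(proc)

def kill_tracked(timeout: float = 5.0) -> int:
    """Terminate all tracked processes that are still alive.

    Sends SIGTERM first, then SIGKILL to stragglers.
    Returns the number of processes that had to be killed.
    """
    killed = 0
    for proc in _tracked:
        if proc.poll() is None:          # still running
            proc.terminate()             # polite SIGTERM first
            try:
                proc.wait(timeout=timeout)
            except subprocess.TimeoutExpired:
                proc.kill()              # escalate to SIGKILL
                proc.wait()
            killed += 1
    _tracked.clear()
    return killed

# In a Scrapy spider this could be wired to the spider_closed signal,
# e.g. in from_crawler:
#
#     crawler.signals.connect(lambda: kill_tracked(),
#                             signal=scrapy.signals.spider_closed)
```

Note that if the leak comes from pages or contexts that were never closed inside the running browser, killing processes is a workaround; closing each Playwright page/context after use would address the cause.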