Hi author, can SpiderKeeper only run 4 spiders at the same time? If I want to run more spider projects, can I change this in the settings?
I have found the solution: `max_proc_per_cpu` is the maximum number of concurrent Scrapy processes that will be started per CPU. It defaults to 4.
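For reference, a minimal sketch of the relevant scrapyd.conf entries (SpiderKeeper schedules jobs through scrapyd, so the limit lives there; the value 8 below is only an illustrative example, not a recommendation). Restart scrapyd after editing for the change to take effect.

```ini
# Sketch of a scrapyd.conf override; adjust values to your machine.
[scrapyd]
# Upper bound on concurrent Scrapy processes started per CPU (scrapyd default: 4).
max_proc_per_cpu = 8
# Absolute cap on concurrent processes; 0 means no fixed cap, so the
# effective limit is max_proc_per_cpu * number of CPUs (the scrapyd default).
max_proc = 0
```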
Yes, just edit scrapyd's configuration file.
Merge pull request DormyMo#31 from c-lorand/2020-01-29_log-files (8996935): Changed the way we open log files