Number of spiders that can run at the same time #31

Closed
Lving opened this issue Jul 13, 2017 · 2 comments

Comments

Lving commented Jul 13, 2017

Hi, can SpiderKeeper only run 4 spiders at the same time? If I want to run more spider projects, can I change this in the settings?


Lving commented Jul 14, 2017

I have found the solution:

max_proc_per_cpu
The maximum number of concurrent Scrapy processes that will be started per CPU. Defaults to 4.
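For anyone landing here later, a minimal sketch of the relevant scrapyd settings (the file path and the value 10 are just illustrative; adjust to your install). Scrapyd has to be restarted to pick up the change:

```ini
# /etc/scrapyd/scrapyd.conf (path varies by install; a scrapyd.conf in the
# working directory where scrapyd is launched also works)
[scrapyd]
# Hard cap on total concurrent Scrapy processes; 0 means derive it from the CPU count.
max_proc         = 0
# Processes allowed per CPU when max_proc is 0; default is 4.
# Raise this to run more spiders at the same time (10 is only an example).
max_proc_per_cpu = 10
```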


DormyMo commented Jul 25, 2017

Yep, just set it in scrapyd's configuration file.

DormyMo closed this as completed Jul 25, 2017