
SpiderKeeper does not recognize spiders #87

Closed
Ericliu68 opened this issue Oct 18, 2018 · 13 comments

@Ericliu68

The deploy succeeds, but no spiders show up in the dashboard. scrapyd reports no errors but is still stuck on the previous project, and SpiderKeeper reports no errors either.

@QingGo

QingGo commented Oct 19, 2018

How many projects have you created in total? In my experience, once there are a lot of projects (more than ten), SpiderKeeper takes a very long time to sync state from scrapyd, and can even get stuck and never manage to update it.

@Ericliu68
Author

Ericliu68 commented Oct 19, 2018 via email

@QingGo

QingGo commented Oct 19, 2018

I'm not the author, just a user. As for the scheduling problem, I think it's because scrapyd can't handle two job requests within a short window and SpiderKeeper has no retry mechanism; I opened a PR for that which you can take a look at: https://github.com/DormyMo/SpiderKeeper/pull/85. As for spiders not being recognized, one possibility is that the spider code itself has a problem. You can call scrapyd's list spiders API and check whether it returns the right result. If everything there looks fine, the only thing left is to set SpiderKeeper's log level to debug and go through the log output carefully.
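
For reference, checking scrapyd directly only takes two HTTP calls. A minimal sketch with curl, assuming scrapyd listens on its default localhost:6800 and the project was deployed as myproject (both are placeholders for your setup):

# List the projects scrapyd knows about; the deployed project should show up here.
curl "http://localhost:6800/listprojects.json"

# List the spiders scrapyd can find in that project. An empty list or an error
# here points at the egg/spider code rather than at SpiderKeeper.
curl "http://localhost:6800/listspiders.json?project=myproject"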

@Ericliu68
Author

Ericliu68 commented Oct 19, 2018 via email

@zhangyucha0

zhangyucha0 commented Oct 22, 2018

Right now a failed egg upload gives no error message (check the source: the SUCCESS message is hard-coded in the front end).

One small gotcha: if the server address ends with a /, the upload fails, for example http://localhost:6800/
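
If you start SpiderKeeper from the command line, that means passing the scrapyd address without the trailing slash. A quick sketch of the two forms (localhost:6800 is just scrapyd's default address):

spiderkeeper --server=http://localhost:6800     # works
spiderkeeper --server=http://localhost:6800/    # trailing slash: uploads fail with no feedback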

I fixed the missing upload feedback in my own fork. It also adds support for:

  • Adding servers from the web page
  • Showing server connectivity status in real time

If you're interested, you can clone it and install from source.

@Ericliu68
Author

Ericliu68 commented Oct 22, 2018 via email

@Ericliu68
Author

Ericliu68 commented Oct 22, 2018 via email

@zhangyucha0

> Which branch is it?

master

@Ericliu68
Author

Ericliu68 commented Oct 23, 2018 via email

@luzihang123

SpiderKeeper failing to recognize spiders can also be caused by missing dependency packages in scrapyd's environment; install them and try again.

@peipppp

peipppp commented Apr 30, 2019

I ran into the same problem. It happens when SpiderKeeper is run in the background; when I run it directly in the foreground, the problem goes away.

@yoiyang

yoiyang commented Jul 12, 2019

I've tried running it directly and still hit this problem. My workaround: while SpiderKeeper is running, open a new terminal and, in the same directory as scrapy.cfg, run:
scrapyd-deploy
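
For context, scrapyd-deploy reads its target from the [deploy] section of scrapy.cfg, so that file needs to point at the same scrapyd instance SpiderKeeper talks to. A minimal sketch, with myproject and the URL as placeholders for your own setup:

[settings]
default = myproject.settings

[deploy]
url = http://localhost:6800/
project = myproject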

@Ericliu68
Author

> I've tried running it directly and still hit this problem. My workaround: while SpiderKeeper is running, open a new terminal and, in the same directory as scrapy.cfg, run:
> scrapyd-deploy

Thanks. I now know what the problem is on my end: it's usually something in my own code, like Redis or the database. When uploading there is no prompt at all; scrapyd shows the error, but SpiderKeeper never displays it.
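
As a quick local sanity check for that kind of failure, scrapy's own spider listing surfaces errors in spider code (missing packages, broken module-level Redis or database setup) instead of swallowing them. A sketch, with the path as a placeholder:

cd /path/to/your/project    # the directory containing scrapy.cfg
scrapy list                 # prints the spider names; errors in spider code are shown here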
