
How do I connect SpiderKeeper to scrapyd when they are not on the same machine? #54

Closed
ap1024 opened this issue Dec 29, 2017 · 3 comments

@ap1024

ap1024 commented Dec 29, 2017

SpiderKeeper runs on one machine.
scrapyd may run on another machine, sitting behind an nginx configuration.
How can SpiderKeeper connect to scrapyd with a username and password?

@DormyMo
Owner

DormyMo commented Dec 31, 2017

Use the startup parameter --server.
scrapyd instances that require authentication are not supported yet.
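For reference, pointing SpiderKeeper at a remote scrapyd would look roughly like this (the host below is a placeholder; check spiderkeeper --help for the exact option form on your version):

spiderkeeper --server=http://remote-scrapyd-host:6800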

@ap1024
Author

ap1024 commented Jan 9, 2018

I see that for connecting you wrote your own interface rather than using the scrapyd API.
There is an API wrapper that can connect with a username and password:
def __init__(self, target='http://localhost:6800', auth=None,
             endpoints=None, client=None):
    """
    Instantiates the ScrapydAPI wrapper for use.

    Args:
        target (str): the hostname/port to hit with requests.
        auth (str, str): a 2-item tuple containing user/pass details. Only
            used when client is not passed.
        endpoints: a dictionary of custom endpoints to apply on top of
            the pre-existing defaults.
        client: a pre-instantiated requests-like client. By default, we use
            our own client. Override for your own needs.
    """
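With that wrapper, connecting with credentials would look roughly like this (a minimal sketch assuming the snippet above is from the python-scrapyd-api package; the host and credentials are placeholders):

from scrapyd_api import ScrapydAPI

# Placeholder host and credentials for a scrapyd instance behind nginx basic auth.
scrapyd = ScrapydAPI('http://remote-scrapyd-host:6800', auth=('user', 'password'))

# For example, list the projects deployed on that scrapyd instance.
print(scrapyd.list_projects())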
This version is very good. It's just that, for managing multiple machines, it would be best to be able to set a custom server address and log in with a username and password.
That would make management much easier.
Thanks for your reply.

@ileadall42

This is really nice.

@DormyMo DormyMo closed this as completed Jul 2, 2018