
Cannot log in, KeyError #91

Closed

Joker-zc opened this issue Apr 17, 2018 · 8 comments

Comments

@Joker-zc

I'm running 1.7.2 on deepin, in a virtual environment. The password is correct.

[2018-04-17 21:44:32,720: INFO/MainProcess] Received task: tasks.login.login_task[e8e67e00-f0f2-4131-82a4-bc313dac75de]
[2018-04-17 21:44:32,827: ERROR/ForkPoolWorker-1] Task tasks.login.login_task[e8e67e00-f0f2-4131-82a4-bc313dac75de] raised unexpected: KeyError('showpin',)
Traceback (most recent call last):
  File "/home/lin_gu/Desktop/weibospider-master/.env/lib/python3.5/site-packages/celery/app/trace.py", line 374, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/home/lin_gu/Desktop/weibospider-master/.env/lib/python3.5/site-packages/celery/app/trace.py", line 629, in __protected_call__
    return self.run(*args, **kwargs)
  File "/home/lin_gu/Desktop/weibospider-master/tasks/login.py", line 12, in login_task
    get_session(name, password)
  File "/home/lin_gu/Desktop/weibospider-master/login/login.py", line 228, in get_session
    url, yundama_obj, cid, session = do_login(name, password, proxy)
  File "/home/lin_gu/Desktop/weibospider-master/login/login.py", line 206, in do_login
    if server_data['showpin']:
KeyError: 'showpin'

@ResolveWang
Member

The problem is probably this statement: if server_data['showpin']:

Try replacing it with the following:

if server_data.get('showpin', None):
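The suggested change swaps dict indexing for dict.get, so a missing key yields None instead of raising. A minimal sketch, using a hypothetical server_data payload (the real one is parsed from Weibo's prelogin response and may or may not contain 'showpin'):

```python
# Hypothetical prelogin payload; 'showpin' is absent here on purpose.
server_data = {'retcode': 0, 'nonce': 'ABC123'}

# Original form: raises KeyError because the key is missing.
try:
    need_pin = server_data['showpin']
except KeyError:
    need_pin = None

# Suggested form: dict.get returns the default (None) for a missing key,
# so the truth test simply falls through to the no-pin branch.
need_pin = server_data.get('showpin', None)
print(bool(need_pin))  # False: no verification pin required
```

Note that this only silences the KeyError; if the server actually wants a pin, the response would contain 'showpin' and the original branch would still run.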

@Joker-zc
Author

Fixed, thanks! But now I get another error after running a search…
[2018-04-18 16:20:27,986: INFO/MainProcess] Received task: tasks.search.search_keyword[27f9ef57-4963-4641-883b-f27f866f0f47]
2018-04-18 16:20:27 - crawler - INFO - We are searching keyword "快手"
[2018-04-18 16:20:27,989: INFO/ForkPoolWorker-1] We are searching keyword "快手"
2018-04-18 16:20:27 - crawler - INFO - the crawling url is http://s.weibo.com/weibo/%E5%BF%AB%E6%89%8B&scope=ori&suball=1&page=1
[2018-04-18 16:20:27,990: INFO/ForkPoolWorker-1] the crawling url is http://s.weibo.com/weibo/%E5%BF%AB%E6%89%8B&scope=ori&suball=1&page=1
2018-04-18 16:20:27 - crawler - ERROR - failed to crawl http://s.weibo.com/weibo/%E5%BF%AB%E6%89%8B&scope=ori&suball=1&page=1,here are details:'NoneType' object is not subscriptable, stack is File "/home/lin_gu/Desktop/weibospider-master/decorators/decorators.py", line 17, in time_limit
    return func(*args, **kargs)

[2018-04-18 16:20:27,996: ERROR/ForkPoolWorker-1] failed to crawl http://s.weibo.com/weibo/%E5%BF%AB%E6%89%8B&scope=ori&suball=1&page=1,here are details:'NoneType' object is not subscriptable, stack is File "/home/lin_gu/Desktop/weibospider-master/decorators/decorators.py", line 17, in time_limit
    return func(*args, **kargs)

2018-04-18 16:20:27 - crawler - WARNING - No search result for keyword 快手, the source page is
[2018-04-18 16:20:27,998: WARNING/ForkPoolWorker-1] No search result for keyword 快手, the source page is
[2018-04-18 16:20:27,998: INFO/ForkPoolWorker-1] Task tasks.search.search_keyword[27f9ef57-4963-4641-883b-f27f866f0f47] succeeded in 0.009885783000072479s: None
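For context, "'NoneType' object is not subscriptable" usually means a lookup in the page parser found nothing, returned None, and was then indexed. A hedged sketch of that failure mode (the parser function here is hypothetical, not the project's actual code):

```python
def find_search_results(page_source):
    # Stands in for an HTML lookup that returns None when the page has no
    # result container (e.g. a login page served instead of search results).
    return ['post1', 'post2'] if 'weibo' in page_source else None

results = find_search_results('please log in')
# results[0]  # TypeError: 'NoneType' object is not subscriptable

# Guarded access avoids the crash and lets the caller log a warning instead:
first = results[0] if results else None
print(first)  # None, since the page had no results
```

An empty "source page" in the warning above fits that pattern: without valid cookies the crawler is likely served a login page rather than search results.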

@ResolveWang
Member

ResolveWang commented Apr 19, 2018

Check whether there are cookies in your Redis, then test manually to confirm that your account can be used for search.
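Verifying the store/fetch round trip can be sketched as below. The key naming and JSON serialization are assumptions for illustration (the project's actual Cookies helper decides the real layout), and a plain dict stands in for Redis db 1 so the example runs without a server:

```python
import json

fake_redis = {}  # stand-in for Redis db 1 (the 'cookies' db in the config)

def store_cookies(account, cookie_dict):
    # The real helper serializes per-account cookies into Redis;
    # the 'account:' prefix here is an illustrative assumption.
    fake_redis['account:' + account] = json.dumps(cookie_dict)

def fetch_cookies(account):
    raw = fake_redis.get('account:' + account)
    return json.loads(raw) if raw else None

store_cookies('someone@example.com', {'SUB': 'xxx', 'SUBP': 'yyy'})
print(fetch_cookies('someone@example.com'))
```

If redis-cli shows an empty keyspace right after a successful login, either the store step never ran or it wrote to a different db index than the one being inspected.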

@Joker-zc
Author

There are no cookies in Redis:
(WeiboSpider)lin_gu@ww:~/Desktop/weibospider-master$ ./redis-3.2.9/src/redis-cli
127.0.0.1:6379> auth weibospider
OK
127.0.0.1:6379> keys *
(empty list or set)
127.0.0.1:6379>

Advanced search works fine with the account.
My config is as follows:
redis:
  host: 127.0.0.1
  port: 6379
  password: 'weibospider'
  cookies: 1  # store and fetch cookies
  # store fetched urls and results, so you can decide whether to retry crawling the urls or not
  urls: 2
  broker: 5  # broker for celery
  backend: 6  # backend for celery
  id_name: 8  # user ids and names, for repost info analysis. Can be safely deleted after repost tasks
  # expire_time (hours) for redis db2; if the entries are useless to you, you can set the value smaller
  expire_time: 48
  # redis sentinel for HA. If you need it, just add sentinel host and port below the sentinel args, like this:
  ###############################
  #sentinel:                    #
  #  - host: 2.2.2.2            #
  #    port: 26379              #
  #  - host: 3.3.3.3            #
  #    port: 26379              #
  #                             #
  ###############################
  sentinel: ''
  master: ''  # redis sentinel master name; if you don't need it, just set master: ''
  socket_timeout: 5  # socket timeout for redis sentinel; if you don't need it, just set master: ''

@ResolveWang
Member

Are you sure you're using 1.7.2? The 1.7.2 default config doesn't seem to use this comment style:

###############################
#sentinel:                    #
#  - host: 2.2.2.2            #
#    port: 26379              #
#  - host: 3.3.3.3            #
#    port: 26379              #
#                             #
###############################

How about printing the current cookies just before the return session in login.py's get_session function?

Cookies.store_cookies(name, session.cookies.get_dict())
print(session.cookies.get_dict())   # add this line, then check whether cookies are printed at login
return session

@Joker-zc
Author

Joker-zc commented Apr 19, 2018

After changing it per this issue I can log in and the cookies are printed, but there are still no cookies in Redis.
The comment style must be a copy-paste problem… I copied the file over from another computer via QQ…

@ResolveWang
Member

The 1.7.2 config file doesn't look like this. How about re-downloading the stable release from Releases and running that?

Or, if you know Python, debug redis_db.py and see whether something is wrong there.

@Joker-zc
Author

OK, I'll download a fresh copy and try again. Thanks!
