When I run scrapy crawl HotSearchSpider, I get builtins.TypeError: expected string or bytes-like object
(VirtualEnv) D:\Documents\HCI\Sec 4\CSC\Research Paper\WeiboSpider-master\WeiboSpider>scrapy crawl HotSearchSpider
Unhandled error in Deferred:
Traceback (most recent call last):
  File "d:\documents\hci\sec 4\csc\research paper\weibospider-master\virtualenv\lib\site-packages\scrapy\crawler.py", line 172, in crawl
    return self._crawl(crawler, *args, **kwargs)
  File "d:\documents\hci\sec 4\csc\research paper\weibospider-master\virtualenv\lib\site-packages\scrapy\crawler.py", line 176, in _crawl
    d = crawler.crawl(*args, **kwargs)
  File "d:\documents\hci\sec 4\csc\research paper\weibospider-master\virtualenv\lib\site-packages\twisted\internet\defer.py", line 1656, in unwindGenerator
    return _cancellableInlineCallbacks(gen)
  File "d:\documents\hci\sec 4\csc\research paper\weibospider-master\virtualenv\lib\site-packages\twisted\internet\defer.py", line 1571, in _cancellableInlineCallbacks
    _inlineCallbacks(None, g, status)
--- <exception caught here> ---
  File "d:\documents\hci\sec 4\csc\research paper\weibospider-master\virtualenv\lib\site-packages\twisted\internet\defer.py", line 1445, in _inlineCallbacks
    result = current_context.run(g.send, result)
  File "d:\documents\hci\sec 4\csc\research paper\weibospider-master\virtualenv\lib\site-packages\scrapy\crawler.py", line 80, in crawl
    self.engine = self._create_engine()
  File "d:\documents\hci\sec 4\csc\research paper\weibospider-master\virtualenv\lib\site-packages\scrapy\crawler.py", line 105, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "d:\documents\hci\sec 4\csc\research paper\weibospider-master\virtualenv\lib\site-packages\scrapy\core\engine.py", line 69, in __init__
    self.downloader = downloader_cls(crawler)
  File "d:\documents\hci\sec 4\csc\research paper\weibospider-master\virtualenv\lib\site-packages\scrapy\core\downloader\__init__.py", line 88, in __init__
    self.middleware = DownloaderMiddlewareManager.from_crawler(crawler)
  File "d:\documents\hci\sec 4\csc\research paper\weibospider-master\virtualenv\lib\site-packages\scrapy\middleware.py", line 53, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "d:\documents\hci\sec 4\csc\research paper\weibospider-master\virtualenv\lib\site-packages\scrapy\middleware.py", line 35, in from_settings
    mw = create_instance(mwcls, settings, crawler)
  File "d:\documents\hci\sec 4\csc\research paper\weibospider-master\virtualenv\lib\site-packages\scrapy\utils\misc.py", line 140, in create_instance
    return objcls.from_crawler(crawler, *args, **kwargs)
  File "D:\Documents\HCI\Sec 4\CSC\Research Paper\WeiboSpider-master\WeiboSpider\WeiboSpider\middlewares.py", line 68, in from_crawler
    ip_num = int(re.findall(r'count=\d+', api)[0][6:])
  File "C:\Users\RJ008\.pyenv\pyenv-win\versions\3.9.0\lib\re.py", line 241, in findall
    return _compile(pattern, flags).findall(string)
builtins.TypeError: expected string or bytes-like object
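The traceback ends in middlewares.py line 68, where re.findall(r'count=\d+', api) is called; re.findall raises exactly this TypeError when its second argument is not a string or bytes, for example when the proxy-API URL setting is missing and api is None. A minimal sketch of what is going on and a defensive check (the parse_ip_num helper name and the None-valued api are my assumptions, not code from the repo):

```python
import re

def parse_ip_num(api):
    # The middleware does roughly: int(re.findall(r'count=\d+', api)[0][6:]).
    # Guard against a missing/None proxy API URL, which would otherwise
    # surface as "TypeError: expected string or bytes-like object".
    if not isinstance(api, str):
        raise ValueError(f"proxy API URL is not configured (got {api!r})")
    matches = re.findall(r'count=\d+', api)
    if not matches:
        raise ValueError("no count= parameter found in proxy API URL")
    # 'count=20' -> '20' -> 20
    return int(matches[0][6:])
```

With a hypothetical URL such as "http://proxy.example/get?count=20" this returns 20, while a None value fails with a readable error instead of the opaque TypeError.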
How can this be fixed? Thank you.
In settings.py, change line 63 from 'WeiboSpider.middlewares.RetryMiddleware': 544, to 'WeiboSpider.middlewares.RetryMiddleware': None, which disables RetryMiddleware. That middleware has a bug; I have been too busy lately to fix it, but I will follow up. You are also welcome to open a PR directly.
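The suggested change can be sketched as the corresponding entry in the DOWNLOADER_MIDDLEWARES dict in settings.py; only the RetryMiddleware line comes from the reply, and any other entries in the real file are assumed to stay as they are:

```python
# settings.py (sketch): in Scrapy, mapping a middleware path to None
# removes it from the middleware chain entirely.
DOWNLOADER_MIDDLEWARES = {
    # was: 'WeiboSpider.middlewares.RetryMiddleware': 544,
    'WeiboSpider.middlewares.RetryMiddleware': None,
}
```

Setting the value to None (rather than deleting the line) is the idiomatic Scrapy way to disable a middleware, because it also overrides any enabled entry inherited from base settings.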