test_integration_downloader_aware_priority_queue raises exception #4644

Closed
Lukas0907 opened this issue Jun 23, 2020 · 0 comments · Fixed by #4645
Lukas0907 commented Jun 23, 2020

Description

The test tests/test_scheduler.py::TestIntegrationWithDownloaderAwareInMemory::test_integration_downloader_aware_priority_queue raises the following exception:

ERROR    scrapy.core.engine:engine.py:308 Stats close failure
Traceback (most recent call last):
  File "/home/lukas/Projects/3rd/scrapy/.tox/py/lib/python3.8/site-packages/twisted/internet/defer.py", line 654, in _runCallbacks
    current.result = callback(current.result, *args, **kw)
  File "/home/lukas/Projects/3rd/scrapy/scrapy/core/engine.py", line 328, in <lambda>
    dfd.addBoth(lambda _: self.crawler.stats.close_spider(spider, reason=reason))
  File "/home/lukas/Projects/3rd/scrapy/scrapy/statscollectors.py", line 48, in close_spider
    self._persist_stats(self._stats, spider)
  File "/home/lukas/Projects/3rd/scrapy/scrapy/statscollectors.py", line 61, in _persist_stats
    self.spider_stats[spider.name] = stats
TypeError: unhashable type: 'list'
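
The root cause is visible in the last traceback frame: `MemoryStatsCollector._persist_stats` stores stats in a plain dict keyed by `spider.name`, so any unhashable name (here, a list) raises `TypeError`. A minimal sketch of the failure mode, assuming a simplified stand-in for the collector (this is not Scrapy's actual code):

```python
# Stand-in for MemoryStatsCollector.spider_stats: per-spider stats keyed
# by spider.name. dict keys must be hashable; a list is not.
spider_stats = {}

def persist_stats(stats, spider_name):
    # Mirrors statscollectors.py line 61: self.spider_stats[spider.name] = stats
    spider_stats[spider_name] = stats

persist_stats({"item_scraped_count": 1}, "quotes")  # fine: str is hashable

try:
    persist_stats({"item_scraped_count": 1}, ["quotes"])  # list used as a name
except TypeError as exc:
    print(exc)  # → unhashable type: 'list'
```

So the test was presumably constructing a spider whose `name` is not a string; storing the stats under that name then fails at spider close time.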

The exception is only logged by scrapy.core.engine, so pytest never sees it and the test still passes. I discovered it accidentally when using the log_cli = true pytest setting.

Steps to Reproduce

  1. Set log_cli = true in pytest.ini
  2. Run tox -e py -- tests/test_scheduler.py -k test_integration_downloader_aware_priority_queue
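
For step 1, the pytest.ini fragment would look like this (a sketch; the surrounding section header is whatever the project's pytest.ini already uses, typically [pytest]):

```ini
[pytest]
; Stream log records to the terminal during the test run, which is what
; surfaces the otherwise-swallowed "Stats close failure" error above.
log_cli = true
```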

Versions

Scrapy 2.1.0 / master

Additional context

The issue should be simple to fix; I will prepare a PR for it.
