
Breaks with Scrapy 2.11 #152

Closed
runa opened this issue Sep 18, 2023 · 4 comments · Fixed by #153

Comments


runa commented Sep 18, 2023

After upgrading to Scrapy 2.11 (from 2.10), my scrapyrt API stopped working:

2023-09-18 21:50:50+0000 [scrapyrt] Unhandled Error
	Traceback (most recent call last):
	  File "/layers/google.python.pip/pip/lib/python3.11/site-packages/scrapy/crawler.py", line 265, in crawl
	    return self._crawl(crawler, *args, **kwargs)
	  File "/layers/google.python.pip/pip/lib/python3.11/site-packages/scrapy/crawler.py", line 269, in _crawl
	    d = crawler.crawl(*args, **kwargs)
	  File "/layers/google.python.pip/pip/lib/python3.11/site-packages/twisted/internet/defer.py", line 1947, in unwindGenerator
	    return _cancellableInlineCallbacks(gen)
	  File "/layers/google.python.pip/pip/lib/python3.11/site-packages/twisted/internet/defer.py", line 1857, in _cancellableInlineCallbacks
	    _inlineCallbacks(None, gen, status, _copy_context())
	--- <exception caught here> ---
	  File "/layers/google.python.pip/pip/lib/python3.11/site-packages/twisted/internet/defer.py", line 1697, in _inlineCallbacks
	    result = context.run(gen.send, result)
	  File "/layers/google.python.pip/pip/lib/python3.11/site-packages/scrapyrt/core.py", line 42, in crawl
	    self.engine = self._create_engine()
	  File "/layers/google.python.pip/pip/lib/python3.11/site-packages/scrapy/crawler.py", line 172, in _create_engine
	    return ExecutionEngine(self, lambda _: self.stop())
	  File "/layers/google.python.pip/pip/lib/python3.11/site-packages/scrapy/core/engine.py", line 89, in __init__
	    assert crawler.logformatter
	builtins.AssertionError:
	
@michaelbrunnbauer

Same here.

pawelmhm (Member) commented Sep 19, 2023

Thanks for the report. There are some backward-incompatible changes in the Scrapy Crawler object: https://docs.scrapy.org/en/latest/news.html#backward-incompatible-changes. Will check that.
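The changelog linked above explains the assertion: in Scrapy 2.11, most Crawler initialization (including setting `crawler.logformatter`) was moved out of `Crawler.__init__` into the private `Crawler._apply_settings()` method, which `Crawler.crawl()` normally invokes. Code like scrapyrt's `core.py`, which calls the private `crawler._create_engine()` directly, now hits `assert crawler.logformatter` in `ExecutionEngine.__init__`. A minimal sketch of a version-tolerant workaround (this is an assumption about the fix, not the actual diff from PR #153; `create_engine_compat` is a hypothetical helper name):

```python
def create_engine_compat(crawler):
    """Create an ExecutionEngine on both Scrapy 2.10 and 2.11+.

    Scrapy 2.11 defers crawler setup (logformatter, stats, etc.) to the
    private Crawler._apply_settings(), normally called by Crawler.crawl().
    Callers that build the engine directly must trigger that setup first,
    or ExecutionEngine.__init__ fails its `assert crawler.logformatter`.
    """
    apply_settings = getattr(crawler, "_apply_settings", None)
    if apply_settings is not None:  # attribute exists from Scrapy 2.11 on
        apply_settings()            # populates logformatter, stats, etc.
    return crawler._create_engine()
```

On Scrapy 2.10 and earlier the `getattr` probe simply finds nothing and the old direct call is used unchanged; note this still leans on private Scrapy internals, so it can break again in future releases.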

runa (Author) commented Sep 19, 2023 via email

pawelmhm (Member) commented
Released to PyPI.

3 participants