
Avoid exceptions in is_generator_with_return_value #4935

Conversation

elacuesta (Member)

Closes #4477

This implements @mdaniel's suggestion from #4477 (comment). The function also catches SyntaxError, in order to prevent further errors like this one in the future.
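As context for reviewers, the kind of check being discussed can be sketched roughly like this (a hypothetical reconstruction, not Scrapy's actual `scrapy/utils/misc.py` code): parse the callable's source into an AST, look for `return` statements carrying a value, and treat any parse failure as inconclusive rather than letting it propagate.

```python
import ast
import inspect


def source_returns_value(src):
    """Return True if the function source contains 'return <value>'
    with a value other than None. Illustrative sketch only; names and
    details are not Scrapy's actual implementation."""
    try:
        tree = ast.parse(src)
    except SyntaxError:  # IndentationError is a subclass
        # inspect.getsource() returns method sources still indented;
        # wrapping them under a dummy 'if' makes them parse without a
        # textual dedent (which flush-left docstring lines defeat).
        tree = ast.parse("if True:\n" + src)
    for node in ast.walk(tree):  # simplified: also visits nested defs
        if isinstance(node, ast.Return) and node.value is not None:
            if isinstance(node.value, ast.Constant) and node.value.value is None:
                continue  # an explicit 'return None' carries no value
            return True
    return False


def is_generator_with_return_value(callable_):
    if not inspect.isgeneratorfunction(callable_):
        return False
    try:
        return source_returns_value(inspect.getsource(callable_))
    except (OSError, SyntaxError):
        # Source unavailable or unparseable: report "no" instead of
        # crashing, since the check is best-effort anyway.
        return False
```

The key point matching this PR's intent is the broad `except` around source retrieval and parsing: the check is advisory, so a failure to analyze the source should degrade to "inconclusive" rather than break the spider.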

codecov bot commented Jan 5, 2021

Codecov Report

Merging #4935 (5902cd2) into master (0dad0fc) will increase coverage by 0.04%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master    #4935      +/-   ##
==========================================
+ Coverage   88.15%   88.19%   +0.04%     
==========================================
  Files         162      162              
  Lines       10284    10288       +4     
  Branches     1499     1499              
==========================================
+ Hits         9066     9074       +8     
+ Misses        947      945       -2     
+ Partials      271      269       -2     
Impacted Files                        Coverage Δ
scrapy/utils/misc.py                  97.72% <100.00%> (+1.63%) ⬆️
scrapy/core/downloader/__init__.py    92.48% <0.00%> (+1.50%) ⬆️

@Gallaecio Gallaecio added this to the 2.5 milestone Feb 21, 2021
elacuesta (Member, Author)

Full diff coverage reached 🙌

Gallaecio (Member)

Suggestion for a message to log as a warning upon finding a SyntaxError:

Unable to determine whether or not <callable> is a generator with a return value. This will not prevent your code from working, but it prevents Scrapy from detecting potential issues in your implementation of <callable>. Please, report this in the Scrapy issue tracker (https://github.com/scrapy/scrapy/issues), including the code of <callable>.
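For concreteness, a hypothetical helper emitting that message (with the `<callable>` placeholder filled in by the callable's qualified name) might look like the following; the exact wording follows the suggestion above, but the function name and placement are illustrative:

```python
import warnings


def warn_check_inconclusive(callable_name):
    # Hypothetical helper; wording follows the message suggested
    # above, with <callable> replaced by the callable's name.
    warnings.warn(
        f'Unable to determine whether or not "{callable_name}" is a generator '
        'with a return value. This will not prevent your code from working, '
        'but it prevents Scrapy from detecting potential issues in your '
        f'implementation of "{callable_name}". Please, report this in the '
        'Scrapy issue tracker (https://github.com/scrapy/scrapy/issues), '
        f'including the code of "{callable_name}".'
    )
```

A caller would build the name from the spider class and method, e.g. `warn_check_inconclusive("MySpider.parse")`.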

@Gallaecio Gallaecio closed this Mar 12, 2021
@Gallaecio Gallaecio reopened this Mar 12, 2021
@elacuesta elacuesta closed this Mar 15, 2021
@elacuesta elacuesta reopened this Mar 15, 2021
@Gallaecio Gallaecio closed this Mar 16, 2021
@Gallaecio Gallaecio reopened this Mar 16, 2021
@elacuesta elacuesta closed this Mar 18, 2021
@elacuesta elacuesta reopened this Mar 18, 2021
@Gallaecio Gallaecio closed this Mar 19, 2021
@Gallaecio Gallaecio reopened this Mar 19, 2021
elacuesta (Member, Author) commented Mar 19, 2021

5902cd2 passes CI, but I don't understand why this seemingly harmless change breaks 🤷‍♀️ :

diff --git scrapy/utils/misc.py scrapy/utils/misc.py
index 5c986eed..8e97298b 100644
--- scrapy/utils/misc.py
+++ scrapy/utils/misc.py
@@ -242,6 +242,7 @@ def warn_on_generator_with_return_value(spider, callable):
     Logs a warning if a callable is a generator function and includes
     a 'return' statement with a value different than None
     """
+    callable_name = spider.__class__.__name__ + "." + callable.__name__
     try:
         if is_generator_with_return_value(callable):
             warnings.warn(
@@ -253,7 +254,6 @@ def warn_on_generator_with_return_value(spider, callable):
                 stacklevel=2,
             )
     except IndentationError:
-        callable_name = spider.__class__.__name__ + "." + callable.__name__
         warnings.warn(
             f'Unable to determine whether or not "{callable_name}" is a generator with a return value. '
             'This will not prevent your code from working, but it prevents Scrapy from detecting '

Any explanation is most welcome!

(edit)

Here's the full error log, in case anyone is interested
$ tox -e py38 -- tests/test_crawl.py                                       
GLOB sdist-make: /Users/eus/zyte/scrapy/setup.py
py38 inst-nodeps: /Users/eus/zyte/scrapy/.tox/.tmp/package/1/Scrapy-2.4.1.zip
py38 installed: apipkg==1.5,appnope==0.1.2,attrs==20.3.0,Automat==20.2.0,backcall==0.2.0,blessings==1.7,blinker==1.4,botocore==1.20.32,bpython==0.21,Brotli==1.0.9,brotlipy==0.7.0,certifi==2020.12.5,cffi==1.14.5,chardet==4.0.0,click==7.1.2,constantly==15.1.0,coverage==5.5,cryptography==2.9.2,cssselect==1.1.0,curtsies==0.3.5,cwcwidth==0.1.4,decorator==4.4.2,execnet==1.8.0,Flask==1.1.2,greenlet==1.0.0,h11==0.12.0,h2==3.2.0,hpack==3.0.0,hyperframe==5.2.0,hyperlink==21.0.0,idna==2.10,incremental==21.3.0,iniconfig==1.1.1,ipython==7.21.0,ipython-genutils==0.2.0,itemadapter==0.2.0,itemloaders==1.0.4,itsdangerous==1.1.0,jedi==0.18.0,Jinja2==2.11.3,jmespath==0.10.0,kaitaistruct==0.8,ldap3==2.7,lxml==4.6.2,MarkupSafe==1.1.1,mitmproxy==5.2,packaging==20.9,parsel==1.6.0,parso==0.8.1,passlib==1.7.4,pexpect==4.8.0,pickleshare==0.7.5,Pillow==8.1.2,pluggy==0.13.1,priority==1.3.0,prompt-toolkit==3.0.17,Protego==0.1.16,protobuf==3.11.3,ptyprocess==0.7.0,publicsuffix2==2.20191221,py==1.10.0,pyasn1==0.4.8,pyasn1-modules==0.2.8,pycparser==2.20,PyDispatcher==2.0.5,pyftpdlib==1.5.6,Pygments==2.8.1,PyHamcrest==2.0.2,pyOpenSSL==19.1.0,pyparsing==2.4.7,pyperclip==1.8.2,pytest==6.2.2,pytest-cov==2.11.1,pytest-forked==1.3.0,pytest-xdist==2.2.1,python-dateutil==2.8.1,pyxdg==0.27,queuelib==1.5.0,requests==2.25.1,ruamel.yaml==0.16.13,ruamel.yaml.clib==0.2.2,Scrapy @ file:///Users/eus/zyte/scrapy/.tox/.tmp/package/1/Scrapy-2.4.1.zip,service-identity==18.1.0,six==1.15.0,sortedcontainers==2.1.0,sybil==2.0.1,testfixtures==6.17.1,toml==0.10.2,tornado==6.1,traitlets==5.0.5,Twisted==20.3.0,urllib3==1.26.4,urwid==2.0.1,uvloop==0.15.2,w3lib==1.22.0,wcwidth==0.2.5,Werkzeug==1.0.1,wsproto==0.15.0,zope.interface==5.2.0,zstandard==0.13.0
py38 run-test-pre: PYTHONHASHSEED='3603536225'
py38 run-test: commands[0] | py.test --cov=scrapy --cov-report=xml --cov-report= tests/test_crawl.py
=========================================================================================== test session starts ============================================================================================
platform darwin -- Python 3.8.6, pytest-6.2.2, py-1.10.0, pluggy-0.13.1
cachedir: .tox/py38/.pytest_cache
rootdir: /Users/eus/zyte/scrapy, configfile: pytest.ini
plugins: cov-2.11.1, xdist-2.2.1, forked-1.3.0
collected 40 items                                                                                                                                                                                         

tests/test_crawl.py ..........FEFEFEFE.......sssssss..........x.                                                                                                                                     [100%]

================================================================================================== ERRORS ==================================================================================================
________________________________________________________________________ ERROR at teardown of CrawlTestCase.test_retry_conn_aborted ________________________________________________________________________
'NoneType' object is not iterable

During handling of the above exception, another exception occurred:
NOTE: Incompatible Exception Representation, displaying natively:

twisted.trial.util.DirtyReactorAggregateError: Reactor was unclean.
DelayedCalls: (set twisted.internet.base.DelayedCall.debug = True to debug)
<DelayedCall 0x7fd2aa208730 [4.985111236572266s] called=0 cancelled=0 LoopingCall<5>(CallLaterOnce.schedule, *(), **{})()>
<DelayedCall 0x7fd2aa20d340 [59.98607516288757s] called=0 cancelled=0 LoopingCall<60.0>(MemoryUsage.update, *(), **{})()>
<DelayedCall 0x7fd2aa234100 [59.98333215713501s] called=0 cancelled=0 LoopingCall<60>(Downloader._slot_gc, *(), **{})()>
<DelayedCall 0x7fd2aa208ca0 [59.98459720611572s] called=0 cancelled=0 LoopingCall<60.0>(LogStats.log, *(<SimpleSpider 'simple' at 0x7fd2aa21e2e0>,), **{})()>
------------------------------------------------------------------------------------------- Captured stderr call -------------------------------------------------------------------------------------------
Coverage.py warning: --include is ignored because --source is set (include-ignored)
Coverage.py warning: --include is ignored because --source is set (include-ignored)
________________________________________________________________________ ERROR at teardown of CrawlTestCase.test_retry_conn_failed _________________________________________________________________________
'NoneType' object is not iterable

During handling of the above exception, another exception occurred:
NOTE: Incompatible Exception Representation, displaying natively:

twisted.trial.util.DirtyReactorAggregateError: Reactor was unclean.
DelayedCalls: (set twisted.internet.base.DelayedCall.debug = True to debug)
<DelayedCall 0x7fd2aa264970 [4.993923187255859s] called=0 cancelled=0 LoopingCall<5>(CallLaterOnce.schedule, *(), **{})()>
<DelayedCall 0x7fd2aa264430 [59.99124884605408s] called=0 cancelled=0 LoopingCall<60>(Downloader._slot_gc, *(), **{})()>
<DelayedCall 0x7fd2aa264910 [59.99353098869324s] called=0 cancelled=0 LoopingCall<60.0>(LogStats.log, *(<SimpleSpider 'simple' at 0x7fd2aa1fe8b0>,), **{})()>
<DelayedCall 0x7fd2aa216b50 [59.99470520019531s] called=0 cancelled=0 LoopingCall<60.0>(MemoryUsage.update, *(), **{})()>
------------------------------------------------------------------------------------------- Captured stderr call -------------------------------------------------------------------------------------------
Coverage.py warning: --include is ignored because --source is set (include-ignored)
Coverage.py warning: --include is ignored because --source is set (include-ignored)
-------------------------------------------------------------------------------------------- Captured log call ---------------------------------------------------------------------------------------------
INFO     scrapy.crawler:crawler.py:59 Overridden settings:
{}
INFO     scrapy.extensions.telnet:telnet.py:55 Telnet Password: cf1c18ea157e22e4
INFO     scrapy.middleware:middleware.py:45 Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
'scrapy.extensions.telnet.TelnetConsole',
'scrapy.extensions.memusage.MemoryUsage',
'scrapy.extensions.logstats.LogStats']
_________________________________________________________________________ ERROR at teardown of CrawlTestCase.test_retry_conn_lost __________________________________________________________________________
'NoneType' object is not iterable

During handling of the above exception, another exception occurred:
NOTE: Incompatible Exception Representation, displaying natively:

twisted.trial.util.DirtyReactorAggregateError: Reactor was unclean.
DelayedCalls: (set twisted.internet.base.DelayedCall.debug = True to debug)
<DelayedCall 0x7fd2aa28ed90 [4.992656707763672s] called=0 cancelled=0 LoopingCall<5>(CallLaterOnce.schedule, *(), **{})()>
<DelayedCall 0x7fd2aa27ea90 [59.99085068702698s] called=0 cancelled=0 LoopingCall<60>(Downloader._slot_gc, *(), **{})()>
<DelayedCall 0x7fd2aa25aa00 [59.99219727516174s] called=0 cancelled=0 LoopingCall<60.0>(LogStats.log, *(<SimpleSpider 'simple' at 0x7fd2aa299520>,), **{})()>
<DelayedCall 0x7fd2aa27e2e0 [59.99367690086365s] called=0 cancelled=0 LoopingCall<60.0>(MemoryUsage.update, *(), **{})()>
------------------------------------------------------------------------------------------- Captured stderr call -------------------------------------------------------------------------------------------
Coverage.py warning: --include is ignored because --source is set (include-ignored)
Coverage.py warning: --include is ignored because --source is set (include-ignored)
-------------------------------------------------------------------------------------------- Captured log call ---------------------------------------------------------------------------------------------
INFO     scrapy.crawler:crawler.py:59 Overridden settings:
{}
INFO     scrapy.extensions.telnet:telnet.py:55 Telnet Password: 70c3dcca4b73dc8a
INFO     scrapy.middleware:middleware.py:45 Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
'scrapy.extensions.telnet.TelnetConsole',
'scrapy.extensions.memusage.MemoryUsage',
'scrapy.extensions.logstats.LogStats']
_________________________________________________________________________ ERROR at teardown of CrawlTestCase.test_retry_dns_error __________________________________________________________________________
'NoneType' object is not iterable

During handling of the above exception, another exception occurred:
NOTE: Incompatible Exception Representation, displaying natively:

twisted.trial.util.DirtyReactorAggregateError: Reactor was unclean.
DelayedCalls: (set twisted.internet.base.DelayedCall.debug = True to debug)
<DelayedCall 0x7fd2aa2dc760 [4.991128921508789s] called=0 cancelled=0 LoopingCall<5>(CallLaterOnce.schedule, *(), **{})()>
<DelayedCall 0x7fd2aa2dc640 [59.99077486991882s] called=0 cancelled=0 LoopingCall<60.0>(LogStats.log, *(<SimpleSpider 'simple' at 0x7fd2aa2c5520>,), **{})()>
<DelayedCall 0x7fd2aa2dc3a0 [59.989307165145874s] called=0 cancelled=0 LoopingCall<60>(Downloader._slot_gc, *(), **{})()>
<DelayedCall 0x7fd2aa2dc610 [59.992241621017456s] called=0 cancelled=0 LoopingCall<60.0>(MemoryUsage.update, *(), **{})()>
------------------------------------------------------------------------------------------- Captured stderr call -------------------------------------------------------------------------------------------
Coverage.py warning: --include is ignored because --source is set (include-ignored)
Coverage.py warning: --include is ignored because --source is set (include-ignored)
-------------------------------------------------------------------------------------------- Captured log call ---------------------------------------------------------------------------------------------
INFO     scrapy.crawler:crawler.py:59 Overridden settings:
{}
INFO     scrapy.extensions.telnet:telnet.py:55 Telnet Password: 1cd2803cc8b5b086
INFO     scrapy.middleware:middleware.py:45 Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
'scrapy.extensions.telnet.TelnetConsole',
'scrapy.extensions.memusage.MemoryUsage',
'scrapy.extensions.logstats.LogStats']
================================================================================================= FAILURES =================================================================================================
__________________________________________________________________________________ CrawlTestCase.test_retry_conn_aborted ___________________________________________________________________________________
'NoneType' object is not iterable

During handling of the above exception, another exception occurred:
NOTE: Incompatible Exception Representation, displaying natively:

twisted.internet.defer.TimeoutError: <tests.test_crawl.CrawlTestCase testMethod=test_retry_conn_aborted> (test_retry_conn_aborted) still running at 120.0 secs
------------------------------------------------------------------------------------------- Captured stderr call -------------------------------------------------------------------------------------------
Coverage.py warning: --include is ignored because --source is set (include-ignored)
Coverage.py warning: --include is ignored because --source is set (include-ignored)
___________________________________________________________________________________ CrawlTestCase.test_retry_conn_failed ___________________________________________________________________________________
'NoneType' object is not iterable

During handling of the above exception, another exception occurred:
NOTE: Incompatible Exception Representation, displaying natively:

twisted.internet.defer.TimeoutError: <tests.test_crawl.CrawlTestCase testMethod=test_retry_conn_failed> (test_retry_conn_failed) still running at 120.0 secs
------------------------------------------------------------------------------------------- Captured stderr call -------------------------------------------------------------------------------------------
Coverage.py warning: --include is ignored because --source is set (include-ignored)
Coverage.py warning: --include is ignored because --source is set (include-ignored)
-------------------------------------------------------------------------------------------- Captured log call ---------------------------------------------------------------------------------------------
INFO     scrapy.crawler:crawler.py:59 Overridden settings:
{}
INFO     scrapy.extensions.telnet:telnet.py:55 Telnet Password: cf1c18ea157e22e4
INFO     scrapy.middleware:middleware.py:45 Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
'scrapy.extensions.telnet.TelnetConsole',
'scrapy.extensions.memusage.MemoryUsage',
'scrapy.extensions.logstats.LogStats']
____________________________________________________________________________________ CrawlTestCase.test_retry_conn_lost ____________________________________________________________________________________
'NoneType' object is not iterable

During handling of the above exception, another exception occurred:
NOTE: Incompatible Exception Representation, displaying natively:

twisted.internet.defer.TimeoutError: <tests.test_crawl.CrawlTestCase testMethod=test_retry_conn_lost> (test_retry_conn_lost) still running at 120.0 secs
------------------------------------------------------------------------------------------- Captured stderr call -------------------------------------------------------------------------------------------
Coverage.py warning: --include is ignored because --source is set (include-ignored)
Coverage.py warning: --include is ignored because --source is set (include-ignored)
-------------------------------------------------------------------------------------------- Captured log call ---------------------------------------------------------------------------------------------
INFO     scrapy.crawler:crawler.py:59 Overridden settings:
{}
INFO     scrapy.extensions.telnet:telnet.py:55 Telnet Password: 70c3dcca4b73dc8a
INFO     scrapy.middleware:middleware.py:45 Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
'scrapy.extensions.telnet.TelnetConsole',
'scrapy.extensions.memusage.MemoryUsage',
'scrapy.extensions.logstats.LogStats']
____________________________________________________________________________________ CrawlTestCase.test_retry_dns_error ____________________________________________________________________________________
'NoneType' object is not iterable

During handling of the above exception, another exception occurred:
NOTE: Incompatible Exception Representation, displaying natively:

twisted.internet.defer.TimeoutError: <tests.test_crawl.CrawlTestCase testMethod=test_retry_dns_error> (test_retry_dns_error) still running at 120.0 secs
------------------------------------------------------------------------------------------- Captured stderr call -------------------------------------------------------------------------------------------
Coverage.py warning: --include is ignored because --source is set (include-ignored)
Coverage.py warning: --include is ignored because --source is set (include-ignored)
-------------------------------------------------------------------------------------------- Captured log call ---------------------------------------------------------------------------------------------
INFO     scrapy.crawler:crawler.py:59 Overridden settings:
{}
INFO     scrapy.extensions.telnet:telnet.py:55 Telnet Password: 1cd2803cc8b5b086
INFO     scrapy.middleware:middleware.py:45 Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
'scrapy.extensions.telnet.TelnetConsole',
'scrapy.extensions.memusage.MemoryUsage',
'scrapy.extensions.logstats.LogStats']
============================================================================================= warnings summary =============================================================================================
.tox/py38/lib/python3.8/site-packages/_pytest/config/__init__.py:1233
/Users/eus/zyte/scrapy/.tox/py38/lib/python3.8/site-packages/_pytest/config/__init__.py:1233: PytestConfigWarning: Unknown config option: flake8-ignore

  self._warn_or_fail_if_strict(f"Unknown config option: {key}\n")

.tox/py38/lib/python3.8/site-packages/_pytest/config/__init__.py:1233
/Users/eus/zyte/scrapy/.tox/py38/lib/python3.8/site-packages/_pytest/config/__init__.py:1233: PytestConfigWarning: Unknown config option: flake8-max-line-length

  self._warn_or_fail_if_strict(f"Unknown config option: {key}\n")

.tox/py38/lib/python3.8/site-packages/twisted/web/test/test_webclient.py:28
/Users/eus/zyte/scrapy/.tox/py38/lib/python3.8/site-packages/twisted/web/test/test_webclient.py:28: DeprecationWarning: twisted.test.proto_helpers.StringTransport was deprecated in Twisted 19.7.0: Please use twisted.internet.testing.StringTransport instead.
  from twisted.test.proto_helpers import (

.tox/py38/lib/python3.8/site-packages/twisted/web/test/test_webclient.py:28
/Users/eus/zyte/scrapy/.tox/py38/lib/python3.8/site-packages/twisted/web/test/test_webclient.py:28: DeprecationWarning: twisted.test.proto_helpers.waitUntilAllDisconnected was deprecated in Twisted 19.7.0: Please use twisted.internet.testing.waitUntilAllDisconnected instead.
  from twisted.test.proto_helpers import (

.tox/py38/lib/python3.8/site-packages/twisted/web/test/test_webclient.py:28
/Users/eus/zyte/scrapy/.tox/py38/lib/python3.8/site-packages/twisted/web/test/test_webclient.py:28: DeprecationWarning: twisted.test.proto_helpers.EventLoggingObserver was deprecated in Twisted 19.7.0: Please use twisted.internet.testing.EventLoggingObserver instead.
  from twisted.test.proto_helpers import (

.tox/py38/lib/python3.8/site-packages/twisted/web/test/test_webclient.py:1646
/Users/eus/zyte/scrapy/.tox/py38/lib/python3.8/site-packages/twisted/web/test/test_webclient.py:1646: DeprecationWarning: twisted.web.client.HTTPPageGetter was deprecated in Twisted 16.7.0: please use https://pypi.org/project/treq/ or twisted.web.client.Agent instead
  protocolClass = client.HTTPPageGetter

.tox/py38/lib/python3.8/site-packages/twisted/web/test/test_webclient.py:1675
/Users/eus/zyte/scrapy/.tox/py38/lib/python3.8/site-packages/twisted/web/test/test_webclient.py:1675: DeprecationWarning: twisted.web.client.HTTPPageGetter was deprecated in Twisted 16.7.0: please use https://pypi.org/project/treq/ or twisted.web.client.Agent instead
  protocolClass = client.HTTPPageGetter

.tox/py38/lib/python3.8/site-packages/twisted/web/test/test_webclient.py:1706
/Users/eus/zyte/scrapy/.tox/py38/lib/python3.8/site-packages/twisted/web/test/test_webclient.py:1706: DeprecationWarning: twisted.web.client.HTTPPageDownloader was deprecated in Twisted 16.7.0: please use https://pypi.org/project/treq/ or twisted.web.client.Agent instead
  protocolClass = client.HTTPPageDownloader

.tox/py38/lib/python3.8/site-packages/twisted/web/test/test_webclient.py:1716
/Users/eus/zyte/scrapy/.tox/py38/lib/python3.8/site-packages/twisted/web/test/test_webclient.py:1716: DeprecationWarning: twisted.web.client.HTTPPageDownloader was deprecated in Twisted 16.7.0: please use https://pypi.org/project/treq/ or twisted.web.client.Agent instead
  protocolClass = client.HTTPPageDownloader

tests/test_crawl.py::CrawlSpiderTestCase::test_response_ssl_certificate
tests/test_crawl.py::CrawlSpiderTestCase::test_response_ssl_certificate_empty_response
/Users/eus/zyte/scrapy/scrapy/core/downloader/contextfactory.py:54: DeprecationWarning: Passing method to twisted.internet.ssl.CertificateOptions was deprecated in Twisted 17.1.0. Please use a combination of insecurelyLowerMinimumTo, raiseMinimumTo, and lowerMaximumSecurityTo instead, as Twisted will correctly configure the method.
  return CertificateOptions(

-- Docs: https://docs.pytest.org/en/stable/warnings.html

---------- coverage: platform darwin, python 3.8.6-final-0 -----------
Coverage XML written to file coverage.xml

========================================================================================= short test summary info ==========================================================================================
FAILED tests/test_crawl.py::CrawlTestCase::test_retry_conn_aborted
FAILED tests/test_crawl.py::CrawlTestCase::test_retry_conn_failed
FAILED tests/test_crawl.py::CrawlTestCase::test_retry_conn_lost
FAILED tests/test_crawl.py::CrawlTestCase::test_retry_dns_error
ERROR tests/test_crawl.py::CrawlTestCase::test_retry_conn_aborted
ERROR tests/test_crawl.py::CrawlTestCase::test_retry_conn_failed
ERROR tests/test_crawl.py::CrawlTestCase::test_retry_conn_lost
ERROR tests/test_crawl.py::CrawlTestCase::test_retry_dns_error
========================================================== 4 failed, 28 passed, 7 skipped, 1 xfailed, 11 warnings, 4 errors in 521.28s (0:08:41) ===========================================================
/Users/eus/zyte/scrapy/.tox/py38/lib/python3.8/site-packages/testfixtures/logcapture.py:81: UserWarning: LogCapture instances not uninstalled by shutdown, loggers captured:
(None,)
(None,)
(None,)
(None,)
warnings.warn(
ERROR: InvocationError for command /Users/eus/zyte/scrapy/.tox/py38/bin/py.test --cov=scrapy --cov-report=xml --cov-report= tests/test_crawl.py (exited with code 1)
_________________________________________________________________________________________________ summary __________________________________________________________________________________________________
ERROR:   py38: commands failed

Gallaecio (Member) left a comment


I would love to understand that as well, but, if it works…


Successfully merging this pull request may close these issues.

is_generator_with_return_value raises IndentationError with a flush left doc string
3 participants
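The flush-left docstring failure behind the linked issue can be reproduced without Scrapy at all: `textwrap.dedent` only strips whitespace common to every non-blank line, so a single docstring line at column zero keeps a method's source fully indented, and `ast.parse` then fails on the first line. A minimal illustration, assuming (as #4477 describes) that the pre-fix code dedented the source textually before parsing:

```python
import ast
import textwrap

# Method source as inspect.getsource() would return it: indented, with
# a docstring whose continuation line starts at column zero.
SRC = '''    def parse(self, response):
        """A docstring with a
flush-left continuation line."""
        yield 1
'''

# textwrap.dedent strips only whitespace common to all non-blank lines;
# the flush-left line has none, so the text comes back unchanged.
assert textwrap.dedent(SRC) == SRC

# Parsing the still-indented source then fails at the very first line.
try:
    ast.parse(SRC)
except IndentationError as exc:
    print(f"{type(exc).__name__}: {exc.msg}")  # IndentationError: unexpected indent
```

Because the flush-left line lives inside a string literal, the tokenizer itself has no problem with it; the failure comes purely from the textual dedent not firing, which is why a structural workaround (parsing the source without dedenting it) avoids the error.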