
tests/test_command_check.py::CheckCommandTest depends on example.com #5404

Closed
wRAR opened this issue Feb 8, 2022 · 2 comments · Fixed by #5407

wRAR commented Feb 8, 2022

Sometimes we get an assertion failure of the form `'OK' not in <check output>`, with the check output being:

```
/home/runner/work/scrapy/scrapy/.tox/asyncio/lib/python3.10/site-packages/coverage/inorout.py:472: CoverageWarning: --include is ignored because --source is set (include-ignored)
 self.warn("--include is ignored because --source is set", slug="include-ignored")
E
======================================================================
ERROR: [check_spider] parse (errback)
----------------------------------------------------------------------
Traceback (most recent call last):
 File "/home/runner/work/scrapy/scrapy/.tox/asyncio/lib/python3.10/site-packages/twisted/internet/defer.py", line 1660, in _inlineCallbacks
 result = current_context.run(gen.send, result)
StopIteration: <404 http://example.com>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
 File "/home/runner/work/scrapy/scrapy/scrapy/core/spidermw.py", line 52, in _process_spider_input
 result = method(response=response, spider=spider)
 File "/home/runner/work/scrapy/scrapy/scrapy/spidermiddlewares/httperror.py", line 45, in process_spider_input
 raise HttpError(response, 'Ignoring non-200 response')
scrapy.spidermiddlewares.httperror.HttpError: Ignoring non-200 response

----------------------------------------------------------------------
Ran 0 contracts in 0.139s

FAILED (errors=1)
```

This seems to be caused by getting a 404 from http://example.com, which I can reproduce on some requests. We shouldn't be making actual network requests in any case, but here it is also breaking the test.
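One way to avoid hitting example.com at all is to serve a canned 200 response from a local server and point the test spider at it. This is only a stdlib sketch of the idea (Scrapy's test suite has its own helpers for this); the handler class and helper name below are hypothetical:

```python
import http.server
import threading


class OkHandler(http.server.BaseHTTPRequestHandler):
    """Always answer 200 with a tiny HTML body, unlike example.com,
    which intermittently returns 404."""

    def do_GET(self):
        body = b"<html><body>ok</body></html>"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Keep test output quiet.
        pass


def start_local_server():
    """Start a throwaway HTTP server on a free port; return it and its URL."""
    server = http.server.HTTPServer(("127.0.0.1", 0), OkHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server, f"http://127.0.0.1:{server.server_port}/"
```

A test would then use the returned URL as the spider's start URL instead of http://example.com, and call `server.shutdown()` in teardown.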


wRAR commented Feb 10, 2022

Should we mark these as xfail for now?
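A minimal sketch of what that could look like with pytest (the marker name and test body are placeholders, not the actual test code; `strict=False` means the test still passes whenever example.com does return 200):

```python
import pytest

# Hypothetical shared marker for the flaky contract-check tests.
flaky_network = pytest.mark.xfail(
    reason="depends on example.com, which intermittently returns 404 (#5404)",
    strict=False,
)


@flaky_network
def test_check_command():
    ...  # the real CheckCommandTest logic would go here
```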

@Gallaecio

I’m OK with that as a temporary measure until we stop using http://example.com altogether. Although it may be better to instead point them to e.g. https://toscrape.com/, which I imagine will get rid of the issue most of the time (but still, as you said, the end goal should remain not to make network requests when avoidable).
