
Support for async def callbacks. #4269

Merged — 2 commits merged into scrapy:master on Jan 31, 2020

Conversation

@wRAR (Contributor) commented Jan 9, 2020

This is not very useful yet, as it doesn't allow yield in callbacks, but it's a first step.
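The limitation described above (callbacks may return results but may not yet yield them) can be illustrated with a minimal sketch using plain asyncio, with no Scrapy dependency; the `FakeResponse` class and `run_callback` driver here are hypothetical stand-ins for what the engine might do, not Scrapy's actual internals:

```python
import asyncio

# Hypothetical stand-in for a Scrapy response object.
class FakeResponse:
    status = 200

# An async def callback: it may await, and (per this PR) it may
# *return* a list of results, but it may not yet *yield* them.
async def parse(response):
    await asyncio.sleep(0)  # any awaitable works here
    return [{'status': response.status}]

# Sketch of what an engine might do: if calling the callback
# produced a coroutine, run it to obtain the actual output.
def run_callback(callback, response):
    result = callback(response)
    if asyncio.iscoroutine(result):
        result = asyncio.run(result)
    return result

items = run_callback(parse, FakeResponse())
print(items)  # → [{'status': 200}]
```

Because `parse` is a coroutine function, calling it only creates a coroutine object; something has to drive it to completion before the returned list exists, which is why yield-based generator callbacks need separate support.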

@codecov (bot) commented Jan 9, 2020

Codecov Report

Merging #4269 into master will decrease coverage by 0.14%.
The diff coverage is 100%.

@@            Coverage Diff             @@
##           master    #4269      +/-   ##
==========================================
- Coverage   84.06%   83.92%   -0.15%     
==========================================
  Files         166      166              
  Lines        9730     9870     +140     
  Branches     1454     1469      +15     
==========================================
+ Hits         8180     8283     +103     
- Misses       1296     1334      +38     
+ Partials      254      253       -1
Impacted Files Coverage Δ
scrapy/utils/spider.py 75% <100%> (+1.08%) ⬆️
scrapy/utils/test.py 49.35% <0%> (-8.99%) ⬇️
scrapy/utils/ftp.py 23.8% <0%> (-6.2%) ⬇️
scrapy/pipelines/files.py 61.66% <0%> (-3.99%) ⬇️
scrapy/core/downloader/handlers/datauri.py 93.33% <0%> (-0.79%) ⬇️
scrapy/crawler.py 89.26% <0%> (-0.36%) ⬇️
scrapy/core/downloader/handlers/http10.py 100% <0%> (ø) ⬆️
scrapy/http/response/text.py 100% <0%> (ø) ⬆️
scrapy/http/request/__init__.py 100% <0%> (ø) ⬆️
scrapy/core/downloader/handlers/file.py 100% <0%> (ø) ⬆️
... and 14 more

@kmike (Member) commented Jan 15, 2020

Does it allow returning a list of Requests or items? If yes, it'd be nice to test it.

    await asyncio.sleep(0.2)
    status = await get_from_asyncio_queue(response.status)
    self.logger.info("Got response %d" % status)
    return [{'id': 1}, {'id': 2}]
@kmike (Member) replied Jan 30, 2020

Nice, so lists of items are supported. What about requests?

@kmike (Member) commented Jan 31, 2020

I'm merging it; test for behavior with Requests can be added separately. Thanks @wRAR!

@kmike kmike merged commit 22f7934 into scrapy:master Jan 31, 2020
2 checks passed
@wRAR wRAR added the asyncio label Aug 17, 2021
@wRAR wRAR deleted the asyncio-parse branch Aug 17, 2021