Contracts can't be written for callbacks that receive cb_kwargs
Steps to Reproduce
Run scrapy check with the following spider:
import scrapy

class CbKwargsContract(scrapy.Spider):
    name = 'cb_kwargs_contract'

    def start_requests(self):
        yield scrapy.Request("https://httpbin.org/get", cb_kwargs={"arg1": "foo"})

    def parse(self, response, arg1):
        """
        @url https://httpbin.org/get
        """
        self.logger.info(arg1)
Expected behavior:
Ideally it would be possible to pass arguments through the contract (e.g., @url https://httpbin.org/get foo); a sketch of that idea follows the version info below.
Actual behavior: scrapy check fails with a TypeError, because the contract-generated request carries no cb_kwargs and parse() is called without arg1.
Reproduces how often: 100%
Versions
Scrapy 1.7.3
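Given the expected behavior above, one possible direction is a custom contract that fills cb_kwargs on the contract-generated request via the documented Contract.adjust_request_args() hook. The following is a minimal sketch only: the @cb_kwargs contract name, the JSON argument format, and the module paths are assumptions for illustration, not something Scrapy 1.7.3 ships with.

# myproject/contracts.py (hypothetical module path)
import json

from scrapy.contracts import Contract

class CallbackKwargsContract(Contract):
    """Sketch: populate cb_kwargs from a docstring line such as
    @cb_kwargs {"arg1": "foo"}
    """
    name = 'cb_kwargs'  # assumed name; not a built-in contract in 1.7.3

    def adjust_request_args(self, args):
        # self.args holds the whitespace-split tokens after the contract name;
        # rejoin and parse them as JSON so the contract-generated request
        # carries the keyword arguments the callback expects.
        args['cb_kwargs'] = json.loads(' '.join(self.args))
        return args

# settings.py
SPIDER_CONTRACTS = {
    'myproject.contracts.CallbackKwargsContract': 10,
}

With that registered, the callback docstring in the spider above could add a @cb_kwargs {"arg1": "foo"} line after the @url line.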
I suspect this is due to the fact that the parsing method is not called as a Deferred callback, and the arguments are interpreted as positional instead of keyword ones. Refactoring the parsing method signature to add a default value for the expected argument (something like arg1=None) does work in that scenario, but I think a slightly more complex solution is in order to remove that restriction.
Scratch that, the request is generated from the contract definition, hence the cb_kwargs attribute is in fact empty.
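In the meantime, the default-value workaround mentioned in the previous comment does keep scrapy check from raising; a minimal sketch using the spider from the report:

    def parse(self, response, arg1=None):
        """
        @url https://httpbin.org/get
        """
        # The contract-generated request carries no cb_kwargs, so arg1 falls
        # back to None here; regular crawls still receive it via cb_kwargs.
        self.logger.info(arg1)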