def make_requests_from_url(self, url):
    # DEFAULT_REQUEST_HEADERS (from settings.py) includes a Cookie header
    return scrapy.http.Request(url, headers=DEFAULT_REQUEST_HEADERS)
I know that make_requests_from_url is only called for the start_urls, and I expected the first request to send the Cookie I set in DEFAULT_REQUEST_HEADERS; however, it does not.
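For context, here is a minimal sketch of two documented alternatives, assuming the goal is just to get a cookie onto the start requests (the spider name and the cookie name/value below are made up):

import scrapy

class MySpider(scrapy.Spider):
    name = 'example'  # hypothetical
    start_urls = ['http://example.com/']

    def make_requests_from_url(self, url):
        # Alternative 1: pass cookies via the dedicated argument,
        # which CookiesMiddleware stores in its jar and resends.
        return scrapy.Request(url, cookies={'sessionid': 'abc123'})

    # Alternative 2: keep the raw Cookie header but set
    # meta={'dont_merge_cookies': True} so CookiesMiddleware
    # leaves this request alone:
    #   scrapy.Request(url,
    #                  headers={'Cookie': 'sessionid=abc123'},
    #                  meta={'dont_merge_cookies': True})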
2. Share settings between spiders.
I have multiple spiders in the project which share most of their settings, like RandomAgentMiddleware, RandomProxyMiddleware, UserAgent, DEFAULT_REQUEST_HEADERS, etc.; however, they are configured in settings.py for each spider.
Is it possible to share these settings?
COOKIES_ENABLED is set to True.
So the first issue is that CookiesMiddleware sets the Cookie header even if the cookiejar is empty or, more broadly, that it discards a Cookie header set on a request instead of adding to it. This happens here. I think this is a valid concern. A pull request to fix it is welcome.
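Until that's changed, a possible workaround is a small downloader middleware that re-applies a stashed header after CookiesMiddleware has run. This is only a sketch: the middleware name, the meta key, and the exact priority are my own choices (CookiesMiddleware sits at 700 in DOWNLOADER_MIDDLEWARES_BASE, so anything higher runs after it in process_request):

# In settings.py (priority must be > 700 so this middleware's
# process_request runs after CookiesMiddleware's):
# DOWNLOADER_MIDDLEWARES = {
#     'myproject.middlewares.PreserveCookieHeaderMiddleware': 701,
# }

class PreserveCookieHeaderMiddleware:
    """Re-applies a Cookie header that CookiesMiddleware discarded."""

    def process_request(self, request, spider):
        preset = request.meta.get('preset_cookie')  # hypothetical meta key
        if preset:
            request.headers['Cookie'] = preset
        return None  # let the request continue through the chain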
Sorry, I don't get the second issue. All settings defined in settings.py are already shared between spiders; you can't configure per-spider settings in the settings.py file. What do you mean?
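If the goal is per-spider overrides on top of shared defaults, the usual pattern is the custom_settings class attribute: common values stay in the project-wide settings.py, and each spider overrides only what differs. A minimal sketch with made-up names and values:

import scrapy

class SpiderA(scrapy.Spider):
    name = 'spider_a'  # hypothetical
    # Only what differs from the shared settings.py goes here:
    custom_settings = {
        'DOWNLOAD_DELAY': 2,
    }

class SpiderB(scrapy.Spider):
    # No custom_settings: inherits everything from settings.py.
    name = 'spider_b'  # hypothetical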
A question about priorities: when creating a Request, if a cookie name is specified both directly as part of headers['Cookie'] and as a value in the cookies argument, which one should be used? I feel tempted to keep the one in cookies, but that's just my opinion.
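To make the conflict concrete (cookie names and values are illustrative):

import scrapy

# 'currency' is specified both ways; under the suggestion above,
# the cookies-argument value ('EUR') would win over the raw
# header value ('USD').
request = scrapy.Request(
    'https://example.com',
    headers={'Cookie': 'currency=USD; country=UY'},
    cookies={'currency': 'EUR'},
)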