Cookies from the Cookie request header are not processed #1992
What about the `COOKIES_ENABLED` setting in settings.py? Did you set it to False?
So the first issue is that `CookiesMiddleware` sets the `Cookie` header even if the cookiejar is empty or, more broadly, that it discards a `Cookie` header set on a request instead of adding to it. This happens here. I think this is a valid concern; a pull request to fix it is welcome.

Sorry, I don't understand the second issue. All settings defined in settings.py are shared between spiders; you can't configure per-spider settings in the settings.py file. What do you mean?
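For illustration, here is a minimal sketch of the kind of fix being discussed, assuming the middleware still drops any pre-set `Cookie` header in `process_request` before writing the jar's cookies. The class name and the naive byte-string merge are mine, not Scrapy API:

```python
from scrapy.downloadermiddlewares.cookies import CookiesMiddleware


class HeaderPreservingCookiesMiddleware(CookiesMiddleware):
    """Illustrative subclass: keep a Cookie header the spider set by hand."""

    def process_request(self, request, spider):
        manual = request.headers.get("Cookie")             # header set by the spider, if any (bytes or None)
        result = super().process_request(request, spider)  # stock behaviour: overwrite with the jar's cookies
        if manual:
            from_jar = request.headers.get("Cookie")
            if not from_jar:
                request.headers["Cookie"] = manual
            elif manual not in from_jar:
                # naive merge: append the hand-set pairs after the jar's pairs
                request.headers["Cookie"] = from_jar + b"; " + manual
        return result
```

To try something like this, the stock `scrapy.downloadermiddlewares.cookies.CookiesMiddleware` entry (priority 700 by default) would be disabled in `DOWNLOADER_MIDDLEWARES` and this subclass registered in its place.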
A question about priorities: when creating a …
@elacuesta yeah, I agree that using the value set in …
Reopening as per #4823
As setting cookies directly in headers is not an option because of #4823, the only remaining way to set custom cookies is to assign them directly to the CookieJar (which is later used by `CookiesMiddleware`), as requested in #1878. With the possibility to set a cookie directly in the CookieJar from spider code, we don't need to maintain the possibility to process cookies from the `Cookie` header.
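A minimal sketch of the "seed the jar directly" idea, for illustration only: it leans on two non-public details that may change between Scrapy versions, namely the middleware's internal `jars` mapping and `scrapy.http.cookies.CookieJar.set_cookie()` delegating to the stdlib jar; the cookie name, value, and domain are placeholders.

```python
from http.cookiejar import Cookie

from scrapy.downloadermiddlewares.cookies import CookiesMiddleware


def make_cookie(name, value, domain):
    # http.cookiejar.Cookie has a long signature; only name/value/domain/path
    # really matter for matching requests here.
    return Cookie(
        version=0, name=name, value=value,
        port=None, port_specified=False,
        domain=domain, domain_specified=True,
        domain_initial_dot=domain.startswith("."),
        path="/", path_specified=True,
        secure=False, expires=None, discard=True,
        comment=None, comment_url=None, rest={}, rfc2109=False,
    )


class PreseededCookiesMiddleware(CookiesMiddleware):
    """Illustrative subclass that pre-loads each cookiejar with a fixed cookie."""

    def __init__(self, debug=False):
        super().__init__(debug)
        self._seeded = set()

    def process_request(self, request, spider):
        jarkey = request.meta.get("cookiejar")  # None selects the default jar
        if jarkey not in self._seeded:
            # self.jars is the middleware's internal defaultdict of CookieJar objects
            self.jars[jarkey].set_cookie(make_cookie("session", "abc123", ".example.com"))
            self._seeded.add(jarkey)
        return super().process_request(request, spider)
```

As with the previous sketch, it would replace the stock middleware entry in `DOWNLOADER_MIDDLEWARES`.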
Technically, we don't need to process the …
The real issue is that there are multiple interpretations of this statement: …
I am new to Scrapy, and I have run into some problems I could not find answers to on Google, so I am posting them here:

1. Cookies do not work even when set in `DEFAULT_REQUEST_HEADERS`:
I know that `make_requests_from_url` is only called once, for the `start_urls`, and in my opinion the first request should therefore send the cookie I set in `DEFAULT_REQUEST_HEADERS`; however, it does not.

2. Share settings between spiders.
I have multiple spiders in the project which share most of their settings, such as `RandomAgentMiddleware`, `RandomProxyMiddleware`, `UserAgent`, `DEFAULT_REQUEST_HEADERS`, and so on; however, they are currently configured in settings.py for each spider. Is it possible to share these settings?
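On the settings question: everything in settings.py already applies to every spider in the project, so sharing is the default; per-spider tweaks can go into a spider's `custom_settings`. A minimal sketch of that split, with the `myproject.middlewares` module paths assumed for the custom middlewares mentioned above:

```python
# settings.py -- applies to every spider in the project
DOWNLOADER_MIDDLEWARES = {
    "myproject.middlewares.RandomAgentMiddleware": 400,   # assumed module path
    "myproject.middlewares.RandomProxyMiddleware": 410,   # assumed module path
}
DEFAULT_REQUEST_HEADERS = {
    "Accept-Language": "en",
}

# spiders/one.py -- only this spider deviates from the shared defaults
import scrapy


class OneSpider(scrapy.Spider):
    name = "one"
    start_urls = ["https://example.com/"]  # placeholder URL

    # custom_settings overrides settings.py for this spider only
    custom_settings = {
        "DEFAULT_REQUEST_HEADERS": {"Accept-Language": "de"},
    }

    def parse(self, response):
        pass
```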
The `COOKIES_ENABLED` setting is set to True.
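Related to question 1: since a `Cookie` line in `DEFAULT_REQUEST_HEADERS` is not processed by `CookiesMiddleware` (the subject of this issue), the documented way to send cookies with the very first requests is the `cookies` argument of `Request`, typically from `start_requests()`. A minimal sketch with placeholder URL and cookie values:

```python
import scrapy


class ExampleSpider(scrapy.Spider):
    name = "example"
    start_urls = ["https://example.com/"]  # placeholder URL

    def start_requests(self):
        # Pass cookies explicitly on the first requests instead of relying on
        # a Cookie header in DEFAULT_REQUEST_HEADERS.
        for url in self.start_urls:
            yield scrapy.Request(
                url,
                cookies={"sessionid": "abc123"},  # placeholder cookie
                callback=self.parse,
            )

    def parse(self, response):
        self.logger.info("Got %s", response.url)
```

With `COOKIES_ENABLED` left at its default of True, `CookiesMiddleware` stores these cookies in its jar and re-sends them on subsequent requests.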