scrapyd writing logs to console instead of log files in docker container #470
Comments
Same question here. I changed the Scrapy version back to 2.7.1 and the logs came back.
Confirmed, that was it. Pinning the Scrapy version at 2.7.1 solves this problem. Thanks.
@blacksteel1288 So does the problem occur when using Scrapy 2.8 in combination with Scrapyd?
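The version boundary discussed in this thread can be expressed as a small helper. The cutoff at 2.8 is inferred from the comments above; treating every 2.8+ release as affected (with pre-1.4.0 Scrapyd) is an assumption:

```python
def is_affected(scrapy_version: str) -> bool:
    """Return True if this Scrapy version showed the missing-log-file
    behaviour with pre-1.4.0 Scrapyd, per the reports in this thread."""
    # 2.7.1 works; 2.8.0 does not (assumption: all 2.8+ releases behave alike).
    major, minor = (int(part) for part in scrapy_version.split(".")[:2])
    return (major, minor) >= (2, 8)

print(is_affected("2.7.1"))  # False
print(is_affected("2.8.0"))  # True
```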
Merging into #369 as duplicate.
Hi @jpmckinney, I tested this patch and verified it works correctly with Scrapy 2.8 -- log files are created as expected. Will there be a new release of Scrapyd soon? Thank you!
1.4.0 is now available: https://pypi.org/project/scrapyd/ 🎉
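For anyone upgrading, the corresponding pins in requirements.txt might look like this (the exact specifiers are illustrative; per this thread, Scrapyd 1.4.0 restores log files with Scrapy 2.8+):

```text
scrapyd>=1.4.0
scrapy>=2.8
```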
Great, thank you. You may also want to bump the release here in the repo to match PyPI -- https://github.com/scrapy/scrapyd/releases
Thanks for the reminder! Done now.
I've been using Scrapyd with this Docker configuration for a while with no issues, but now Scrapyd is only sending the Scrapy logs to the console instead of to a log file in the /logs directory. I'm not sure what has changed, but I first noticed this on a new install, and now both the new and the old install have the same issue.
I can access the Scrapyd web UI with no issues and can run a spider, but when I browse to the spider's log directory (e.g. /logs/{project}/{myspider}), no log files are available there, nor in the data directories when I access them directly from the command line. Since Scrapyd is successfully creating the parent spider directory for the logs automatically, I don't believe this is a permissions issue.
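One quick way to double-check the permissions theory is to probe whether the logs directory is actually writable from inside the container. This is a generic sketch; the path is illustrative and should be the one mapped in docker-compose.yml:

```python
import os

# Probe whether the Scrapyd process can write into the logs directory.
# "./logs" is an illustrative path; substitute the mapped volume path.
logs_dir = "./logs"
os.makedirs(logs_dir, exist_ok=True)
probe = os.path.join(logs_dir, ".write-test")
with open(probe, "w") as f:
    f.write("ok")
os.remove(probe)
print("logs dir is writable:", os.access(logs_dir, os.W_OK))
```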
Here are the relevant files:
docker-compose.yml
Dockerfile
requirements.txt
start-scrapyd.sh
scrapyd.conf
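For reference, the logging-related keys in a typical scrapyd.conf look like this. The values shown are illustrative defaults, not the actual file from this report:

```ini
[scrapyd]
# Directory where per-spider log files are written; setting this
# empty disables log file storage entirely.
logs_dir = logs
# How many finished jobs (and their logs) to keep per spider.
jobs_to_keep = 5
```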
And, specifically in my scrapy settings.py file, these are the only LOG settings used:
I've tried various troubleshooting steps, including not mapping any volumes in docker-compose and letting Scrapyd create the directories/files inside the container only, to rule out a permissions issue, but the result is the same: no log files, just Scrapy logs sent to the console when running a spider.
I don't see any error messages or relevant logs. I also tried setting debug = on inside scrapyd.conf, but no additional info was shown.
Is it possible that some upstream library has changed that would cause this?
Here are the results of a "pip list" inside the container: