But now I have a problem with the LOG_LEVEL setting.
from scrapy.crawler import CrawlerProcess

settings = {
    'USER_AGENT': 'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.85 Safari/537.36',
    'LOG_ENABLED': True,
    'LOG_LEVEL': 'ERROR'
}
process = CrawlerProcess(settings=settings)
When I execute this code locally, everything is correct: I only receive log messages at LOG_LEVEL ERROR. But when I execute the same code in AWS Lambda, I receive log messages at LOG_LEVEL DEBUG, and I don't know how to resolve it.
It is as if the Spider doesn't receive the Crawler's settings. Any help?
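For what it's worth, a workaround that has helped in similar situations (a sketch, not confirmed as the cause here): the AWS Lambda Python runtime attaches its own handler to the root logger before user code runs, and a pre-configured root logger can keep Scrapy's LOG_LEVEL setting from taking effect. Forcing the level on the root logger and its handlers before starting the crawl may restore the expected behaviour; the helper name below is hypothetical:

```python
import logging

def force_log_level(level=logging.ERROR):
    # The Lambda runtime installs a handler on the root logger before user
    # code runs; resetting both the logger and every attached handler to the
    # desired level keeps DEBUG records from slipping through.
    root = logging.getLogger()
    root.setLevel(level)
    for handler in root.handlers:
        handler.setLevel(level)

# Call this before process.crawl(...) / process.start()
force_log_level(logging.ERROR)
```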
Thanks!
I run Scrapy from a script (https://doc.scrapy.org/en/latest/topics/practices.html#run-scrapy-from-a-script), launched from AWS Lambda. I build the project with SAM and everything deploys correctly.