2024-04-25 01:07:39 [scrapy.utils.log] INFO: Scrapy 2.11.0 started (bot: gazette)
2024-04-25 01:07:39 [scrapy.utils.log] INFO: Versions: lxml 4.9.3.0, libxml2 2.10.3, cssselect 1.2.0, parsel 1.8.1, w3lib 2.1.2, Twisted 22.10.0, Python 3.9.13 | packaged by conda-forge | (main, May 27 2022, 17:00:33) [Clang 13.0.1 ], pyOpenSSL 23.2.0 (OpenSSL 3.1.3 19 Sep 2023), cryptography 41.0.4, Platform macOS-14.4-arm64-arm-64bit
2024-04-25 01:07:39 [ba_andorinha] INFO: Collecting data from 2013-01-02 to 2024-04-25.
2024-04-25 01:07:39 [scrapy.addons] INFO: Enabled addons:
[]
2024-04-25 01:07:39 [py.warnings] WARNING: /Users/csamp/Documents/querido-diario/.venv/lib/python3.9/site-packages/scrapy/utils/request.py:254: ScrapyDeprecationWarning: '2.6' is a deprecated value for the 'REQUEST_FINGERPRINTER_IMPLEMENTATION' setting.

It is also the default value. In other words, it is normal to get this warning if you have not defined a value for the 'REQUEST_FINGERPRINTER_IMPLEMENTATION' setting. This is so for backward compatibility reasons, but it will change in a future version of Scrapy.

See the documentation of the 'REQUEST_FINGERPRINTER_IMPLEMENTATION' setting for information on how to handle this deprecation.
  return cls(crawler)

2024-04-25 01:07:39 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.selectreactor.SelectReactor
2024-04-25 01:07:39 [scrapy.extensions.telnet] INFO: Telnet Password: 7943be0962469a06
2024-04-25 01:07:39 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.memusage.MemoryUsage',
 'scrapy.extensions.logstats.LogStats',
 'spidermon.contrib.scrapy.extensions.Spidermon',
 'gazette.extensions.StatsPersist']
2024-04-25 01:07:39 [scrapy.crawler] INFO: Overridden settings:
{'BOT_NAME': 'gazette',
 'COMMANDS_MODULE': 'gazette.commands',
 'DOWNLOAD_TIMEOUT': 360,
 'FILES_STORE_S3_ACL': 'public-read',
 'LOG_FILE': 'log_ba_andorinha.txt',
 'NEWSPIDER_MODULE': 'gazette.spiders',
 'SPIDER_MODULES': ['gazette.spiders'],
 'TEMPLATES_DIR': 'templates',
 'USER_AGENT': 'Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:108.0) '
               'Gecko/20100101 Firefox/108.0'}
2024-04-25 01:07:39 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy_zyte_smartproxy.ZyteSmartProxyMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2024-04-25 01:07:39 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2024-04-25 01:07:39 [scrapy.middleware] INFO: Enabled item pipelines:
['gazette.pipelines.GazetteDateFilteringPipeline',
 'gazette.pipelines.DefaultValuesPipeline',
 'gazette.pipelines.QueridoDiarioFilesPipeline',
 'spidermon.contrib.scrapy.pipelines.ItemValidationPipeline',
 'gazette.pipelines.SQLDatabasePipeline']
2024-04-25 01:07:39 [scrapy.core.engine] INFO: Spider opened
2024-04-25 01:07:39 [gazette.database.models] INFO: Populating 'querido_diario_spider' table - Please wait!
2024-04-25 01:07:39 [gazette.database.models] INFO: Populating 'querido_diario_spider' table - Done!
2024-04-25 01:07:39 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2024-04-25 01:07:39 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2024-04-25 01:07:39 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying (failed 1 times): DNS lookup failed: no results for hostname lookup: adorinha.ba.gov.br.
2024-04-25 01:07:39 [scrapy.downloadermiddlewares.retry] DEBUG: Retrying (failed 2 times): DNS lookup failed: no results for hostname lookup: adorinha.ba.gov.br.
2024-04-25 01:07:39 [scrapy.downloadermiddlewares.retry] ERROR: Gave up retrying (failed 3 times): DNS lookup failed: no results for hostname lookup: adorinha.ba.gov.br.
2024-04-25 01:07:39 [scrapy.core.scraper] ERROR: Error downloading
Traceback (most recent call last):
  File "/Users/csamp/Documents/querido-diario/.venv/lib/python3.9/site-packages/twisted/internet/defer.py", line 1693, in _inlineCallbacks
    result = context.run(
  File "/Users/csamp/Documents/querido-diario/.venv/lib/python3.9/site-packages/twisted/python/failure.py", line 518, in throwExceptionIntoGenerator
    return g.throw(self.type, self.value, self.tb)
  File "/Users/csamp/Documents/querido-diario/.venv/lib/python3.9/site-packages/scrapy/core/downloader/middleware.py", line 54, in process_request
    return (yield download_func(request=request, spider=spider))
  File "/Users/csamp/Documents/querido-diario/.venv/lib/python3.9/site-packages/twisted/internet/defer.py", line 892, in _runCallbacks
    current.result = callback(  # type: ignore[misc]
  File "/Users/csamp/Documents/querido-diario/.venv/lib/python3.9/site-packages/twisted/internet/endpoints.py", line 1022, in startConnectionAttempts
    raise error.DNSLookupError(
twisted.internet.error.DNSLookupError: DNS lookup failed: no results for hostname lookup: adorinha.ba.gov.br.
2024-04-25 01:07:39 [scrapy.core.engine] INFO: Closing spider (finished)
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] ------------------------------ MONITORS ------------------------------
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] Comparison Between Executions/Days without gazettes... FAIL
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] Requests/Items Ratio/Ratio of requests over items scraped count... OK
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] Error Count Monitor/test_stat_monitor... FAIL
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] Finish Reason Monitor/Should have the expected finished reason(s)... OK
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] Item Validation Monitor/test_stat_monitor... SKIPPED (Unable to find 'spidermon/validation/fields/errors' in job stats.)
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] ----------------------------------------------------------------------
2024-04-25 01:07:39 [ba_andorinha] ERROR: [Spidermon] ======================================================================
FAIL: Comparison Between Executions/Days without gazettes
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/csamp/Documents/querido-diario/data_collection/gazette/monitors.py", line 69, in test_days_without_gazettes
    self.assertNotEqual(
AssertionError: 0 == 0 : No gazettes scraped in the last 7 days.

2024-04-25 01:07:39 [ba_andorinha] ERROR: [Spidermon] ======================================================================
FAIL: Error Count Monitor/test_stat_monitor
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/csamp/Documents/querido-diario/.venv/lib/python3.9/site-packages/spidermon/contrib/scrapy/monitors/base.py", line 225, in test_stat_monitor
    assertion_method(
AssertionError: Expecting 'log_count/ERROR' to be '<=' to '0.0'. Current value: '2'

2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] 5 monitors in 0.015s
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] FAILED (failures=2, skipped=1)
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] -------------------------- FINISHED ACTIONS --------------------------
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] ----------------------------------------------------------------------
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] 0 actions in 0.000s
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] OK
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] --------------------------- PASSED ACTIONS ---------------------------
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] ----------------------------------------------------------------------
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] 0 actions in 0.000s
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] OK
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] --------------------------- FAILED ACTIONS ---------------------------
2024-04-25 01:07:39 [spidermon.contrib.actions.discord] INFO: *ba_andorinha* finished
- Finish time: *2024-04-25 04:07:39.678506+00:00*
- Gazettes scraped: *0*
- 🔥 2 failures 🔥
===== FAILURES =====
Comparison Between Executions/Days without gazettes:
0 == 0 : No gazettes scraped in the last 7 days.
Error Count Monitor/test_stat_monitor:
Expecting 'log_count/ERROR' to be '<=' to '0.0'. Current value: '2'
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] CustomSendDiscordMessage... OK
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] ----------------------------------------------------------------------
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] 1 action in 0.000s
2024-04-25 01:07:39 [ba_andorinha] INFO: [Spidermon] OK
2024-04-25 01:07:39 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/exception_count': 3,
 'downloader/exception_type_count/twisted.internet.error.DNSLookupError': 3,
 'downloader/request_bytes': 840,
 'downloader/request_count': 3,
 'downloader/request_method_count/GET': 3,
 'elapsed_time_seconds': 0.222477,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2024, 4, 25, 4, 7, 39, 678506, tzinfo=datetime.timezone.utc),
 'log_count/DEBUG': 3,
 'log_count/ERROR': 4,
 'log_count/INFO': 35,
 'log_count/WARNING': 1,
 'memusage/max': 127369216,
 'memusage/startup': 127369216,
 'retry/count': 2,
 'retry/max_reached': 1,
 'retry/reason_count/twisted.internet.error.DNSLookupError': 2,
 'scheduler/dequeued': 3,
 'scheduler/dequeued/memory': 3,
 'scheduler/enqueued': 3,
 'scheduler/enqueued/memory': 3,
 'spidermon/validation/validators': 1,
 'spidermon/validation/validators/item/jsonschema': True,
 'start_time': datetime.datetime(2024, 4, 25, 4, 7, 39, 456029, tzinfo=datetime.timezone.utc)}
2024-04-25 01:07:39 [scrapy.core.engine] INFO: Spider closed (finished)
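
The root cause is visible in the DNS errors above: every request targeted adorinha.ba.gov.br, yet the spider is ba_andorinha (Andorinha/BA), so the hostname in the spider's URLs appears to be missing an "n". The alternative reading is that the municipal site is simply offline. A minimal diagnostic sketch to tell the two apart; the "andorinha" variants below are hypothetical guesses at the intended host, not confirmed by anything in the log:

    import socket

    candidates = [
        "adorinha.ba.gov.br",       # hostname the spider actually requested (from the log)
        "andorinha.ba.gov.br",      # hypothetical fix: spider name suggests a missing "n"
        "www.andorinha.ba.gov.br",  # hypothetical www-prefixed variant
    ]

    for host in candidates:
        try:
            # gethostbyname performs the same A-record lookup the crawl failed on
            print(f"{host} -> {socket.gethostbyname(host)}")
        except socket.gaierror as exc:
            print(f"{host} -> DNS lookup failed: {exc}")

If one of the corrected hostnames resolves, the fix is likely a one-character change to the spider's start URL; if none resolve, the site is down and the two Spidermon failures above ("No gazettes scraped in the last 7 days" and log_count/ERROR > 0) are the alerting working as intended.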