
Conversation

@elacuesta (Member) commented on Jan 21, 2023

Closes #155

Log records emitted after the spider is opened will have a spider attribute.

Pattern taken from https://docs.python.org/3/library/logging.html#logrecord-objects.
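For reference, the pattern from those docs looks roughly like the sketch below. The attribute name (spider) follows the PR description; the install_spider_record_factory helper and the way the spider instance reaches it are illustrative assumptions, not the actual handler.py code.

    import logging

    def install_spider_record_factory(spider):
        # Sketch of the log record factory pattern from the Python logging docs:
        # wrap the current factory so every record created afterwards carries
        # a "spider" attribute.
        old_factory = logging.getLogRecordFactory()

        def record_factory(*args, **kwargs):
            record = old_factory(*args, **kwargs)
            record.spider = spider  # the currently open spider
            return record

        logging.setLogRecordFactory(record_factory)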

codecov bot commented on Jan 21, 2023

Codecov Report

Merging #156 (f294c5b) into main (11a78a4) will not change coverage.
The diff coverage is 100.00%.

@@            Coverage Diff            @@
##              main      #156   +/-   ##
=========================================
  Coverage   100.00%   100.00%           
=========================================
  Files            4         4           
  Lines          349       359   +10     
=========================================
+ Hits           349       359   +10     
Impacted Files                  Coverage Δ
scrapy_playwright/handler.py    100.00% <100.00%> (ø)


@elacuesta merged commit 75fe598 into main on Jan 24, 2023
@elacuesta deleted the logging-record-spider-attribute branch on January 24, 2023 at 13:18
@michaelbrunnbauer commented

This change apparently broke my setup with scrapyrt - you might want to reconsider:


            record = old_factory(name, *args, **kwargs)
          File "/home/scrapy/scrapy/lib/python3.8/site-packages/scrapy_playwright/handler.py", line 118, in record_factory
            record = old_factory(name, *args, **kwargs)
          File "/home/scrapy/scrapy/lib/python3.8/site-packages/scrapy_playwright/handler.py", line 118, in record_factory
            record = old_factory(name, *args, **kwargs)
          File "/home/scrapy/scrapy/lib/python3.8/site-packages/scrapy_playwright/handler.py", line 118, in record_factory
            record = old_factory(name, *args, **kwargs)
          File "/usr/lib/python3.8/logging/__init__.py", line 314, in __init__
            if (args and len(args) == 1 and isinstance(args[0], collections.abc.Mapping)
          File "/usr/lib/python3.8/abc.py", line 98, in __instancecheck__
            return _abc_instancecheck(cls, instance)
        builtins.RecursionError: maximum recursion depth exceeded in comparison
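The repeated record_factory frames point to the wrapped factories piling up. A plausible explanation, assumed here rather than confirmed in this thread, is that the factory is installed again every time a handler is created, so each new factory wraps the previous one; in a long-lived process such as a scrapyrt service the chain eventually exceeds the recursion limit. A minimal sketch of that failure mode:

    import logging

    # Simulate installing the wrapper once per handler in a long-running process.
    for _ in range(5000):
        previous = logging.getLogRecordFactory()

        def record_factory(*args, _previous=previous, **kwargs):
            record = _previous(*args, **kwargs)
            record.spider = None  # placeholder for the open spider
            return record

        logging.setLogRecordFactory(record_factory)

    # Creating a single record now requires ~5000 nested factory calls,
    # which blows past the default recursion limit of 1000.
    logging.getLogger(__name__).warning("boom")  # RecursionError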

@elacuesta (Member, Author) commented

Interesting, please open a new issue with a minimal, reproducible example.

@michaelbrunnbauer commented

Unfortunately, I do not have the time to do this. My knowledge of the complex ecosystem around Scrapy is superficial, and the problem does not occur immediately but only after my scrapyrt service has been running for a while. Once the failure condition occurs, though, every request causes the exception.

@elacuesta (Member, Author) commented

@michaelbrunnbauer Should be fixed in v0.0.26.
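One common way to prevent the re-wrapping, shown here only as a hedged sketch and not necessarily what v0.0.26 actually does, is to tag the installed factory and skip installation when a tagged factory is already in place:

    import logging

    def install_record_factory_once(spider):
        current = logging.getLogRecordFactory()
        if getattr(current, "_spider_record_factory", False):
            return  # already wrapped in this process; do not wrap again

        def record_factory(*args, **kwargs):
            record = current(*args, **kwargs)
            record.spider = spider
            return record

        record_factory._spider_record_factory = True
        logging.setLogRecordFactory(record_factory)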

Linked issue: #155 (Add the spider's attribute to scrapy-playwright log records)