Debian supports testing installed packages by running a test suite against them. Doing that for Scrapy is currently impossible, because it contains explicit protection against this: `scrapy.utils.test.get_pythonpath()` ensures that `PYTHONPATH` for external scripts (including the mockserver) points to the same instance of Scrapy. So:

- if you run pytest in the Scrapy repo, tests will import `scrapy.utils` from the current dir and run `python3 -m tests.mockserver` with `PYTHONPATH` set to the current dir, testing the local code;
- if you also install Scrapy, nothing will change;
- if you remove the scrapy dir, which is what is usually done to test the installed code, tests will import `scrapy.utils` from the system directory, set `PYTHONPATH` to the system directory, and `python3 -m tests.mockserver` will fail because the `tests` package is not installed.

I know some Python modules also install the `tests` package, but I don't think this is a good idea. Maybe `get_pythonpath()` should learn how to find the instance we want, though I'm not sure how to do that.
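For context, the protection boils down to something like the following. This is a minimal sketch of the logic described above, not the exact Scrapy source:

```python
import os

import scrapy


def get_pythonpath():
    # Prepend the directory that contains the imported `scrapy` package,
    # so subprocesses resolve the same Scrapy instance as the test run.
    scrapy_dir = os.path.dirname(os.path.dirname(os.path.abspath(scrapy.__file__)))
    return scrapy_dir + os.pathsep + os.environ.get("PYTHONPATH", "")
```

In the installed-package scenario this returns the site-packages directory, which does not contain the `tests` package, hence the failure above.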
1. tests/mockserver.py - this is the main use; it's used to run `python3 -m tests.mockserver` via `Popen` so that the interpreter finds the module (see the sketch after this list).
2. tests/test_crawler.py - used to run scripts from tests/Crawler* via `Popen`. Not actually needed as far as I can see, except for tests/CrawlerRunner/ip_address.py, which imports `tests.mockserver`.
3. tests/test_commands.py - used to run `python3 -m scrapy.cmdline` via `Popen`. Needed to find the correct `scrapy` module and is fine to keep (and also suggests that we can't drop the functions).
4. tests/test_proxy_connect.py - used to run `python3 -c <inline script>` via `Popen`. I don't think it's needed, as the script just imports `mitmproxy` and doesn't need the Scrapy dir for this.
5. tests/test_cmdline/__init__.py - used to run `python3 -m scrapy.cmdline` via `Popen`, see 3.
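To illustrate why the mockserver case breaks, here is a hedged sketch of how such a subprocess launch looks; the exact invocation in tests/mockserver.py may differ:

```python
import os
import sys
from subprocess import Popen

from scrapy.utils.test import get_pythonpath

env = os.environ.copy()
env["PYTHONPATH"] = get_pythonpath()  # dir containing the `scrapy` package
# With the repo checkout removed, PYTHONPATH points at site-packages,
# where the `tests` package is not installed, so the -m import fails:
proc = Popen([sys.executable, "-m", "tests.mockserver"], env=env)
```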