Upgrade PyPy for CI, and test both 3.5 (oldest) and 3.6 (newest) #4504

Merged
merged 28 commits on Jul 16, 2020
28 commits
f361899
Upgrade PyPy for CI, and test both 3.5 (oldest) and 3.6 (newest)
Gallaecio Apr 16, 2020
de1ffb8
Merge branch 'master' into pypy
Gallaecio May 6, 2020
6ef0ed9
Merge branch 'master' into pypy
Gallaecio May 7, 2020
63091fb
Merge branch 'master' into pypy
Gallaecio Jun 2, 2020
a3eb069
Log a detailed error message to discover why MockServer is not working
Gallaecio Jun 3, 2020
477b9e2
Go for all lines!
Gallaecio Jun 17, 2020
3bf84af
Disable tests based on mitmproxy while running on PyPy
Gallaecio Jun 17, 2020
35d23a9
Fix test_get_func_args for PyPy 3.6+
Gallaecio Jun 30, 2020
1bf4e26
Make testPayloadDefaultCiphers work regardless of OpenSSL default cip…
Gallaecio Jun 30, 2020
63939cc
Merge remote-tracking branch 'upstream/master' into pypy
Gallaecio Jun 30, 2020
a613667
Crossing fingers…
Gallaecio Jun 30, 2020
f95f924
Rename: testPayloadDefaultCiphers → testPayloadDisabledCipher
Gallaecio Jun 30, 2020
99f0271
Test the PyPy version currently documented as the minimum required ve…
Gallaecio Jul 1, 2020
47f6e0e
Fix the PYPY_VERSION tag
Gallaecio Jul 1, 2020
1dff643
Update the documentation about supported PyPy versions
Gallaecio Jul 1, 2020
e33d459
Also test the latest 3.5 Python version with PyPy
Gallaecio Jul 1, 2020
b671086
Fix the PYPY_VERSION value for the latest 3.5 version
Gallaecio Jul 1, 2020
6fe6aec
Use pinned dependencies for asyncio and PyPy tests against oldest sup…
Gallaecio Jul 1, 2020
6b648e0
Fix PyPy installation for the pypy3-pinned Tox environment
Gallaecio Jul 1, 2020
edf08a2
Try installing Cython
Gallaecio Jul 1, 2020
75dc93d
Maybe PyPy requires lxml 3.6.0?
Gallaecio Jul 1, 2020
7d3dc04
install.rst: minor clarification
Gallaecio Jul 1, 2020
9279af9
lxml 4.0.0 is required on PyPy
Gallaecio Jul 1, 2020
017ec33
Require setuptools 18.5+
Gallaecio Jul 1, 2020
2a47ac7
Revert "Require setuptools 18.5+"
Gallaecio Jul 1, 2020
a221144
Maintain lxml as a dependency if setuptools < 18.5 is used
Gallaecio Jul 1, 2020
04981de
Merge branch 'master' into pypy
Gallaecio Jul 16, 2020
38ae506
Merge branch 'master' into pypy
Gallaecio Jul 16, 2020
16 changes: 11 additions & 5 deletions .travis.yml
@@ -18,19 +18,25 @@ matrix:
- env: TOXENV=typing
python: 3.8

- env: TOXENV=pypy3
- env: TOXENV=pinned
python: 3.5.2
- env: TOXENV=asyncio
- env: TOXENV=asyncio-pinned
python: 3.5.2 # We use additional code to support 3.5.3 and earlier
- env: TOXENV=pypy3-pinned PYPY_VERSION=3-v5.9.0

- env: TOXENV=py
python: 3.5
- env: TOXENV=asyncio
python: 3.5 # We use specific code to support >= 3.5.4, < 3.6
- env: TOXENV=pypy3 PYPY_VERSION=3.5-v7.0.0

- env: TOXENV=py
python: 3.6
- env: TOXENV=pypy3 PYPY_VERSION=3.6-v7.3.1

- env: TOXENV=py
python: 3.7

- env: TOXENV=py PYPI_RELEASE_JOB=true
python: 3.8
dist: bionic
@@ -42,9 +48,9 @@ matrix:
dist: bionic
install:
- |
if [ "$TOXENV" = "pypy3" ]; then
export PYPY_VERSION="pypy3.5-5.9-beta-linux_x86_64-portable"
Member

Do you think the "Installing Scrapy" section should be updated with the new minimal PyPy version?

Member Author

Makes sense.

Once we fix the remaining issue, though, we should probably try and find the lowest version where tests pass.

Member

I'm not sure we should care too much about supporting old PyPy versions; running Scrapy on PyPy is something for advanced users, and they can install a recent version anyway to get the best performance and compatibility. It should be fine to pick a version that is fresh enough and doesn't cause us problems. Of course, knowing a specific minimum PyPy version is better, but I wouldn't put too much effort into figuring it out: if it's easy, let's do it; otherwise, don't bother.

Member Author

PyPy 5.9 continues to work.

I’ve also moved the FAQ entry about supported Python versions to install.rst, to avoid duplication. The previous information was accurate, though, so if you have any second thoughts about the documentation changes, we can revert that commit and leave the documentation as is.

wget "https://bitbucket.org/squeaky/portable-pypy/downloads/${PYPY_VERSION}.tar.bz2"
if [[ ! -z "$PYPY_VERSION" ]]; then
export PYPY_VERSION="pypy$PYPY_VERSION-linux64"
wget "https://bitbucket.org/pypy/pypy/downloads/${PYPY_VERSION}.tar.bz2"
tar -jxf ${PYPY_VERSION}.tar.bz2
virtualenv --python="$PYPY_VERSION/bin/pypy3" "$HOME/virtualenvs/$PYPY_VERSION"
source "$HOME/virtualenvs/$PYPY_VERSION/bin/activate"
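For reference, here is a rough sketch (not part of the PR) of how a matrix entry such as "- env: TOXENV=pypy3 PYPY_VERSION=3.6-v7.3.1" expands into the archive that the install step above downloads; the names simply mirror the shell logic in the diff:

    # Sketch: how the install step builds the PyPy download name and URL
    # from the PYPY_VERSION tag defined in the Travis build matrix.
    pypy_version = "3.6-v7.3.1"  # from: - env: TOXENV=pypy3 PYPY_VERSION=3.6-v7.3.1
    name = f"pypy{pypy_version}-linux64"
    url = f"https://bitbucket.org/pypy/pypy/downloads/{name}.tar.bz2"
    print(name)  # pypy3.6-v7.3.1-linux64
    print(url)   # https://bitbucket.org/pypy/pypy/downloads/pypy3.6-v7.3.1-linux64.tar.bz2
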
14 changes: 0 additions & 14 deletions docs/faq.rst
@@ -64,20 +64,6 @@ Here's an example spider using BeautifulSoup API, with ``lxml`` as the HTML parser

.. _BeautifulSoup's official documentation: https://www.crummy.com/software/BeautifulSoup/bs4/doc/#specifying-the-parser-to-use

.. _faq-python-versions:

What Python versions does Scrapy support?
-----------------------------------------

Scrapy is supported under Python 3.5.2+
under CPython (default Python implementation) and PyPy (starting with PyPy 5.9).
Python 3 support was added in Scrapy 1.1.
PyPy support was added in Scrapy 1.4, PyPy3 support was added in Scrapy 1.5.
Python 2 support was dropped in Scrapy 2.0.

.. note::
For Python 3 support on Windows, it is recommended to use
Anaconda/Miniconda as :ref:`outlined in the installation guide <intro-install-windows>`.

Did Scrapy "steal" X from Django?
---------------------------------
12 changes: 9 additions & 3 deletions docs/intro/install.rst
@@ -4,12 +4,18 @@
Installation guide
==================

.. _faq-python-versions:

Supported Python versions
=========================

Scrapy requires Python 3.5.2+, either the CPython implementation (default) or
the PyPy 5.9+ implementation (see :ref:`python:implementations`).


Installing Scrapy
=================

Scrapy runs on Python 3.5.2 or above under CPython (default Python
implementation) and PyPy (starting with PyPy 5.9).

If you're using `Anaconda`_ or `Miniconda`_, you can install the package from
the `conda-forge`_ channel, which has up-to-date packages for Linux, Windows
and macOS.
41 changes: 26 additions & 15 deletions setup.py
@@ -18,12 +18,37 @@ def has_environment_marker_platform_impl_support():
return parse_version(setuptools_version) >= parse_version('18.5')


install_requires = [
'Twisted>=17.9.0',
'cryptography>=2.0',
'cssselect>=0.9.1',
'parsel>=1.5.0',
'PyDispatcher>=2.0.5',
'pyOpenSSL>=16.2.0',
'queuelib>=1.4.2',
'service_identity>=16.0.0',
'w3lib>=1.17.0',
'zope.interface>=4.1.3',
'protego>=0.1.15',
'itemadapter>=0.1.0',
]
extras_require = {}

if has_environment_marker_platform_impl_support():
extras_require[':platform_python_implementation == "CPython"'] = [
'lxml>=3.5.0',
]
extras_require[':platform_python_implementation == "PyPy"'] = [
# Earlier lxml versions are affected by
# https://bitbucket.org/pypy/pypy/issues/2498/cython-on-pypy-3-dict-object-has-no,
# which was fixed in Cython 0.26, released on 2017-06-19, and used to
# generate the C headers of lxml release tarballs published since then, the
# first of which was:
'lxml>=4.0.0',
Member Author

It is technically possible for someone to install an older lxml version if they re-generate the C headers with Cython 0.26+. If they are willing to do that, however, they can also change this version number locally for a Scrapy version that supports it.

'PyPyDispatcher>=2.1.0',
]
else:
install_requires.append('lxml>=3.5.0')


setup(
@@ -67,20 +92,6 @@ def has_environment_marker_platform_impl_support():
'Topic :: Software Development :: Libraries :: Python Modules',
],
python_requires='>=3.5.2',
install_requires=[
'Twisted>=17.9.0',
'cryptography>=2.0',
'cssselect>=0.9.1',
'lxml>=3.5.0',
'parsel>=1.5.0',
'PyDispatcher>=2.0.5',
'pyOpenSSL>=16.2.0',
'queuelib>=1.4.2',
'service_identity>=16.0.0',
'w3lib>=1.17.0',
'zope.interface>=4.1.3',
'protego>=0.1.15',
'itemadapter>=0.1.0',
],
install_requires=install_requires,
extras_require=extras_require,
)
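As an aside, the extras_require keys added above are standard PEP 508 environment markers. A minimal sketch of checking which branch applies on the current interpreter (this assumes the third-party packaging library is installed; it is not something the PR itself requires):

    # Sketch: evaluate the same marker that setup.py uses as an extras_require
    # key, to see which lxml requirement pip would select on this interpreter.
    from packaging.markers import Marker

    if Marker('platform_python_implementation == "PyPy"').evaluate():
        print('PyPy: lxml>=4.0.0 and PyPyDispatcher>=2.1.0 apply')
    else:
        print('CPython: lxml>=3.5.0 applies')
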
2 changes: 2 additions & 0 deletions tests/test_proxy_connect.py
@@ -59,6 +59,8 @@ def _wrong_credentials(proxy_url):

@skipIf(sys.version_info < (3, 5, 4),
"requires mitmproxy < 3.0.0, which these tests do not support")
@skipIf("pypy" in sys.executable,
"mitmproxy does not support PyPy")
class ProxyConnectTestCase(TestCase):

def setUp(self):
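The new skip keys on the interpreter path: the Travis virtualenvs created above live under a directory named after the PyPy release, so sys.executable contains "pypy". A small sketch of what the condition checks (the platform-based variant is an alternative, not what the PR uses):

    import platform
    import sys

    # The skip condition used in the diff: on the CI the virtualenv's python
    # lives under a pypy*-linux64 path, so the executable path contains "pypy".
    print(sys.executable, "pypy" in sys.executable)

    # Implementation-based check, if path-based detection ever becomes fragile:
    print(platform.python_implementation() == "PyPy")
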
6 changes: 5 additions & 1 deletion tests/test_utils_python.py
@@ -4,6 +4,7 @@
import platform
import unittest
from itertools import count
from sys import version_info
from warnings import catch_warnings

from scrapy.utils.python import (
@@ -214,9 +215,12 @@ def __call__(self, a, b, c):
else:
self.assertEqual(
get_func_args(str.split, stripself=True), ['sep', 'maxsplit'])
self.assertEqual(get_func_args(" ".join, stripself=True), ['list'])
self.assertEqual(
get_func_args(operator.itemgetter(2), stripself=True), ['obj'])
if version_info < (3, 6):
self.assertEqual(get_func_args(" ".join, stripself=True), ['list'])
else:
self.assertEqual(get_func_args(" ".join, stripself=True), ['iterable'])

def test_without_none_values(self):
self.assertEqual(without_none_values([1, None, 3, 4]), [1, 3, 4])
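For context on the version gate: the expected argument name of str.join changed between the PyPy releases under test, and only PyPy exposes real argument names for built-ins to this helper. A quick, hedged way to see what the running interpreter reports (on CPython the same call should return an empty list, since the helper cannot introspect built-in methods there):

    import sys

    from scrapy.utils.python import get_func_args

    # PyPy 3.5 reported ['list'] for " ".join; PyPy 3.6+ reports ['iterable'].
    # CPython prints [] because built-in methods expose no argument names here.
    print(sys.version_info[:2], get_func_args(" ".join, stripself=True))
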
6 changes: 4 additions & 2 deletions tests/test_webclient.py
@@ -414,7 +414,9 @@ def testPayload(self):
self.getURL("payload"), body=s, contextFactory=client_context_factory
).addCallback(self.assertEqual, to_bytes(s))

def testPayloadDefaultCiphers(self):
def testPayloadDisabledCipher(self):
s = "0123456789" * 10
d = getPage(self.getURL("payload"), body=s, contextFactory=ScrapyClientContextFactory())
settings = Settings({'DOWNLOADER_CLIENT_TLS_CIPHERS': 'ECDHE-RSA-AES256-GCM-SHA384'})
client_context_factory = create_instance(ScrapyClientContextFactory, settings=settings, crawler=None)
d = getPage(self.getURL("payload"), body=s, contextFactory=client_context_factory)
return self.assertFailure(d, OpenSSL.SSL.Error)
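
The renamed test pins the client context factory to a single cipher via settings, so the handshake fails no matter which ciphers OpenSSL enables by default. Outside the test suite the same knob is an ordinary Scrapy setting; a minimal sketch, with the value copied from the test rather than offered as a recommendation:

    # settings.py sketch: restrict the ciphers Scrapy's TLS client offers.
    DOWNLOADER_CLIENT_TLS_CIPHERS = 'ECDHE-RSA-AES256-GCM-SHA384'
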
39 changes: 27 additions & 12 deletions tox.ini
@@ -58,19 +58,12 @@ deps =
commands =
pylint conftest.py docs extras scrapy setup.py tests

[testenv:pypy3]
basepython = pypy3
commands =
py.test {posargs:--durations=10 docs scrapy tests}

[testenv:pinned]
basepython = python3
[pinned-common]
deps =
-ctests/constraints.txt
cryptography==2.0
cssselect==0.9.1
itemadapter==0.1.0
lxml==3.5.0
parsel==1.5.0
Protego==0.1.15
PyDispatcher==2.0.5
@@ -85,12 +78,38 @@ deps =
botocore==1.3.23
Pillow==3.4.2

[testenv:pinned]
deps =
{[pinned-common]deps}
lxml==3.5.0

[testenv:extra-deps]
deps =
{[testenv]deps}
reppy
robotexclusionrulesparser

[testenv:asyncio]
commands =
{[testenv]commands} --reactor=asyncio

[testenv:asyncio-pinned]
commands = {[testenv:asyncio]commands}
deps = {[testenv:pinned]deps}

[testenv:pypy3]
basepython = pypy3
commands =
py.test {posargs:--durations=10 docs scrapy tests}

[testenv:pypy3-pinned]
basepython = {[testenv:pypy3]basepython}
commands = {[testenv:pypy3]commands}
deps =
{[pinned-common]deps}
lxml==4.0.0
PyPyDispatcher==2.1.0

[docs]
changedir = docs
deps =
@@ -122,7 +141,3 @@ deps = {[docs]deps}
setenv = {[docs]setenv}
commands =
sphinx-build -W -b linkcheck . {envtmpdir}/linkcheck

[testenv:asyncio]
commands =
{[testenv]commands} --reactor=asyncio