Installing Scrapy on macOS #120

Open
Qingquan-Li opened this issue Apr 20, 2019 · 0 comments
Development environment:

  • macOS Sierra, version 10.12.6
  • Python 3.5

This post installs Scrapy the plain way, with Python and pip. That route requires installing a long list of dependency packages, and in the end Scrapy could not be installed and used successfully because of dependency problems. (Untangling the dependencies will make you question life, and question Scrapy; Anaconda is recommended instead.)

"We recommend installing Anaconda or Miniconda and using packages from the conda-forge channel, which will avoid most installation issues." — Scrapy official documentation

conda install --channel conda-forge scrapy


1. Create a virtual environment and upgrade pip

Creating a fresh virtual environment is an effective way to avoid conflicts with Python packages already installed on the system.

# Create the virtual environment
➜  /Users/fatli/python/test02 > python3.5 -m venv myvenv
# Activate the virtual environment
➜  /Users/fatli/python/test02 > source myvenv/bin/activate
(myvenv) ➜  /Users/fatli/python/test02 > pip3 -V
pip 8.1.1 from /Users/fatli/python/test02/myvenv/lib/python3.5/site-packages (python 3.5)
# Upgrade pip
(myvenv) ➜  /Users/fatli/python/test02 > curl https://bootstrap.pypa.io/get-pip.py | python3.5
# ...output omitted...
      Successfully uninstalled pip-8.1.1
Successfully installed pip-19.0.3 wheel-0.33.1
(myvenv) ➜  /Users/fatli/python/test02 > pip3 -V
pip 19.0.3 from /Users/fatli/python/test02/myvenv/lib/python3.5/site-packages/pip (python 3.5)

pip list at this point:

(myvenv) ➜  /Users/fatli/python/test02 > pip3 list
Package    Version
---------- -------
pip        19.0.3
setuptools 20.10.1
wheel      0.33.1
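One quick sanity check, not part of the original session, is to confirm the interpreter really is the venv's: inside an activated venv, `sys.prefix` differs from the base interpreter's prefix.

```python
# Sketch: detect whether this interpreter is running inside a virtual environment.
# In a venv, sys.prefix points at the venv directory, while sys.base_prefix
# (sys.real_prefix under old virtualenv) still points at the base interpreter.
import sys

in_venv = sys.prefix != getattr(sys, "base_prefix", sys.prefix)
print(sys.prefix)
print(in_venv)
```

If this prints `False` after `source myvenv/bin/activate`, the shell you activated and the `python3` you are running are not the same interpreter.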


2. Install Scrapy

Method 1: install Scrapy directly with pip (failed):

(myvenv) ➜  /Users/fatli/python/test02 > pip3 install scrapy
# ...output omitted...
Command "python setup.py egg_info" failed with error code 1 in /private/var/folders/bf/cq8wzfms3glgschngtqys5f80000gn/T/pip-install-nqgmdnyg/Twisted/
# Inference: Twisted 👆 probably needs to be installed first

# pip list at this point:
(myvenv) ➜  /Users/fatli/python/test02 > pip3 list
Package    Version
---------- -------
pip        19.0.3
setuptools 20.10.1
wheel      0.33.1

Method 2: download Scrapy and install from source (succeeded):

Since installing Scrapy directly with pip failed, install from a downloaded archive instead: download and unpack it, then run setup.py from the package's top-level directory:
scrapy-master.zip download: https://scrapy.org/download

(myvenv) ➜  /Users/fatli/python/test02/scrapy-master > python3 setup.py install
# ...output omitted...
creating dist
creating 'dist/Scrapy-1.6.0-py3.5.egg' and adding 'build/bdist.macosx-10.6-intel/egg' to it
removing 'build/bdist.macosx-10.6-intel/egg' (and everything under it)
Processing Scrapy-1.6.0-py3.5.egg
creating /Users/fatli/python/test02/myvenv/lib/python3.5/site-packages/Scrapy-1.6.0-py3.5.egg
Extracting Scrapy-1.6.0-py3.5.egg to /Users/fatli/python/test02/myvenv/lib/python3.5/site-packages
Adding Scrapy 1.6.0 to easy-install.pth file
Installing scrapy script to /Users/fatli/python/test02/myvenv/bin

Installed /Users/fatli/python/test02/myvenv/lib/python3.5/site-packages/Scrapy-1.6.0-py3.5.egg
Processing dependencies for Scrapy==1.6.0
Searching for service_identity
Reading https://pypi.python.org/simple/service_identity/
Download error on https://pypi.python.org/simple/service_identity/: [SSL: TLSV1_ALERT_PROTOCOL_VERSION] tlsv1 alert protocol version (_ssl.c:645) -- Some packages may not be found!
Reading https://pypi.python.org/simple/service-identity/
Download error on https://pypi.python.org/simple/service-identity/: [SSL: TLSV1_ALERT_PROTOCOL_VERSION] tlsv1 alert protocol version (_ssl.c:645) -- Some packages may not be found!
Couldn't find index page for 'service_identity' (maybe misspelled?)
Scanning index of all packages (this may take a while)
Reading https://pypi.python.org/simple/
Download error on https://pypi.python.org/simple/: [SSL: TLSV1_ALERT_PROTOCOL_VERSION] tlsv1 alert protocol version (_ssl.c:645) -- Some packages may not be found!
No local packages or download links found for service_identity
error: Could not find suitable distribution for Requirement.parse('service_identity')
# Scrapy itself is now installed, but some dependency packages are missing (for example the "service_identity" package shown in the output above), and running scrapy will fail because of them.

# pip list at this point:
(myvenv) ➜  /Users/fatli/python/test02/scrapy-master > pip3 list
Package    Version
---------- -------
pip        19.0.3
Scrapy     1.6.0
setuptools 20.10.1
wheel      0.33.1
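A note on the repeated "[SSL: TLSV1_ALERT_PROTOCOL_VERSION]" errors above: the server is rejecting the client's TLS version. PyPI has required TLS 1.2 since 2018, TLS 1.2 needs OpenSSL 1.0.1 or newer on the client side, and the system OpenSSL on older macOS releases is 0.9.8. A small diagnostic, not part of the original session, shows what your interpreter links against:

```python
# Sketch: report the OpenSSL this Python's ssl module was built against.
# TLS 1.2 (required by PyPI) is only available with OpenSSL >= 1.0.1.
import ssl

print(ssl.OPENSSL_VERSION)
supports_tls12 = ssl.OPENSSL_VERSION_INFO >= (1, 0, 1)
print(supports_tls12)
```

If this prints `False`, downloads from PyPI made through this interpreter (such as the `easy_install` step inside `setup.py install`) will likely keep failing with the same alert, no matter which package is requested.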


3. Install Scrapy's dependency packages

(myvenv) ➜  /Users/fatli/python/test02/scrapy-master > pip3 install service_identity
# ...output omitted...
scrapy 1.6.0 requires cssselect>=0.9, which is not installed.
scrapy 1.6.0 requires lxml, which is not installed.
scrapy 1.6.0 requires parsel>=1.5, which is not installed.
scrapy 1.6.0 requires PyDispatcher>=2.0.5, which is not installed.
scrapy 1.6.0 requires pyOpenSSL, which is not installed.
scrapy 1.6.0 requires queuelib, which is not installed.
scrapy 1.6.0 requires Twisted>=13.1.0, which is not installed.
scrapy 1.6.0 requires w3lib>=1.17.0, which is not installed.
Installing collected packages: attrs, pyasn1, pyasn1-modules, six, asn1crypto, pycparser, cffi, cryptography, service-identity
Successfully installed asn1crypto-0.24.0 attrs-19.1.0 cffi-1.12.2 cryptography-2.6.1 pyasn1-0.4.5 pyasn1-modules-0.2.4 pycparser-2.19 service-identity-18.1.0 six-1.12.0

# pip list at this point:
(myvenv) ➜  /Users/fatli/python/test02/scrapy-master > pip3 list
Package          Version
---------------- -------
asn1crypto       0.24.0
attrs            19.1.0
cffi             1.12.2
cryptography     2.6.1
pip              19.0.3
pyasn1           0.4.5
pyasn1-modules   0.2.4
pycparser        2.19
Scrapy           1.6.0
service-identity 18.1.0
setuptools       20.10.1
six              1.12.0
wheel            0.33.1

# Following the hints in the output above, install lxml and the other dependency packages directly with pip, all except Twisted.
(myvenv) ➜  /Users/fatli/python/test02/scrapy-master > pip3 install Twisted
# ...output omitted...
Command "python setup.py egg_info" failed with error code 1 in /private/var/folders/bf/cq8wzfms3glgschngtqys5f80000gn/T/pip-install-__uqc53m/Twisted/

# pip failed to install Twisted. Workaround: download and install from source:
# Download the Twisted source tarball: https://twistedmatrix.com/trac/wiki/Downloads#SourceTarball
# Unpack it, cd into the package directory, and install with $ python3 setup.py install:
(myvenv) ➜  /Users/fatli/python/test02/scrapy-master > cd ../Twisted-19.2.0
(myvenv) ➜  /Users/fatli/python/test02/Twisted-19.2.0 > python3 setup.py install
# ...output omitted...
distutils.errors.DistutilsError: Could not find suitable distribution for Requirement.parse('incremental>=16.10.1')
# Following the output, pip-install Twisted's own dependencies, such as incremental.

# Back in the Twisted-19.2.0 directory, run python3 setup.py install again; this time it succeeds
(myvenv) ➜  /Users/fatli/python/test02/Twisted-19.2.0 > python3 setup.py install
# ...output omitted...
Using /Users/fatli/python/test02/myvenv/lib/python3.5/site-packages
Finished processing dependencies for Twisted==19.2.0
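To confirm the source-installed Twisted is actually visible to the venv's interpreter, a quick check (a sketch, not from the original session):

```python
# Sketch: check that the Twisted package can be found on this interpreter's path.
import importlib.util

twisted_found = importlib.util.find_spec("twisted") is not None
print("Twisted importable:", twisted_found)
```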

pip list at this point:

(myvenv) ➜  /Users/fatli/python/test02/Twisted-19.2.0 > pip3 list
Package          Version
---------------- -------
asn1crypto       0.24.0
attrs            19.1.0
Automat          0.7.0
cffi             1.12.2
constantly       15.1.0
cryptography     2.6.1
cssselect        1.0.3
hyperlink        19.0.0
idna             2.8
incremental      17.5.0
lxml             4.3.3
parsel           1.5.1
pip              19.0.3
pyasn1           0.4.5
pyasn1-modules   0.2.4
pycparser        2.19
PyDispatcher     2.0.5
PyHamcrest       1.9.0
pyOpenSSL        19.0.0
queuelib         1.5.0
Scrapy           1.6.0
service-identity 18.1.0
setuptools       20.10.1
six              1.12.0
Twisted          19.2.0
w3lib            1.20.0
wheel            0.33.1
zope.interface   4.6.0


4. Test Scrapy

# According to the pip list output above, Scrapy is installed. Test whether scrapy actually runs. Error: missing dependency package service_identity
(myvenv) ➜  /Users/fatli/python/test02/Twisted-19.2.0 > scrapy version
# ...output omitted...
pkg_resources.DistributionNotFound: The 'service_identity' distribution was not found and is required by Scrapy

# Same error again, missing dependency package service_identity:
(myvenv) ➜  /Users/fatli/python/test02/Twisted-19.2.0 > cd ../scrapy-master
(myvenv) ➜  /Users/fatli/python/test02/scrapy-master > python3 setup.py install
# ...output omitted...
pkg_resources.DistributionNotFound: The 'service_identity' distribution was not found and is required by Scrapy

# Note: service_identity was already installed above in "Install Scrapy's dependency packages". Judging by the "Download error ... SSL" output from the source install of Scrapy above, this is probably an SSL problem.
# No solution found so far.
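One way to narrow this down (a sketch, not from the original issue): compare what the import system and pkg_resources each see. pip installed the distribution as service-identity, while Scrapy's egg metadata asks pkg_resources to resolve service_identity; if the module imports but the distribution does not resolve, the problem lives in the egg's dependency metadata rather than in the package itself.

```python
# Sketch: compare the import system's view with pkg_resources' view of
# service_identity; a mismatch points at egg/dist metadata, not the package.
import importlib.util

module_found = importlib.util.find_spec("service_identity") is not None
print("module importable:", module_found)

try:
    import pkg_resources  # ships with setuptools
    pkg_resources.get_distribution("service_identity")
    dist_found = True
except Exception:  # DistributionNotFound, or pkg_resources missing entirely
    dist_found = False
print("distribution resolvable:", dist_found)
```

If the two answers disagree, a plausible next step would be to drop the egg-based install and reinstall Scrapy with pip, now that all of its dependencies are present in the venv.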