Scrapy + splash: no module named scrapy_splash #2492

Closed
tituskex opened this issue Jan 11, 2017 · 4 comments
Comments

@tituskex

Hi,

I'm trying to learn how to use Splash with Scrapy, following this tutorial: https://github.com/scrapy-plugins/scrapy-splash.

I've created a Scrapy project, which I've attached to this post. When I run $ scrapy crawl spider1, everything works fine. However, when I add:

    DOWNLOADER_MIDDLEWARES = {
        'scrapy_splash.SplashCookiesMiddleware': 723,
        'scrapy_splash.SplashMiddleware': 725,
        'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware': 810,
    }

I get a message saying: ModuleNotFoundError: No module named 'scrapy_splash'. I've checked whether I have scrapy_splash installed with:

User-MacBook-Air:tScraper username$ pip3 show scrapy_splash
Name: scrapy-splash
Version: 0.7.1
Summary: JavaScript support for Scrapy using Splash
Home-page: https://github.com/scrapy-plugins/scrapy-splash
Author: Mikhail Korobov
Author-email: kmike84@gmail.com
License: BSD
Location: /Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages
Requires:
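
For anyone debugging this, a quick sanity check is to compare the interpreter Scrapy actually runs under with the location pip3 installed into. This is just a sketch; it prints None when the package is not importable from the current interpreter:

```python
import importlib.util
import sys

# Which Python is actually running? Compare this path with the
# "Location" reported by `pip3 show scrapy_splash`.
print(sys.executable)

# Locate scrapy_splash without importing it; None means this
# interpreter cannot see the package.
spec = importlib.util.find_spec("scrapy_splash")
print("scrapy_splash:", spec.origin if spec else None)
```

If the printed interpreter path belongs to a different Python installation than the one pip3 reports, that mismatch would explain the ModuleNotFoundError.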

I've tried importing scrapy_splash in my spider script and in my settings file. If I do so, I get a message saying:

raise KeyError("Spider not found: {}".format(spider_name))
KeyError: 'Spider not found: spider1'

My script: tScraper.zip

Does anyone know how to fix this issue?

@redapple
Contributor

redapple commented Jan 16, 2017

I don't think KeyError: 'Spider not found: spider1' is related to scrapy-splash install.
"Spider not found" usually happens when the settings are not correct, for example when you run outside of a project.
If you run your script outside of a Scrapy project (i.e. if there is no scrapy.cfg at the same level), you need to tell Scrapy where to find spider classes with the SPIDER_MODULES setting,
and those modules need to be on your Python path so that your script can load them.
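
For example, a minimal settings module pointing Scrapy at the spider package might look like this (the project and package names here are hypothetical, not taken from the attached project):

```python
# settings.py of a hypothetical project package named "tscraper"
BOT_NAME = "tscraper"

# Tell Scrapy which modules to scan for Spider subclasses.
SPIDER_MODULES = ["tscraper.spiders"]

# Module where `scrapy genspider` creates new spiders.
NEWSPIDER_MODULE = "tscraper.spiders"
```

The listed modules must be importable from wherever the crawl is started, otherwise Scrapy cannot discover the spiders by name.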
I wrote an answer on StackOverflow for a similar issue. You may find it useful.

@kmike
Member

kmike commented Jan 19, 2017

Hey @tituskex,

It seems you have some configuration issue; most likely, scrapy for some reason uses a different python from pip3. The best way to prevent such issues is to always use virtualenv.

We're using the Scrapy bug tracker for tracking Scrapy bugs and feature requests; this doesn't look like a Scrapy bug, so I'm closing this ticket. A better place to ask support questions is http://stackoverflow.com (use the 'scrapy' tag).

@kmike kmike closed this as completed Jan 19, 2017
@Phunter813

@kmike I'm also getting the same error, and I'm already using virtualenv.

@IAlwaysBeCoding
Contributor

IAlwaysBeCoding commented Dec 6, 2018

Never mind, it was caused by mixing up pip3 and pip.
