
Spider has no attribute 'update_settings' #1849

Closed
DeckerCHAN opened this issue Mar 6, 2016 · 6 comments

Comments

DeckerCHAN commented Mar 6, 2016

"D:\Program Files\Python\Python35-32\python.exe" G:/Python/ParkingSearch/entry_point.py
2016-03-06 12:43:40 [scrapy] INFO: Scrapy 1.0.5 started (bot: scrapybot)
2016-03-06 12:43:40 [scrapy] INFO: Optional features available: http11, ssl
2016-03-06 12:43:40 [scrapy] INFO: Overridden settings: {}
Traceback (most recent call last):
  File "G:/Python/ParkingSearch/entry_point.py", line 16, in <module>
    main()
  File "G:/Python/ParkingSearch/entry_point.py", line 11, in main
    process.crawl(go_and_see_aus_spider)
  File "D:\Program Files\Python\Python35-32\lib\site-packages\scrapy\crawler.py", line 150, in crawl
    crawler = self._create_crawler(crawler_or_spidercls)
  File "D:\Program Files\Python\Python35-32\lib\site-packages\scrapy\crawler.py", line 166, in _create_crawler
    return Crawler(spidercls, self.settings)
  File "D:\Program Files\Python\Python35-32\lib\site-packages\scrapy\crawler.py", line 32, in __init__
    self.spidercls.update_settings(self.settings)
AttributeError: module 'crawler.spiders.test_spider' has no attribute 'update_settings'

kmike (Member) commented Mar 7, 2016

@DeckerCHAN could you please provide an example code which fails with this error?

DeckerCHAN (Author) commented:
This is my repo. You can explore the code there.

kmike (Member) commented Mar 8, 2016

@DeckerCHAN you're passing a module to process.crawl, while it accepts a Spider subclass or a spider name. See http://doc.scrapy.org/en/latest/topics/api.html#scrapy.crawler.CrawlerProcess.crawl

@kmike kmike closed this as completed Mar 8, 2016
DeckerCHAN (Author) commented:

BTW, I've also encountered a problem when using the "scrapy" command in CMD.
System info: Win10, Py35 (32-bit)
[screenshot of the error]

System variables are set as below:
[screenshot of the system variables]

kmike (Member) commented Mar 8, 2016

@DeckerCHAN it looks like pypa/pip#2783: before Python 3.5 the default install directory had no spaces in its path, but 3.5 installs into Program Files. Could you try

  1. upgrading pip;
  2. upgrading Scrapy using new pip?

I don't think it can be fixed in Scrapy.

DeckerCHAN (Author) commented:

After reinstalling Python at the root of drive D and upgrading pip with easy_install, everything works! Thank you for your assistance; it helped a lot!
