Cannot import '_win32stdio' (but pywin32 is already installed) #1998

Open · RussBaz opened this issue May 19, 2016 · 6 comments

RussBaz commented May 19, 2016

I am using Python 3.5.1 (64-bit), Windows 10, and VS 2015 Update 2; lxml (3.6.0) and pywin32 (220.1) are installed, and Scrapy (1.1.0) installed successfully. However, when I run the example spider from http://doc.scrapy.org/en/latest/intro/overview.html in my virtual environment, I get the following exception:
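
For reference, stackoverflow_spider.py is essentially the spider from that overview page, roughly along these lines (an approximation, not a verbatim copy, so the selectors may differ slightly from the docs):

import scrapy

class StackOverflowSpider(scrapy.Spider):
    # Approximation of the spider from the Scrapy overview docs.
    name = 'stackoverflow'
    start_urls = ['http://stackoverflow.com/questions?sort=votes']

    def parse(self, response):
        # Follow each question link on the listing page.
        for href in response.css('.question-summary h3 a::attr(href)'):
            yield scrapy.Request(response.urljoin(href.extract()),
                                 callback=self.parse_question)

    def parse_question(self, response):
        # Emit one item per question page.
        yield {
            'title': response.css('h1 a::text').extract_first(),
            'votes': response.css('.question .vote-count-post::text').extract_first(),
            'tags': response.css('.question .post-tag::text').extract(),
            'link': response.url,
        }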

(env) D:\Projects\tscrapy> scrapy runspider stackoverflow_spider.py -o top-stackoverflow-questions.json
2016-05-19 17:36:00 [scrapy] INFO: Scrapy 1.1.0 started (bot: scrapybot)
2016-05-19 17:36:00 [scrapy] INFO: Overridden settings: {'FEED_URI': 'top-stackoverflow-questions.json', 'FEED_FORMAT': 'json'}
2016-05-19 17:36:00 [scrapy] INFO: Enabled extensions:
['scrapy.extensions.logstats.LogStats',
 'scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.feedexport.FeedExporter']
Unhandled error in Deferred:
2016-05-19 17:36:00 [twisted] CRITICAL: Unhandled error in Deferred:


Traceback (most recent call last):
  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\commands\runspider.py", line 87, in run
    self.crawler_process.crawl(spidercls, **opts.spargs)
  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\crawler.py", line 163, in crawl
    return self._crawl(crawler, *args, **kwargs)
  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\crawler.py", line 167, in _crawl
    d = crawler.crawl(*args, **kwargs)
  File "d:\projects\tscrapy\env\lib\site-packages\twisted\internet\defer.py", line 1274, in unwindGenerator
    return _inlineCallbacks(None, gen, Deferred())
--- <exception caught here> ---
  File "d:\projects\tscrapy\env\lib\site-packages\twisted\internet\defer.py", line 1128, in _inlineCallbacks
    result = g.send(result)
  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\crawler.py", line 72, in crawl
    self.engine = self._create_engine()
  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\crawler.py", line 97, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\core\engine.py", line 68, in __init__
    self.downloader = downloader_cls(crawler)
  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\core\downloader\__init__.py", line 88, in __init__
    self.middleware = DownloaderMiddlewareManager.from_crawler(crawler)
  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\middleware.py", line 58, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\middleware.py", line 34, in from_settings
    mwcls = load_object(clspath)
  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\utils\misc.py", line 44, in load_object
    mod = import_module(module)
  File "C:\Program Files\Python 3.5\lib\importlib\__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 986, in _gcd_import

  File "<frozen importlib._bootstrap>", line 969, in _find_and_load

  File "<frozen importlib._bootstrap>", line 958, in _find_and_load_unlocked

  File "<frozen importlib._bootstrap>", line 673, in _load_unlocked

  File "<frozen importlib._bootstrap_external>", line 662, in exec_module

  File "<frozen importlib._bootstrap>", line 222, in _call_with_frames_removed

  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\downloadermiddlewares\retry.py", line 23, in <module>
    from scrapy.xlib.tx import ResponseFailed
  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\xlib\tx\__init__.py", line 3, in <module>
    from twisted.web import client
  File "d:\projects\tscrapy\env\lib\site-packages\twisted\web\client.py", line 41, in <module>
    from twisted.internet.endpoints import TCP4ClientEndpoint, SSL4ClientEndpoint
  File "d:\projects\tscrapy\env\lib\site-packages\twisted\internet\endpoints.py", line 34, in <module>
    from twisted.internet.stdio import StandardIO, PipeAddress
  File "d:\projects\tscrapy\env\lib\site-packages\twisted\internet\stdio.py", line 30, in <module>
    from twisted.internet import _win32stdio
builtins.ImportError: cannot import name '_win32stdio'
2016-05-19 17:36:00 [twisted] CRITICAL:
Traceback (most recent call last):
  File "d:\projects\tscrapy\env\lib\site-packages\twisted\internet\defer.py", line 1128, in _inlineCallbacks
    result = g.send(result)
  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\crawler.py", line 72, in crawl
    self.engine = self._create_engine()
  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\crawler.py", line 97, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\core\engine.py", line 68, in __init__
    self.downloader = downloader_cls(crawler)
  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\core\downloader\__init__.py", line 88, in __init__
    self.middleware = DownloaderMiddlewareManager.from_crawler(crawler)
  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\middleware.py", line 58, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\middleware.py", line 34, in from_settings
    mwcls = load_object(clspath)
  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\utils\misc.py", line 44, in load_object
    mod = import_module(module)
  File "C:\Program Files\Python 3.5\lib\importlib\__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 986, in _gcd_import
  File "<frozen importlib._bootstrap>", line 969, in _find_and_load
  File "<frozen importlib._bootstrap>", line 958, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 673, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 662, in exec_module
  File "<frozen importlib._bootstrap>", line 222, in _call_with_frames_removed
  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\downloadermiddlewares\retry.py", line 23, in <module>
    from scrapy.xlib.tx import ResponseFailed
  File "d:\projects\tscrapy\env\lib\site-packages\scrapy\xlib\tx\__init__.py", line 3, in <module>
    from twisted.web import client
  File "d:\projects\tscrapy\env\lib\site-packages\twisted\web\client.py", line 41, in <module>
    from twisted.internet.endpoints import TCP4ClientEndpoint, SSL4ClientEndpoint
  File "d:\projects\tscrapy\env\lib\site-packages\twisted\internet\endpoints.py", line 34, in <module>
    from twisted.internet.stdio import StandardIO, PipeAddress
  File "d:\projects\tscrapy\env\lib\site-packages\twisted\internet\stdio.py", line 30, in <module>
    from twisted.internet import _win32stdio
ImportError: cannot import name '_win32stdio'

kmike (Member) commented May 19, 2016

Hey @RussBaz,

Unfortunately, twisted.internet._win32stdio has not been ported to Python 3 yet (https://twistedmatrix.com/trac/ticket/8018), which means Scrapy does not support Python 3 on Windows at the moment.
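
As a quick sanity check outside Scrapy (a minimal sketch; the import path is taken straight from the traceback above), the same failure can be triggered directly on an affected Twisted install under Python 3 on Windows:

# Importing Twisted's stdio module pulls in the Windows-only _win32stdio
# helper, which is what fails on Python 3 with the affected Twisted releases.
from twisted.internet import stdio  # ImportError: cannot import name '_win32stdio'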

ghost commented May 22, 2016

You can get Scrapy to work if you copy _win32stdio and _pollingfile from the current Twisted repository over the installed copy; a rough sketch follows below. Of course, this is not official.
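
Roughly what that might look like (unofficial and untested; TWISTED_CHECKOUT is a placeholder for wherever you cloned https://github.com/twisted/twisted):

# Unofficial workaround sketch: copy the two modules from a local clone of the
# Twisted repository over the Twisted package installed in the virtualenv.
import os
import shutil
import twisted.internet

TWISTED_CHECKOUT = r"D:\src\twisted"  # placeholder: path to your local clone
dest = os.path.dirname(twisted.internet.__file__)
for name in ("_win32stdio.py", "_pollingfile.py"):
    src = os.path.join(TWISTED_CHECKOUT, "twisted", "internet", name)
    shutil.copy(src, dest)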

aprotopopov commented Jul 21, 2016

As I understand it, for Python 3.x this can now be fixed by installing twisted-win, which is developed by @xoviat.
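
Presumably that amounts to something like pip install twisted-win inside the virtualenv, though I have not verified the exact package name or whether any extra steps are needed.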

rmax (Contributor) commented Dec 22, 2016

I know that Scrapy + Python 3 works on Windows if you install it via conda: conda install scrapy -c conda-forge.

ghost commented Dec 22, 2016

@rolando twisted-win has been removed from PyPI because it is no longer required. Scrapy now works by default with pip install scrapy because the Twisted package has been updated.
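
So on an existing environment the fix should presumably just be an in-place upgrade, e.g. pip install --upgrade twisted scrapy, although I have not re-checked the minimum versions required.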

M-Younus commented Jul 30, 2018

Get the Twisted wheel for your OS and Python version from the link below and you are good to go!
https://www.lfd.uci.edu/~gohlke/pythonlibs/#twisted
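
For example (the filename below is only illustrative; pick the wheel matching your Python version and architecture from that page):

pip install Twisted-18.7.0-cp37-cp37m-win_amd64.whl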
