Some errors when starting a Scrapy project on Python 3.x


I know Scrapy 1.1.0 fully supports Python 3.x, and I have successfully created a project. But when I try to start it with scrapy crawl dmoz, I get the following errors:

python version = 3.5.1

scrapy version = 1.1.0rc1

Twisted version = 16.1.0 (Scrapy says Twisted only needs to be >= 15.5)
2016-04-09 21:15:17 [scrapy] INFO: Scrapy 1.1.0rc1 started (bot: doub)
2016-04-09 21:15:17 [scrapy] INFO: Overridden settings: {'BOT_NAME': 'doub', 'SPIDER_MODULES': ['doub.spiders'], 'NEWSPIDER_MODULE': 'doub.spiders', 'ROBOTSTXT_OBEY': True}
2016-04-09 21:15:18 [scrapy] INFO: Enabled extensions:
['scrapy.extensions.logstats.LogStats', 'scrapy.extensions.corestats.CoreStats']
Unhandled error in Deferred:
2016-04-09 21:15:18 [twisted] CRITICAL: Unhandled error in Deferred:


Traceback (most recent call last):
  File "d:\python\python35-32\lib\site-packages\scrapy\commands\crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "d:\python\python35-32\lib\site-packages\scrapy\crawler.py", line 152, in crawl
    return self._crawl(crawler, *args, **kwargs)
  File "d:\python\python35-32\lib\site-packages\scrapy\crawler.py", line 156, in _crawl
    d = crawler.crawl(*args, **kwargs)
  File "d:\python\python35-32\lib\site-packages\twisted\internet\defer.py", line 1274, in unwindGenerator
    return _inlineCallbacks(None, gen, Deferred())
--- <exception caught here> ---
  File "d:\python\python35-32\lib\site-packages\twisted\internet\defer.py", line 1126, in _inlineCallbacks
    result = result.throwExceptionIntoGenerator(g)
  File "d:\python\python35-32\lib\site-packages\twisted\python\failure.py", line 389, in throwExceptionIntoGenerator
    return g.throw(self.type, self.value, self.tb)
  File "d:\python\python35-32\lib\site-packages\scrapy\crawler.py", line 80, in crawl
    yield exc
builtins.ImportError: cannot import name '_win32stdio'
2016-04-09 21:15:18 [twisted] CRITICAL:

I also ran into this problem, and this is how I solved it:

pip install --upgrade scrapy

As of now, Scrapy does not support Python 3 on Windows; it does on Linux. I had to run scrapy on Ubuntu.
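As a side note, the versions listed in the question can be checked against Scrapy's stated requirement (Twisted >= 15.5) with a small sketch. The `meets_minimum` helper below is hypothetical, not part of Scrapy's API:

```python
# Minimal sketch of the version check implied above: the question reports
# Twisted 16.1.0 installed, while Scrapy 1.1 only requires Twisted >= 15.5.
# `meets_minimum` is a hypothetical helper, not part of Scrapy or Twisted.
def meets_minimum(version, minimum):
    """Compare two dotted version strings numerically (e.g. '16.1.0' vs '15.5')."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(version) >= as_tuple(minimum)

print(meets_minimum("16.1.0", "15.5"))  # prints True
```

Since the requirement is satisfied, the failure is not a version mismatch; the name `_win32stdio` in the ImportError suggests a Windows-specific module was the problem, which fits the advice to run on Linux instead.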

Can we see the code? Please answer in English; Stack Overflow is a global network.