Python 2.7 exceptions.TypeError: __init__() got an unexpected keyword argument 'settings'

Tags: python-2.7, ubuntu

I downloaded a distributed crawler from GitHub, but it doesn't run. It fails with:

unexpected keyword argument 'settings'

Does anyone know why?

The full stack trace is shown below:

2017-03-31 15:43:43 [twisted] ERROR: Unhandled error in Deferred:
2017-03-31 15:43:43 [twisted] ERROR: Unhandled Error
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/scrapy/commands/crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 163, in crawl
    return self._crawl(crawler, *args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 167, in _crawl
    d = crawler.crawl(*args, **kwargs)
  File "/usr/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1237, in unwindGenerator
    return _inlineCallbacks(None, gen, Deferred())
--- <exception caught here> ---
  File "/usr/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1099, in _inlineCallbacks
    result = g.send(result)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 90, in crawl
    six.reraise(*exc_info)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 72, in crawl
    self.engine = self._create_engine()
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 97, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "/usr/local/lib/python2.7/dist-packages/scrapy/core/engine.py", line 70, in __init__
    self.scraper = Scraper(crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/core/scraper.py", line 71, in __init__
    self.itemproc = itemproc_cls.from_crawler(crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/middleware.py", line 58, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/middleware.py", line 36, in from_settings
    mw = mwcls.from_crawler(crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/pipelines/media.py", line 51, in from_crawler
    pipe = cls.from_settings(crawler.settings)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/pipelines/images.py", line 95, in from_settings
    return cls(store_uri, settings=settings)
exceptions.TypeError: __init__() got an unexpected keyword argument 'settings'
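The bottom of the traceback shows what is going on: `scrapy/pipelines/images.py` calls `cls(store_uri, settings=settings)`, but the `cls` being constructed is a pipeline subclass whose `__init__` does not accept a `settings` keyword. This typically happens when a project written against an older Scrapy overrides `ImagesPipeline.__init__` (or `from_settings`) with a narrower signature, and is then run against a newer Scrapy that passes `settings=` through. The sketch below reproduces the failure mode without Scrapy; `BasePipeline` is a hypothetical stand-in for `ImagesPipeline`, and the class names are illustrative only:

```python
# Minimal stand-in for scrapy's ImagesPipeline: its from_settings()
# classmethod passes settings= to cls(...), mirroring the traceback line
# "return cls(store_uri, settings=settings)" in pipelines/images.py.
class BasePipeline(object):
    def __init__(self, store_uri, settings=None):
        self.store_uri = store_uri
        self.settings = settings

    @classmethod
    def from_settings(cls, settings):
        store_uri = "/tmp/images"  # placeholder store location
        return cls(store_uri, settings=settings)


class BrokenPipeline(BasePipeline):
    # Overrides __init__ with a narrower signature: when the base class
    # passes settings=, this raises
    # "TypeError: __init__() got an unexpected keyword argument 'settings'".
    def __init__(self, store_uri):
        super(BrokenPipeline, self).__init__(store_uri)


class FixedPipeline(BasePipeline):
    # Accepts and forwards extra keyword arguments, so base-class code
    # that passes settings= (or future keywords) keeps working.
    def __init__(self, store_uri, *args, **kwargs):
        super(FixedPipeline, self).__init__(store_uri, *args, **kwargs)


try:
    BrokenPipeline.from_settings({"IMAGES_STORE": "/tmp/images"})
except TypeError as e:
    print("BrokenPipeline failed: %s" % e)

pipe = FixedPipeline.from_settings({"IMAGES_STORE": "/tmp/images"})
print("FixedPipeline ok, settings=%r" % (pipe.settings,))
```

If the downloaded crawler defines its own images pipeline, widening its `__init__` to accept `*args, **kwargs` (and forwarding them to the parent) in the same way should get past this error; alternatively, pinning Scrapy to the version the project was written for would avoid the signature mismatch entirely.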

Could you link the GitHub repository? This may be something you need to take up with the developers. Can you specify the GitHub repository URL? Thanks :)