
Error downloading any URL in Python

Tags: python, python-2.7, scrapy

I ran a simple Scrapy spider with a Google link that returns the search results for "hello", but it fails with an error.

Code (spider code):
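The spider code itself was not preserved in this copy of the question, so the block below is only a minimal sketch of the kind of spider described; the class name and the parse callback are assumptions, and the start URL is taken from the request shown in the traceback.

import scrapy

class LinsSpider(scrapy.Spider):
    # The spider name matches the lins.py file mentioned in the comments;
    # everything else here is an assumption, not the asker's original code.
    name = 'lins'
    start_urls = ['http://www.google.com/']

    def parse(self, response):
        # Log the title of the downloaded page.
        self.logger.info(response.xpath('//title/text()').extract_first())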

The error is:

2017-02-26 18:06:11 [scrapy] DEBUG: Telnet console listening on 127.0.0.1:6023
2017-02-26 18:06:11 [scrapy] ERROR: Error downloading <GET http://www.google.com/>
Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/scrapy/utils/defer.py", line 45, in mustbe_deferred
    result = f(*args, **kw)
  File "/usr/lib/python2.7/dist-packages/scrapy/core/downloader/handlers/__init__.py", line 41, in download_request
    return handler(request, spider)
  File "/usr/lib/python2.7/dist-packages/scrapy/core/downloader/handlers/http11.py", line 44, in download_request
    return agent.download_request(request)
  File "/usr/lib/python2.7/dist-packages/scrapy/core/downloader/handlers/http11.py", line 211, in download_request
    d = agent.request(method, url, headers, bodyproducer)
  File "/usr/local/lib/python2.7/dist-packages/twisted/web/client.py", line 1631, in request
    parsedURI.originForm)
  File "/usr/local/lib/python2.7/dist-packages/twisted/web/client.py", line 1408, in _requestWithEndpoint
    d = self._pool.getConnection(key, endpoint)
  File "/usr/local/lib/python2.7/dist-packages/twisted/web/client.py", line 1294, in getConnection
    return self._newConnection(key, endpoint)
  File "/usr/local/lib/python2.7/dist-packages/twisted/web/client.py", line 1306, in _newConnection
    return endpoint.connect(factory)
  File "/usr/local/lib/python2.7/dist-packages/twisted/internet/endpoints.py", line 788, in connect
    EndpointReceiver, self._hostText, portNumber=self._port
  File "/usr/local/lib/python2.7/dist-packages/twisted/internet/_resolver.py", line 174, in resolveHostName
    onAddress = self._simpleResolver.getHostByName(hostName)
  File "/usr/lib/python2.7/dist-packages/scrapy/resolver.py", line 21, in getHostByName
    d = super(CachingThreadedResolver, self).getHostByName(name, timeout)
  File "/usr/local/lib/python2.7/dist-packages/twisted/internet/base.py", line 276, in getHostByName
    timeoutDelay = sum(timeout)
TypeError: 'float' object is not iterable
2017-02-26 18:06:11 [scrapy] INFO: Closing spider (finished)
2017-02-26 18:06:11 [scrapy] INFO: Dumping Scrapy stats:
Please help me fix this. I am on Ubuntu 16.10.

I found the cause of this problem: the installed version of Twisted is too new for this Scrapy release. In the traceback, Twisted's getHostByName calls sum(timeout) on the single float that Scrapy's resolver passes in, which raises TypeError: 'float' object is not iterable. Downgrade Twisted to 16.6.0 and the spider runs successfully.
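If Twisted was installed with pip (the /usr/local/lib/python2.7/dist-packages paths in the traceback suggest it was), the downgrade can be done with something like sudo pip install Twisted==16.6.0. Upgrading Scrapy itself is the other common fix: the Ubuntu-packaged copy under /usr/lib/python2.7/dist-packages passes the DNS timeout as a single float, while newer Scrapy releases pass a sequence, which is what Twisted's sum(timeout) call expects.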

Please include the complete code. We cannot run the code you provided and get the same result.

I created the Scrapy project with startproject links, created the spider with genspider lins, and the code in the lins.py file is exactly what I posted in the question.