Python: Issues using Scrapy with Google Cloud Storage as a feed export

I'm using GCS as the feed export storage for Scrapy, following the documentation. Strangely enough, it does sometimes work.
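For context, a GCS feed export is configured through Scrapy's FEEDS setting (Scrapy 2.3+ and the google-cloud-storage package are required). A minimal sketch of the kind of configuration involved, with the bucket path taken from the error log below and the project ID as a placeholder:

# settings.py -- minimal sketch; GCS_PROJECT_ID is a placeholder
FEEDS = {
    "gs://instoxi_amazon/com/Ngolo/Amazon_Beauty_&_Personal_Care_Ngolo.csv": {
        "format": "csv",
    },
}
GCS_PROJECT_ID = "my-gcp-project"  # placeholder: your Google Cloud project ID
# Credentials are resolved via Application Default Credentials, e.g. the
# GOOGLE_APPLICATION_CREDENTIALS environment variable pointing at a
# service-account JSON key file.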

Other times, though, it fails during the upload, and the only difference I can see is that it is trying to upload more data. That said, it has also failed when uploading only around 60 MB, which makes me doubt that the size of the data is really the problem. Can anyone tell me whether this is an issue with my configuration or with Scrapy itself? The error report is below:

2020-12-01 23:07:26 [scrapy.extensions.feedexport] ERROR: Error storing csv feed (19826 items) in: gs://instoxi_amazon/com/Ngolo/Amazon_Beauty_&_Personal_Care_Ngolo.csv
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 600, in urlopen
    chunked=chunked)
  File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 354, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "C:\ProgramData\Anaconda3\lib\http\client.py", line 1244, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "C:\ProgramData\Anaconda3\lib\http\client.py", line 1290, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "C:\ProgramData\Anaconda3\lib\http\client.py", line 1239, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "C:\ProgramData\Anaconda3\lib\http\client.py", line 1065, in _send_output
    self.send(chunk)
  File "C:\ProgramData\Anaconda3\lib\http\client.py", line 987, in send
    self.sock.sendall(data)
  File "C:\ProgramData\Anaconda3\lib\ssl.py", line 1034, in sendall
    v = self.send(byte_view[count:])
  File "C:\ProgramData\Anaconda3\lib\ssl.py", line 1003, in send
    return self._sslobj.write(data)
ssl.SSLWantWriteError: The operation did not complete (write) (_ssl.c:2361)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\requests\adapters.py", line 449, in send
    timeout=timeout
  File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 638, in urlopen
    _stacktrace=sys.exc_info()[2])
  File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\util\retry.py", line 399, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='storage.googleapis.com', port=443): Max retries exceeded with url: /upload/storage/v1/b/instoxi_amazon/o?uploadType=resumable&upload_id=ABg5-Uwjc9Vs5HdgyQdhTTm0ph3N_xQIoZaAE44Oiv2MdMO6q-YhD31eRkWO6W7UNAlehUKm4FTgVv0KXq32SHmCrDU (Caused by SSLError(SSLWantWriteError(3, 'The operation did not complete (write) (_ssl.c:2361)')))

This is my first question, so please let me know if there is a better way to ask/present it. Just to clarify, I have had no problems using Python to interact with GCS outside of Scrapy. Cheers.

I have seen "The operation did not complete (write) (_ssl.c:2361)" before, and it was caused by network issues. That also fits with the fact that it fails inconsistently for you. If you can, I would suggest trying a different network connection to the internet.


That said, I would also suggest making sure you are using the latest version of Scrapy.
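For reference, GCS as a feed-export storage backend only landed in Scrapy 2.3, so it is worth confirming which version is actually installed; a quick check:

import scrapy

# GCS feed-export support was introduced in Scrapy 2.3
print(scrapy.__version__)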

Thanks, let me test on a wired connection and see whether it is more reliable. OK, I have just upgraded Scrapy as well; the Google feed export is a fairly new feature, so that is probably a good idea.

Were you able to solve your problem?

I was! You were right about my internet problem: my upload speed was simply too slow for the operation. The question I linked provides a good solution to my problem (for anyone else interested).
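The linked solution is not reproduced above, but for anyone else hitting this on a slow uplink: one commonly used mitigation when uploading to GCS with the google-cloud-storage client is to shrink the resumable-upload chunk size, so that each individual write finishes before the connection stalls. A minimal sketch of that general technique (project, bucket, and file names are placeholders, and this is not necessarily the linked answer):

from google.cloud import storage

client = storage.Client(project="my-gcp-project")  # placeholder project ID
bucket = client.bucket("instoxi_amazon")

# chunk_size must be a multiple of 256 KB (262144 bytes); smaller chunks
# mean each write completes faster on a slow connection
blob = bucket.blob("com/Ngolo/output.csv", chunk_size=262144 * 4)  # 1 MB chunks
blob.upload_from_filename("output.csv")  # placeholder local file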