Python 3.x: How to prevent "connection reset by peer" on an HTTP request in GCP


The following request causes a connection reset error. It does not fail every time, only occasionally. Any ideas on how to prevent this would be appreciated:

http.request("https://dataflow.googleapis.com/v1b3/projects/%s/templates:launch?gcsPath=%s&location=us-central1" % (project, DATAFLOW_SPANNER_EXPORT),
        method="POST",
        headers={'Accept': 'application/json', 'Content-Type': 'application/json'},
        body=body )
This is the error being received. Our function, create_spanner_export, is launching a Dataflow job. It does not fail every time, only occasionally; when it does fail, the Dataflow job is not started.

 File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 383, in run_background_function
    _function_handler.invoke_user_function(event_object)
  File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 217, in invoke_user_function
    return call_user_function(request_or_event)
  File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 214, in call_user_function
    event_context.Context(**request_or_event.context))
  File "/user_code/main.py", line 678, in database_export_main
    handle_export(logger, attrib, max_exports, output_dir)
  File "/user_code/main.py", line 526, in handle_export
    spanner_export(logger, attrib, max_exports, output_dir)
  File "/user_code/main.py", line 342, in spanner_export
    for d in database_list[:max_exports]]
  File "/user_code/main.py", line 342, in <listcomp>
    for d in database_list[:max_exports]]
  File "/user_code/main.py", line 274, in create_spanner_export
    body=body )
  File "/env/local/lib/python3.7/site-packages/oauth2client/transport.py", line 175, in new_request
    redirections, connection_type)
  File "/env/local/lib/python3.7/site-packages/oauth2client/transport.py", line 282, in request
    connection_type=connection_type)
  File "/env/local/lib/python3.7/site-packages/httplib2/__init__.py", line 1953, in request
    cachekey,
  File "/env/local/lib/python3.7/site-packages/httplib2/__init__.py", line 1618, in _request
    conn, request_uri, method, body, headers
  File "/env/local/lib/python3.7/site-packages/httplib2/__init__.py", line 1525, in _conn_request
    conn.request(method, request_uri, body, headers)
  File "/opt/python3.7/lib/python3.7/http/client.py", line 1229, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/opt/python3.7/lib/python3.7/http/client.py", line 1275, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/opt/python3.7/lib/python3.7/http/client.py", line 1224, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/opt/python3.7/lib/python3.7/http/client.py", line 1055, in _send_output
    self.send(chunk)
  File "/opt/python3.7/lib/python3.7/http/client.py", line 977, in send
    self.sock.sendall(data)
  File "/opt/python3.7/lib/python3.7/ssl.py", line 1015, in sendall
    v = self.send(byte_view[count:])
  File "/opt/python3.7/lib/python3.7/ssl.py", line 984, in send
    return self._sslobj.write(data)
ConnectionResetError: [Errno 104] Connection reset by peer

Using a retry should work:

@retry(stop=stop_after_attempt(3), wait=wait_random(min=1, max=2))
def sendReq(...):
    http.request("https://dataflow.googleapis.com/v1b3/projects/%s/templates:launch?gcsPath=%s&location=us-central1" % (project, DATAFLOW_SPANNER_EXPORT),
        method="POST",
        headers={'Accept': 'application/json', 'Content-Type': 'application/json'},
        body=body)

See details here:
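The decorators above come from the tenacity library (`retry`, `stop_after_attempt`, `wait_random`). If you would rather not add a dependency inside the Cloud Function, an equivalent retry decorator can be sketched with the standard library alone. Note this is only a sketch: `send_req` below is a hypothetical stand-in for the real `http.request` call, and it simply simulates two connection resets before succeeding.

```python
import random
import time
from functools import wraps


def retry(stop_attempts=3, wait_min=1.0, wait_max=2.0,
          exceptions=(ConnectionResetError,)):
    """Retry up to `stop_attempts` times with a random wait between tries,
    mirroring @retry(stop=stop_after_attempt(3), wait=wait_random(1, 2))."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(1, stop_attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except exceptions:
                    if attempt == stop_attempts:
                        raise  # out of attempts: propagate the last error
                    time.sleep(random.uniform(wait_min, wait_max))
        return wrapper
    return decorator


# Hypothetical stand-in for the real http.request call:
# fails twice with Errno 104, then succeeds.
calls = {"n": 0}

@retry(stop_attempts=3, wait_min=0, wait_max=0.01)
def send_req():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionResetError(104, "Connection reset by peer")
    return "job launched"

result = send_req()  # succeeds on the third attempt
```

Since the Cloud Function does not wait for the Dataflow job anyway, a couple of short retries like this adds little latency while absorbing the occasional reset.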


A little more about the application... The application code calls/creates a Dataflow job from a Cloud Function to export a Spanner database, so this code runs inside a Cloud Function. Fewer than 15 databases are backed up once a day, with a 5-minute interval between each invocation of this backup Dataflow routine. The databases are relatively small, less than 5 MB each. It appears that at most two export Dataflow jobs are running concurrently at any given time. The Cloud Function does not wait for the export Dataflow jobs to complete.