Python Dataflow job fails on BigQuery write with backend errors

Tags: python, google-bigquery, google-cloud-dataflow

I have a job that is failing with several different errors, all related to the final import into BigQuery. I have run it five times and it has failed every time, although the error messages sometimes differ. The job runs fine when I run it locally against a SQLite database, so I suspect the problem is on the Google backend.

One error message:

**Workflow failed. Causes: S04:write meter_traces_combined to BigQuery/WriteToBigQuery/NativeWrite failed., BigQuery import job "dataflow_job_5111748333716803539" failed., BigQuery creation of import job for table "meter_traces_combined" in dataset "ebce" in project "oeem-ebce-platform" failed., BigQuery execution failed., Unknown error.**

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 178, in execute
    op.finish()
  File "dataflow_worker/native_operations.py", line 93, in dataflow_worker.native_operations.NativeWriteOperation.finish
  File "dataflow_worker/native_operations.py", line 94, in dataflow_worker.native_operations.NativeWriteOperation.finish
  File "dataflow_worker/native_operations.py", line 95, in dataflow_worker.native_operations.NativeWriteOperation.finish
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/nativefileio.py", line 465, in __exit__
    self.file.close()
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/filesystemio.py", line 217, in close
    self._uploader.finish()
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/gcsio.py", line 588, in finish
    raise self._upload_thread.last_error  # pylint: disable=raising-bad-type
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/gcsio.py", line 565, in _start_upload
    self._client.objects.Insert(self._insert_request, upload=self._upload)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py", line 1154, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 715, in _RunMethod
    http_request, client=self.client)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/transfer.py", line 908, in InitializeUpload
    return self.StreamInChunks()
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/transfer.py", line 1020, in StreamInChunks
    additional_headers=additional_headers)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/transfer.py", line 971, in __StreamMedia
    self.RefreshResumableUploadState()
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/transfer.py", line 873, in RefreshResumableUploadState
    self.stream.seek(self.progress)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/filesystemio.py", line 301, in seek
    offset, whence, self.position, self.last_block_position))
NotImplementedError: offset: 10485760, whence: 0, position: 16777216, last: 8388608

 Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 178, in execute
    op.finish()
  File "dataflow_worker/native_operations.py", line 93, in dataflow_worker.native_operations.NativeWriteOperation.finish
  File "dataflow_worker/native_operations.py", line 94, in dataflow_worker.native_operations.NativeWriteOperation.finish
  File "dataflow_worker/native_operations.py", line 95, in dataflow_worker.native_operations.NativeWriteOperation.finish
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/nativeavroio.py", line 309, in __exit__
    self._data_file_writer.fo.close()
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/filesystemio.py", line 217, in close
    self._uploader.finish()
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/gcsio.py", line 588, in finish
    raise self._upload_thread.last_error  # pylint: disable=raising-bad-type
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/gcsio.py", line 565, in _start_upload
    self._client.objects.Insert(self._insert_request, upload=self._upload)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py", line 1154, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 715, in _RunMethod
    http_request, client=self.client)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/transfer.py", line 908, in InitializeUpload
    return self.StreamInChunks()
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/transfer.py", line 1020, in StreamInChunks
    additional_headers=additional_headers)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/transfer.py", line 971, in __StreamMedia
    self.RefreshResumableUploadState()
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/transfer.py", line 875, in RefreshResumableUploadState
    raise exceptions.HttpError.FromResponse(refresh_response)
apitools.base.py.exceptions.HttpError: HttpError accessing <https://www.googleapis.com/resumable/upload/storage/v1/b/oee-ebce-platform/o?alt=json&name=tmp%2Fetl-ebce-combine-all-traces-20191127-152244.1574868164.604684%2Fdax-tmp-2019-11-27_07_24_36-17060579636924315582-S02-0-e425da41c3fe2598%2Ftmp-e425da41c3fe2d8b-shard--try-33835bf582552bbd-endshard.avro&uploadType=resumable&upload_id=AEnB2UqddXXpTnnRQyxBQuL1ptXExVZ5CrUQ33o2S2UHcVUhesrBq7XFSQ90YBQznRm2Wh3g8g8lG1z5uEQv8fXvqO40z5WrnQ>: response: <{'x-guploader-uploadid': 'AEnB2UqddXXpTnnRQyxBQuL1ptXExVZ5CrUQ33o2S2UHcVUhesrBq7XFSQ90YBQznRm2Wh3g8g8lG1z5uEQv8fXvqO40z5WrnQ', 'vary': 'Origin, X-Origin', 'content-type': 'application/json; charset=UTF-8', 'content-length': '177', 'date': 'Wed, 27 Nov 2019 15:30:50 GMT', 'server': 'UploadServer', 'status': '410'}>, content <{
 "error": {
  "errors": [
   {
    "domain": "global",
    "reason": "backendError",
    "message": "Backend Error"
   }
  ],
  "code": 503,
  "message": "Backend Error"
 }
}
Google Cloud Support here. I checked your job and found two internal issues that may be related to this failure. As Alex Amato suggested in the comments, I would try using

--experiments=use_beam_bq_sink

Otherwise, I would suggest opening a ticket directly with GCP support, as this may require further investigation.
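The experiments flag is passed like any other pipeline option. A minimal sketch of how it might be wired up, assuming `apache_beam`'s `PipelineOptions`; the project, region, and bucket names are placeholders, since the failing job's actual entry point is not shown in the question:

```python
# Sketch only: the project, region, and bucket values below are placeholders,
# not values from the failing job.
pipeline_args = [
    "--runner=DataflowRunner",
    "--project=my-project",
    "--region=us-central1",
    "--temp_location=gs://my-bucket/tmp",
    # The suggested experiment: route writes through Beam's BigQuery sink
    # instead of the legacy NativeWrite path that appears in the traceback.
    "--experiments=use_beam_bq_sink",
]

# In the pipeline's entry point these args would typically be handed to
# PipelineOptions, e.g.:
#   from apache_beam.options.pipeline_options import PipelineOptions
#   options = PipelineOptions(pipeline_args)
#   with beam.Pipeline(options=options) as p:
#       ...
```

The same flag can also be appended directly to the `python my_pipeline.py ...` command line if the script forwards unknown arguments to `PipelineOptions`.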


I hope this helps.

This looks very similar, so please consider trying --experiments=use_beam_bq_sink, which may resolve the issue. Otherwise, you may need to go through Google Cloud support to get this resolved.