Google Cloud Dataflow error - "IOException: Failed to write to GCS path..." "Backend Error 500"

One of our pipelines threw the error below. This is the first time we have seen it. We ran about 625 million rows into a BigQuery table. The job still completed and was recorded as "Success" in the console. But our concern is that the file Dataflow failed to write to GCS (Dataflow writes to GCS, then loads into BigQuery) may not have been loaded into BigQuery, so we may now be missing some data.

We are having a hard time determining whether those rows were loaded, given the volume of data we process.

Is there a way to know whether Dataflow loaded that file?

Job ID: 2015-05-27-21-8377993823053896089

2015-05-28T01:21:23.210Z: (c1e36887ebb5e3b3): Autoscaling: Enabled for job /workflows/wf-2015-05-27_18_21_21-8377993823053896089
2015-05-28T01:22:23.711Z: (45988c062ea96b38): Autoscaling: Resizing worker pool from 1 to 3.
2015-05-28T01:23:53.713Z: (45988c062ea96352): Autoscaling: Resizing worker pool from 3 to 12.
2015-05-28T01:25:23.715Z: (45988c062ea96b6c): Autoscaling: Resizing worker pool from 12 to 48.
2015-05-28T01:26:53.716Z: (45988c062ea96386): Autoscaling: Resizing worker pool from 48 to 64.
2015-05-28T01:48:48.863Z: (54b9f9ed2402c4e7): java.io.IOException: Failed to write to GCS path gs://<removed>/15697574167464387868/dax-tmp-2015-05-27_18_21_21-8377993823053896089-S09-1-731cba632206348a/-shard-00000-of-00001_C183_00000-of-00001-try-52ba464032d439ee-endshard.json.
    at com.google.cloud.dataflow.sdk.util.gcsio.GoogleCloudStorageWriteChannel.throwIfUploadFailed(GoogleCloudStorageWriteChannel.java:372)
    at com.google.cloud.dataflow.sdk.util.gcsio.GoogleCloudStorageWriteChannel.close(GoogleCloudStorageWriteChannel.java:270)
    at com.google.cloud.dataflow.sdk.runners.worker.TextSink$TextFileWriter.close(TextSink.java:243)
    at com.google.cloud.dataflow.sdk.util.common.worker.WriteOperation.finish(WriteOperation.java:100)
    at com.google.cloud.dataflow.sdk.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:74)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorker.doWork(DataflowWorker.java:130)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:95)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorkerHarness$WorkerThread.call(DataflowWorkerHarness.java:139)
    at com.google.cloud.dataflow.sdk.runners.worker.DataflowWorkerHarness$WorkerThread.call(DataflowWorkerHarness.java:124)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 410 Gone
{
  "code" : 500,
  "errors" : [ {
    "domain" : "global",
    "message" : "Backend Error",
    "reason" : "backendError"
  } ],
  "message" : "Backend Error"
}
    at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:145)
    at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
    at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:432)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
    at com.google.cloud.dataflow.sdk.util.gcsio.GoogleCloudStorageWriteChannel$UploadOperation.run(GoogleCloudStorageWriteChannel.java:166)
    ... 3 more

2015-05-28T01:48:53.870Z: (4aaf52256f502f1a): Failed task is going to be retried.
2015-05-28T02:00:49.444Z: S09: (aafd22d37feb496e): Unable to delete temporary files gs://<removed>/15697574167464387868/dax-tmp-2015-05-27_18_21_21-8377993823053896089-S09-1-731cba632206348a/@DAX.json$ Causes: (aafd22d37feb4227): Unable to delete directory: gs://<removed>/15697574167464387868/dax-tmp-2015-05-27_18_21_21-8377993823053896089-S09-1-731cba632206348a.

Dataflow retries failed tasks (up to 4 times). In this case the error appears to have been transient, and the task succeeded on retry. Your data should be complete.
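If you still want to verify the load independently, a minimal sketch is to compare the row count the pipeline was expected to produce against the count actually in the destination table. The helper below is pure logic; obtaining `actual` requires running a real query, e.g. `bq query --use_legacy_sql=false 'SELECT COUNT(*) FROM mydataset.mytable'` (the table name and counts here are illustrative placeholders, not taken from the job above).

```python
# Illustrative completeness check: compare the row count the pipeline was
# expected to write against the count actually present in BigQuery.
# `actual` would come from a COUNT(*) query against the destination table;
# the values below are placeholders, not from the job in the question.

def rows_missing(expected: int, actual: int) -> int:
    """Number of rows apparently missing from the load (0 = looks complete)."""
    if actual > expected:
        raise ValueError("table holds more rows than expected; "
                         "check for duplicate loads")
    return expected - actual

# Example with the ~625M rows mentioned in the question:
expected = 625_000_000
actual = 625_000_000              # in practice: result of the COUNT(*) query
print(rows_missing(expected, actual))  # prints 0 when the load is complete
```

A count comparison only proves cardinality, not content; if you need stronger guarantees, sampling key ranges or checksumming a key column on both sides is the usual next step.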