Google Cloud Dataflow job stuck at worker startup with container errors

Tags: google-cloud-dataflow, apache-beam

I am running a very simple job and it fails with the following error:

(8a5049d0d5f7569e): Workflow failed. Causes: (8a5049d0d5f750f5): The Dataflow appears to be stuck. You can get help with Cloud Dataflow at https://cloud.google.com/dataflow/support. 
The job ID is 2018-01-15-07-42-27-12856142394489592925.

The code I used to run this job is shown below. (The function ReconstructConversation() returns immediately after entering the function body.)

The kubelet logs in Stackdriver appear to show some container errors:

[ContainerManager]: Fail to get rootfs information unable to find data for container / 
Failed to check if disk space is available on the root partition: failed to get fs info for "root": unable to find data for container / 
Failed to check if disk space is available for the runtime: failed to get fs info for "runtime": unable to find data for container / 
Image garbage collection failed once. Stats initialization may not have completed yet: unable to find data for container / 
Any help would be greatly appreciated,
Yiqing

Here are a couple of suggestions:

  • If the job shows errors related to container setup, check your setup.py. If you do not find anything obviously wrong there, file a support ticket with Cloud Dataflow support.
  • Please do not use BigQuerySink directly. Instead, we recommend using the WriteToBigQuery transform.

To provide more detail: this is a three-step workflow that basically reads data from BigQuery, processes it with a ParDo, and writes it back. The BigQuery export job completes successfully, but the ParDo step appears to get stuck right after starting. The main error in Stackdriver seems to be:

getPodContainerStatuses for pod "dataflow-reconstruction-test-01131953-422a-harness-q133_default(5553b5c369dde909da7e117a5c218182)" failed: rpc error: code = 2 desc = Error: No such container: 1c2ce70ed4a8b65aeb2e90611007dcf86c2bc09dd672fb894c41536b2460f

The logs also show that the container cannot be found. Autoscaling may be triggering this failure: "Startup of the worker pool in zone us-central1-f reached 6 workers, but the goal was 30 workers. The service will retry. QUOTA_EXCEEDED: Quota 'IN_USE_ADDRESSES' exceeded. Limit: 100.0 in region us-central1." I simplified the workflow and stopped all other Dataflow jobs in the project, after which the autoscaling issue went away, but the problem remains.

Hi, what are you referring to by encouraging the use of the ReadFromBigQuery transform? There is no transform with that name in the Apache Beam source code.

Uh, I'm sorry :( I think I got confused.
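On the quota error above: IN_USE_ADDRESSES means the job is requesting more worker VMs (each holding an in-use external IP) than the regional quota allows. One common workaround is to cap autoscaling with the --max_num_workers pipeline option; this is a sketch with a hypothetical entry-point name:

```shell
# Hypothetical entry point; --max_num_workers caps Dataflow autoscaling
# so the job stays under the region's IN_USE_ADDRESSES quota.
python my_pipeline.py \
  --runner DataflowRunner \
  --project my-project \
  --region us-central1 \
  --max_num_workers 5
```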
For reference, the relevant dependency pins in the project are:

'google-cloud == 0.27.0',
'google-cloud-storage == 1.3.2',
'google-apitools == 0.5.10'
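If the container errors do point at setup.py (per the first suggestion above), pins like these would normally live in install_requires. A minimal sketch, where the package name and version are placeholders and only the three pins come from the thread:

```python
# Hypothetical setup.py for the Dataflow job; Dataflow runs this on each
# worker to install the job's dependencies at startup.
import setuptools

setuptools.setup(
    name='reconstruction-test',  # placeholder package name
    version='0.0.1',
    packages=setuptools.find_packages(),
    install_requires=[
        'google-cloud == 0.27.0',
        'google-cloud-storage == 1.3.2',
        'google-apitools == 0.5.10',
    ],
)
```

A bad or conflicting pin here can make worker containers fail during startup, which matches the "appears to be stuck" symptom.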