Google Cloud Dataflow job stuck at worker startup with container errors
Tags: google-cloud-dataflow, apache-beam

I'm running a very simple job, and it fails with the following error:
(8a5049d0d5f7569e): Workflow failed. Causes: (8a5049d0d5f750f5): The Dataflow appears to be stuck. You can get help with Cloud Dataflow at https://cloud.google.com/dataflow/support.
The job ID is 2018-01-15-07-42-27-12856142394489592925.
The code I use to run this job is shown below.
(The function ReconstructConversation() returns immediately after entering the function body.)
The kubelet logs in Stackdriver seem to show some container errors:
[ContainerManager]: Fail to get rootfs information unable to find data for container /
Failed to check if disk space is available on the root partition: failed to get fs info for "root": unable to find data for container /
Failed to check if disk space is available for the runtime: failed to get fs info for "runtime": unable to find data for container /
Image garbage collection failed once. Stats initialization may not have completed yet: unable to find data for container /
Any help would be greatly appreciated.
Yiqing, here are a few suggestions:
- If the job shows errors related to container setup, check your setup.py. If you don't find anything obvious there, file a support ticket with Cloud Dataflow support.
- Don't write to BigQuery directly. Instead, we recommend using the WriteToBigQuery transform.
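A minimal sketch of that suggestion with the Beam Python SDK, assuming hypothetical project, dataset, and table names (the pipeline itself is not shown in the question, so the input data here is illustrative):

```python
import apache_beam as beam

# Hypothetical names; replace with your own project, dataset, and table.
TABLE_SPEC = "my-project:my_dataset.my_table"

with beam.Pipeline() as p:
    (p
     | "CreateRows" >> beam.Create([{"user": "alice", "score": 10}])
     | "WriteToBQ" >> beam.io.WriteToBigQuery(
         TABLE_SPEC,
         schema="user:STRING,score:INTEGER",
         create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
         write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
```

Letting WriteToBigQuery handle the write avoids bundling your own BigQuery client (and its dependency pins) into the worker container, which is one common cause of container-setup failures.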
'google-cloud == 0.27.0',
'google-cloud-storage == 1.3.2',
'google-apitools == 0.5.10'
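For reference, a minimal setup.py carrying the pins listed above might look like the following sketch (package name and version are placeholders); it would be passed to the pipeline with the --setup_file option:

```python
import setuptools

# Dependency pins from the question; adjust as needed.
REQUIRED_PACKAGES = [
    'google-cloud == 0.27.0',
    'google-cloud-storage == 1.3.2',
    'google-apitools == 0.5.10',
]

setuptools.setup(
    name='my-dataflow-job',  # placeholder name
    version='0.0.1',
    install_requires=REQUIRED_PACKAGES,
    packages=setuptools.find_packages(),
)
```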