Google Cloud Dataflow ImportError: No module named options.value_provider

google-cloud-dataflow, apache-beam

The following pipeline works with the DirectRunner, but raises the exception below with the DataflowRunner. How do I debug this kind of error? It seems fairly opaque to me.

import apache_beam as beam

# Pipeline options are passed as command-line-style arguments.
p = beam.Pipeline("DataflowRunner", argv=[
    '--project', project,
    '--staging_location', staging_location,
    '--temp_location', temp_location,
    '--output', output_gcs
])
# Read from BigQuery and write the results to GCS as text.
(p
 | 'read events' >> beam.io.Read(beam.io.BigQuerySource(query=query, use_standard_sql=True))
 | 'write' >> beam.io.WriteToText(output_gcs)
)
p.run().wait_until_finish()
Raises:

File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 578, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 165, in execute
    op.start()
  File "dataflow_worker/operations.py", line 350, in dataflow_worker.operations.DoOperation.start (dataflow_worker/operations.c:13064)
    def start(self):
  File "dataflow_worker/operations.py", line 351, in dataflow_worker.operations.DoOperation.start (dataflow_worker/operations.c:12958)
    with self.scoped_start_state:
  File "dataflow_worker/operations.py", line 356, in dataflow_worker.operations.DoOperation.start (dataflow_worker/operations.c:12159)
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 212, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 277, in loads
    return load(file)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 266, in load
    obj = pik.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1090, in load_global
    klass = self.find_class(module, name)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 423, in find_class
    return StockUnpickler.find_class(self, module, name)
  File "/usr/lib/python2.7/pickle.py", line 1124, in find_class
    __import__(module)
ImportError: No module named options.value_provider

value_provider is a module that was recently introduced to handle templates in the Python SDK. I don't see any templates in your snippet, though, so this is probably a package mismatch. Are you using matching versions for the SDK and the workers? You can check the worker-startup logs for the versions of the installed packages.
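
One way to check for a mismatch is to compare the locally installed package versions against the versions listed in the worker-startup log. A minimal sketch, assuming it is run with the same Python 2 interpreter used to submit the job (the names below are the PyPI distribution names, not the import names):

import pkg_resources

# Print the locally installed versions of the two distributions involved,
# so they can be compared with the worker-startup log entries.
for pkg in ('apache-beam', 'google-cloud-dataflow'):
    try:
        print('%s %s' % (pkg, pkg_resources.get_distribution(pkg).version))
    except pkg_resources.DistributionNotFound:
        print('%s not installed' % pkg)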

I had the same issue. As Maria pointed out, it is a mismatch between the apache_beam and google-cloud-dataflow packages.

To be explicit, the following command fixed it for me:

pip2 install --upgrade apache_beam google-cloud-dataflow

What SDK version are you using?
You were right, the problem was an SDK mismatch. Locally I had installed both apache_beam and google-cloud-dataflow with pip. Uninstalling apache_beam resolved the issue.
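
For what it's worth, a quick sketch to confirm which apache_beam installation the interpreter actually picks up after cleaning up, assuming an apache_beam module is still importable (e.g. pulled in as a dependency of google-cloud-dataflow); the version attribute may not exist in every release:

# Show where apache_beam is imported from and, if available, its version.
import apache_beam
print(apache_beam.__file__)
print(getattr(apache_beam, '__version__', 'no __version__ attribute'))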