Google Cloud Dataflow Apache Beam 2.2: NoSuchMethodError on pipeline.apply


pom.xml

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public static void main(String[] args) {
        //Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).withValidation().create());
        DataflowPipelineOptions options = PipelineOptionsFactory.as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);
        options.setStagingLocation("gs://bucketname/stageapache");
        options.setTempLocation("gs://bucketname/stageapachetemp");
        options.setProject("projectid");
        Pipeline p = Pipeline.create(options);
        p.apply(TextIO.read().from("gs://bucketname/filename.csv"));
        //p.apply(FileIO.match().filepattern("gs://bucketname/f.csv"));
        p.run();
    }
In the code above, adding the FileIO/TextIO line produces the error below when I run the pipeline. Without that line the job is created, but it fails because it contains no operations. I migrated to Apache Beam 2.2 to gain control over which files are read from storage during development.

Any help would be appreciated.


Thanks

The problem is that your pom.xml depends on different components of the Beam SDK at different versions: beam-sdks-java-core is at version 2.2.0, but beam-sdks-java-io-google-cloud-platform and beam-runners-google-cloud-dataflow-java are at version 2.0.0. They all need to be at the same version.


<dependency>
    <groupId>org.apache.beam</groupId>
    <artifactId>beam-sdks-java-core</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.beam</groupId>
    <artifactId>beam-sdks-java-io-google-cloud-platform</artifactId>
    <version>2.0.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.beam/beam-runners-google-cloud-dataflow-java -->
<dependency>
    <groupId>org.apache.beam</groupId>
    <artifactId>beam-runners-google-cloud-dataflow-java</artifactId>
    <version>2.0.0</version>
</dependency>
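A common way to keep the Beam artifacts in lock-step is to declare the version once as a Maven property, so a future upgrade changes every artifact together. A sketch of the corrected dependencies (assuming 2.2.0 is the target version; the property name `beam.version` is a convention, not a requirement):

```xml
<properties>
  <beam.version>2.2.0</beam.version>
</properties>

<dependencies>
  <dependency>
    <groupId>org.apache.beam</groupId>
    <artifactId>beam-sdks-java-core</artifactId>
    <version>${beam.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.beam</groupId>
    <artifactId>beam-sdks-java-io-google-cloud-platform</artifactId>
    <version>${beam.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.beam</groupId>
    <artifactId>beam-runners-google-cloud-dataflow-java</artifactId>
    <version>${beam.version}</version>
  </dependency>
</dependencies>
```

After the change, running `mvn dependency:tree` can confirm that no dependency still resolves to a 2.0.0 Beam artifact.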
Dec 08, 2017 5:09:35 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 85 files. Enable logging at DEBUG level to see which files will be staged.
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.beam.sdk.values.PCollection.createPrimitiveOutputInternal(Lorg/apache/beam/sdk/Pipeline;Lorg/apache/beam/sdk/values/WindowingStrategy;Lorg/apache/beam/sdk/values/PCollection$IsBounded;)Lorg/apache/beam/sdk/values/PCollection;
    at org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory$ParDoSingle.expand(PrimitiveParDoSingleFactory.java:68)
    at org.apache.beam.runners.dataflow.PrimitiveParDoSingleFactory$ParDoSingle.expand(PrimitiveParDoSingleFactory.java:58)
    at org.apache.beam.sdk.Pipeline.applyReplacement(Pipeline.java:550)
    at org.apache.beam.sdk.Pipeline.replace(Pipeline.java:280)
    at org.apache.beam.sdk.Pipeline.replaceAll(Pipeline.java:201)
    at org.apache.beam.runners.dataflow.DataflowRunner.replaceTransforms(DataflowRunner.java:688)
    at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:498)
    at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:153)
    at org.apache.beam.sdk.Pipeline.run(Pipeline.java:303)
    at org.apache.beam.sdk.Pipeline.run(Pipeline.java:289)
    at com.pearson.apachebeam.StarterPipeline.main(StarterPipeline.java:60)