Apache Flink: How do I write data from Apache Flink to Azure Blob Storage?


I am trying to write data to Azure Blob Storage from the IntelliJ IDE using Flink's StreamingFileSink, but I am getting the following errors:

Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
Caused by: java.lang.ClassNotFoundException: org.apache.flink.fs.azure.common.hadoop.HadoopRecoverableWriter
Caused by: java.lang.NoClassDefFoundError: org/apache/flink/fs/azure/common/hadoop/HadoopRecoverableWriter
Below is my code:

import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.common.serialization.SimpleStringEncoder;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.core.fs.FileSystem;
import org.apache.flink.core.fs.Path;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.sink.filesystem.StreamingFileSink;
import org.apache.flink.streaming.api.functions.sink.filesystem.rollingpolicies.DefaultRollingPolicy;
import org.apache.flink.util.Collector;

import java.util.concurrent.TimeUnit;

public class BlobSample {

    public static void main(String[] args) throws Exception {

        //System.setProperty("hadoop.home.dir", "/");

        // Register the Azure storage account key with Flink's file system factory.
        Configuration cfg = new Configuration();
        cfg.setString("fs.azure.account.key.azurekey.blob.core.windows.net",
                "azure_blob_key");
        //cfg.setBoolean("recursive.file.enumeration", true);
        FileSystem.initialize(cfg, null);

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<String> input = env.fromElements("hello");

        DataStream<String> output = input.flatMap(new FlatMapFunction<String, String>() {
            @Override
            public void flatMap(String value, Collector<String> out) throws Exception {
                out.collect(value.concat(" world"));
            }
        });

        // output.writeAsText("wasbs://container@myazure.blob.core.windows.net/output");

        String outputPath = "wasbs://container@azurekey.blob.core.windows.net/rawsignals";

        // Row-format sink that rolls part files by time and size.
        final StreamingFileSink<String> sink = StreamingFileSink
                .forRowFormat(new Path(outputPath), new SimpleStringEncoder<String>("UTF-8"))
                .withRollingPolicy(
                        DefaultRollingPolicy.builder()
                                .withRolloverInterval(TimeUnit.MINUTES.toMillis(15))
                                .withInactivityInterval(TimeUnit.MINUTES.toMillis(5))
                                .withMaxPartSize(100)
                                .build())
                .build();

        output.addSink(sink);

        env.execute("BlobStorage");
    }
}
I also tried using writeAsText, but it did not work either. I added HADOOP_HOME to my environment variables, and I added this dependency to build.gradle: compile group: 'org.apache.flink', name: 'flink-azure-fs-hadoop', version: '1.11.2'. I also added the Azure account key to flink-conf.yaml, but it still does not work. Please help me resolve this issue.
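
For reference, here is roughly what my build.gradle dependency and flink-conf.yaml entries look like (the account name azurekey and the key value are placeholders for my real values):

// build.gradle (Groovy DSL)
compile group: 'org.apache.flink', name: 'flink-azure-fs-hadoop', version: '1.11.2'

# flink-conf.yaml
fs.azure.account.key.azurekey.blob.core.windows.net: azure_blob_key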