
Apache Spark: how to convert a JavaInputDStream of JSON into Elasticsearch (Java)

Tags: apache-spark, elasticsearch, apache-kafka, spark-streaming

Hi everyone, I am working with Kafka > Spark Streaming > Elasticsearch, but I cannot get the JSON from the Spark Streaming JavaInputDStream into Elasticsearch.

My code:

    // Spark configuration: local mode, plus Elasticsearch connector settings
    SparkConf conf = new SparkConf()
            .setAppName("Streaming")
            .setMaster("local")
            .set("es.nodes", "localhost:9200")
            .set("es.index.auto.create", "true");
    JavaStreamingContext streamingContext = new JavaStreamingContext(conf, new Duration(5000));

    // Kafka consumer configuration
    Map<String, Object> kafkaParams = new HashMap<>();
    kafkaParams.put("bootstrap.servers", "localhost:9092");
    kafkaParams.put("key.deserializer", StringDeserializer.class);
    kafkaParams.put("value.deserializer", StringDeserializer.class);
    kafkaParams.put("group.id", "exastax");
    kafkaParams.put("auto.offset.reset", "latest");
    kafkaParams.put("enable.auto.commit", false);

    // Subscribe to the Kafka topic as a direct stream
    Collection<String> topics = Arrays.asList("loglar");
    JavaInputDStream<ConsumerRecord<String, String>> stream =
            KafkaUtils.createDirectStream(
                    streamingContext,
                    LocationStrategies.PreferConsistent(),
                    ConsumerStrategies.<String, String>Subscribe(topics, kafkaParams)
            );

    JavaPairDStream<String, String> finisStream = stream.mapToPair(record -> new Tuple2<>("", record.value()));
    finisStream.print();
    JavaEsSparkStreaming.saveJsonToEs(finisStream, "spark/docs");
    streamingContext.start();
    streamingContext.awaitTermination();


The line JavaEsSparkStreaming.saveJsonToEs(finisStream, "spark/docs") does not work, because finisStream is not a JavaDStream.
How can I convert it to a JavaDStream?

JavaEsSparkStreaming.saveJsonToEs works with a JavaDStream, while JavaEsSparkStreaming.saveToEsWithMeta works with a JavaPairDStream.

To fix your code:

JavaDStream<String> jsonStream = finisStream.map(new Function<Tuple2<String, String>, String>() {
    public String call(Tuple2<String, String> tuple) throws Exception {
        return tuple._2();  // keep only the JSON value from each (key, value) pair
    }
});

JavaEsSparkStreaming.saveJsonToEs(jsonStream, "spark/docs");
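For the saveToEsWithMeta variant mentioned above, a minimal sketch could look as follows. This is an assumption, not part of the original answer: it reuses the question's stream variable and treats each Kafka record's key as the per-document metadata (for example, the Elasticsearch document id).

```java
// Hypothetical sketch: pair each record's key with its JSON value so the key
// acts as document metadata when writing to the "spark/docs" resource.
JavaPairDStream<String, String> docsWithMeta = stream.mapToPair(
        record -> new Tuple2<>(record.key(), record.value()));
JavaEsSparkStreaming.saveToEsWithMeta(docsWithMeta, "spark/docs");
```

Note this only makes sense if your Kafka records actually carry a non-null key to use as the id.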


Thanks for the answers! I solved the problem like this:

    // Map each Kafka ConsumerRecord to its JSON value, then index into Elasticsearch
    JavaDStream<String> stream1 = stream.map(
            new Function<ConsumerRecord<String, String>, String>() {
                @Override
                public String call(ConsumerRecord<String, String> r) {
                    return r.value();
                }
            }
    );
    JavaEsSparkStreaming.saveJsonToEs(stream1, "spark/docs");
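With Java 8 lambdas, the same conversion can be written more compactly; a sketch equivalent to the anonymous Function above (jsonStream is just an illustrative name):

```java
// Same logic as above: keep only each ConsumerRecord's JSON value.
JavaDStream<String> jsonStream = stream.map(record -> record.value());
JavaEsSparkStreaming.saveJsonToEs(jsonStream, "spark/docs");
```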