
How to filter records from a JavaPairRDD


I'm working through a simple word-count example in Apache Spark. Now that I finally have the word counts, I want to filter out just the unique words (those that occur exactly once).

import java.util.Arrays;

import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.api.java.function.PairFunction;

import scala.Tuple2;

public class SparkClass {
    public static void main(String[] args) {

        String file = "/home/bhaumik/Documents/my";
        JavaSparkContext sc = new JavaSparkContext("local", "SimpleApp");

        // Split each line into words
        JavaRDD<String> lines = sc.textFile("/home/bhaumik/Documents/myText", 5)
                .flatMap(new FlatMapFunction<String, String>() {
                    @Override
                    public Iterable<String> call(String t) throws Exception {
                        return Arrays.asList(t.split(" "));
                    }
                });

        // Map each word to a (word, 1) pair
        JavaPairRDD<String, Integer> pairs = lines.mapToPair(new PairFunction<String, String, Integer>() {
            @Override
            public Tuple2<String, Integer> call(String t) throws Exception {
                return new Tuple2<String, Integer>(t, 1);
            }
        });

        // Sum the counts per word
        JavaPairRDD<String, Integer> counts = pairs.reduceByKey(new Function2<Integer, Integer, Integer>() {
            @Override
            public Integer call(Integer v1, Integer v2) throws Exception {
                return v1 + v2;
            }
        });
    }
}

In `counts` you have an RDD with each key and its number of occurrences. What you don't have yet is the minimum, so you should reduce:

Tuple2<String, Integer> minApp = counts.reduce((a, b) -> (a._2 > b._2)? b : a);
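The min-by-value reduce above can be sanity-checked locally without a cluster. A minimal sketch with plain Java streams, using the same `(a, b) -> (a._2 > b._2) ? b : a` logic (the sample word counts are invented for illustration):

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.List;
import java.util.Map;

public class MinCountDemo {
    // Mirrors counts.reduce((a, b) -> (a._2 > b._2) ? b : a) on a local list
    static Map.Entry<String, Integer> minByCount(List<Map.Entry<String, Integer>> counts) {
        return counts.stream()
                .reduce((a, b) -> (a.getValue() > b.getValue()) ? b : a)
                .orElseThrow(() -> new IllegalArgumentException("empty counts"));
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> counts = List.of(
                new SimpleEntry<>("spark", 3),
                new SimpleEntry<>("java", 1),
                new SimpleEntry<>("rdd", 2));
        System.out.println(minByCount(counts)); // java=1
    }
}
```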
JavaPairRDD<String, Integer> uniqueIP = counts.filter(new Function<Tuple2<String, Integer>, Boolean>() {
    @Override
    public Boolean call(Tuple2<String, Integer> v1) throws Exception {
        return v1._2.equals(1);
    }
});
This is how I solved my problem…
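The count-then-keep-where-count-equals-1 logic behind that filter can be reproduced with plain Java collections, which makes it easy to verify without Spark. A sketch (the sample sentence is made up for illustration):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;
import java.util.TreeSet;

public class UniqueWordsDemo {
    // Counts words, then keeps only those whose count equals 1 --
    // the same predicate as v1._2.equals(1) in the Spark filter.
    static Set<String> uniqueWords(String text) {
        Map<String, Integer> counts = new HashMap<>();
        for (String word : text.split(" ")) {
            counts.merge(word, 1, Integer::sum);
        }
        Set<String> unique = new TreeSet<>();
        for (Map.Entry<String, Integer> e : counts.entrySet()) {
            if (e.getValue().equals(1)) {
                unique.add(e.getKey());
            }
        }
        return unique;
    }

    public static void main(String[] args) {
        System.out.println(uniqueWords("to be or not to be")); // [not, or]
    }
}
```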



Isn't it possible to put it into a JavaPairRDD, or is there any way to convert the Tuple2 into a JavaPairRDD?? — In this case that doesn't make much sense. Another option would be to sort the whole RDD and take the first element. — Actually I'm doing the same thing on a log file: I extract the IPs from it, and now I want the unique IPs, that's why I'm asking :( Thanks for your help and effort.
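The "sort the whole RDD and take the first element" alternative mentioned in the comments can likewise be sketched locally. It gives the same result as the reduce-based minimum, at the cost of a full sort (the sample IP counts are invented):

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.List;
import java.util.Map;

public class SortForMinDemo {
    // Sorts all entries by count ascending and takes the head --
    // equivalent to the reduce-based minimum, but O(n log n) instead of O(n).
    static Map.Entry<String, Integer> minBySort(List<Map.Entry<String, Integer>> counts) {
        return counts.stream()
                .sorted(Map.Entry.comparingByValue())
                .findFirst()
                .orElseThrow(() -> new IllegalArgumentException("empty counts"));
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> counts = List.of(
                new SimpleEntry<>("10.0.0.1", 4),
                new SimpleEntry<>("10.0.0.2", 1));
        System.out.println(minBySort(counts)); // 10.0.0.2=1
    }
}
```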