Java: How to iterate over Spark cogroup values


The final output should be:

handle3 , Marketing , Soap , null , null

Managed to find a workaround:

handle , Products , Makeup , Iphone , 10
handle , Health , Makeup , , Iphone, 10
handle2 , Services , Face , Samsung , 20
handle3 , Marketing , Soap , null , null
JavaPairRDD<String, Tuple2<Ontologies, Optional<Twitter>>> left = ontologiesPair.leftOuterJoin(twitterPairRDD);

    left.foreach(new VoidFunction<Tuple2<String, Tuple2<Ontologies, Optional<Twitter>>>>() {

        @Override
        public void call(Tuple2<String, Tuple2<Ontologies, Optional<Twitter>>> arg0) throws Exception {
            try {
                Optional<Twitter> tweet = arg0._2._2();
                // print values from the tuple, i.e. arg0._2._1() and the tweet object
            } catch (Exception e) {
                Twitter tweet = new Twitter("", -1);
                // print values from arg0._2._1() and an empty tweet object
            }
        }
    });
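
As a side note on the workaround: the right-hand side of a leftOuterJoin comes back as an Optional (Guava's Optional on older Spark releases, org.apache.spark.api.java.Optional on newer ones), so instead of relying on an exception you can check it directly. Something along these lines, keeping the Twitter("", -1) placeholder from the code above:

    Optional<Twitter> maybeTweet = arg0._2()._2();
    Twitter tweet = maybeTweet.isPresent() ? maybeTweet.get() : new Twitter("", -1);
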
But I would still like to know of any answer that uses cogroup.
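
For the cogroup route, a minimal sketch along these lines should work, assuming the same ontologiesPair and twitterPairRDD as above and a Spark version where cogroup returns Iterables. Unlike leftOuterJoin, cogroup hands you all values from both sides per key, so an empty Iterable on the Twitter side is what stands in for the missing match; what exactly you print from the Ontologies and Twitter objects depends on your classes, so toString() is used as a placeholder here:

    JavaPairRDD<String, Tuple2<Iterable<Ontologies>, Iterable<Twitter>>> grouped =
            ontologiesPair.cogroup(twitterPairRDD);

    grouped.foreach(new VoidFunction<Tuple2<String, Tuple2<Iterable<Ontologies>, Iterable<Twitter>>>>() {

        @Override
        public void call(Tuple2<String, Tuple2<Iterable<Ontologies>, Iterable<Twitter>>> arg0) throws Exception {
            Iterable<Ontologies> ontologies = arg0._2()._1();
            Iterable<Twitter> tweets = arg0._2()._2();
            for (Ontologies ontology : ontologies) {
                if (!tweets.iterator().hasNext()) {
                    // no matching tweet for this handle -> the "null , null" row, e.g. handle3
                    System.out.println(arg0._1() + " , " + ontology + " , null , null");
                } else {
                    // one line per (ontology, tweet) pair for handles that do have tweets
                    for (Twitter tweet : tweets) {
                        System.out.println(arg0._1() + " , " + ontology + " , " + tweet);
                    }
                }
            }
        }
    });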
