Java: Error importing Spark dependencies in IntelliJ IDEA

Tags: java, apache, maven, intellij-idea

I have set up IntelliJ IDEA with Maven, but I am getting errors on the following lines:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
I am trying to run the following example:

package com.spark.hello;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

public class Hello {

    public static void main(String[] args) {
        String logFile = "F:\\Spark\\a.java";
        SparkConf conf = new SparkConf().setAppName("Simple Application");
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaRDD<String> logData = sc.textFile(logFile).cache();

        // Count the lines containing "a"
        long numAs = logData.filter(new Function<String, Boolean>() {
            public Boolean call(String s) { return s.contains("a"); }
        }).count();

        // Count the lines containing "b"
        long numBs = logData.filter(new Function<String, Boolean>() {
            public Boolean call(String s) { return s.contains("b"); }
        }).count();

        System.out.println("Lines with a: " + numAs + ", lines with b: " + numBs);
    }
}

Please help me resolve this issue, or is there some other way to run this kind of project?

Without seeing the errors, my guess is that the IDE is telling you they are unused imports. Make sure to double-check your dependencies and their versions.
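
For reference, the Spark core artifact needs to be declared in your pom.xml for those imports to resolve. A minimal sketch of what that entry might look like; the Scala suffix (2.10) and version (1.6.0) are assumptions here, so match them to the Spark build you actually target:

<!-- Hypothetical pom.xml entry; adjust the Scala suffix and version to your setup -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.0</version>
</dependency>

After adding this, re-import the Maven project (the refresh icon in the Maven tool window) so IntelliJ picks up the new dependency.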

Alt+Enter is the shortcut I use to fix many of these problems.
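
Also note that if you want to run the class directly from the IDE rather than through spark-submit, the SparkConf needs a master URL or the context will fail at startup. A minimal sketch, assuming a local run:

// Assumption: running locally inside the IDE instead of submitting to a cluster
SparkConf conf = new SparkConf()
        .setAppName("Simple Application")
        .setMaster("local[*]"); // use all available local cores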