
Apache Spark - JavaSparkContext cannot be converted to SparkContext error


I've been having considerable difficulty turning the Spark examples into runnable code (as evidenced by my previous question).

The answer provided there helped me resolve that particular example, but now I'm trying another one and immediately ran into an error.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.*;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel;
import org.apache.spark.ml.classification.MultilayerPerceptronClassifier;
import org.apache.spark.ml.evaluation.MulticlassClassificationEvaluator;
import org.apache.spark.ml.param.ParamMap;
import org.apache.spark.mllib.regression.LabeledPoint;
import org.apache.spark.mllib.util.MLUtils;
import org.apache.spark.mllib.linalg.Vectors;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SQLContext;

// Load training data
public class SimpleANN {
  public static void main(String[] args) {
    String path = "file:/usr/local/share/spark-1.5.0/data/mllib/sample_multiclass_classification_data.txt";
    SparkConf conf = new SparkConf().setAppName("Simple ANN");
    JavaSparkContext sc = new JavaSparkContext(conf);
    JavaRDD<LabeledPoint> data = MLUtils.loadLibSVMFile(sc, path).toJavaRDD();
    ...
    ...
  }
}
I get the following error:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project simple-ann: Compilation failure
[ERROR] /Users/robo/study/spark/ann/src/main/java/SimpleANN.java:[23,61] incompatible types: org.apache.spark.api.java.JavaSparkContext cannot be converted to org.apache.spark.SparkContext

If you need the SparkContext from a JavaSparkContext, you can use the static method:

JavaSparkContext.toSparkContext(yourJavaSparkContextBean)

So you have to modify your code from:

JavaSparkContext sc = new JavaSparkContext(conf);
JavaRDD<LabeledPoint> data = MLUtils.loadLibSVMFile(sc, path).toJavaRDD();

to:

JavaSparkContext sc = new JavaSparkContext(conf);
JavaRDD<LabeledPoint> data = MLUtils.loadLibSVMFile(
    JavaSparkContext.toSparkContext(sc),
    path).toJavaRDD();
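The underlying cause is worth spelling out: JavaSparkContext wraps a SparkContext rather than extending it, so Java's type system rejects passing one where the other is expected, and an explicit conversion is needed. A minimal plain-Java sketch of the same situation (the Engine/CarWithEngine names are hypothetical illustrations, no Spark required):

```java
// A wrapper class is not a subtype of the class it wraps,
// so it cannot be passed where the wrapped type is expected.
class Engine {
    int power() { return 42; }
}

class CarWithEngine {
    private final Engine engine = new Engine();

    // Explicit accessor, analogous to JavaSparkContext.toSparkContext(...)
    Engine toEngine() { return engine; }
}

public class WrapperDemo {
    // A method that, like MLUtils.loadLibSVMFile, wants the wrapped type.
    static int usePower(Engine e) { return e.power(); }

    public static void main(String[] args) {
        CarWithEngine car = new CarWithEngine();
        // usePower(car);  // would not compile: incompatible types
        System.out.println(usePower(car.toEngine())); // prints 42
    }
}
```

Note also that a JavaSparkContext instance exposes the wrapped context via its sc() method, which gives the same result as the static toSparkContext call.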
