
Apache Spark: spark-jobserver fails to build with Spark 2.0


I am trying to run spark-jobserver with Spark 2.0. I cloned the spark-2.0-preview branch from the GitHub repository and followed the deployment guide, but when I try to deploy the server with bin/server_deploy.sh I get a compilation error:

 Error:
[error] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestLoaderJob.java:4: cannot find symbol
[error] symbol: class DataFrame
[error] location: package org.apache.spark.sql
[error] import org.apache.spark.sql.DataFrame;
[error] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestJob.java:13: java.lang.Object cannot be converted to org.apache.spark.sql.Row[]
[error] return sc.sql(data.getString("sql")).collect();
[error] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestLoaderJob.java:25: cannot find symbol
[error] symbol: class DataFrame
[error] location: class spark.jobserver.JHiveTestLoaderJob
[error] final DataFrame addrRdd = sc.sql("SELECT * FROM default.test_addresses");
[error] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JSqlTestJob.java:13: array required, but java.lang.Object found
[error] Row row = sc.sql("select 1+1").take(1)[0];
[info] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestJob.java: Some input files use or override a deprecated API.
[info] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestJob.java: Recompile with -Xlint:deprecation for details.
[error] (job-server-extras/compile:compileIncremental) javac returned nonzero exit code

Did I forget to add some dependency?

I had a similar problem. I found out that it is a bug caused by the Spark API changes between 1.x and 2.x. You can find an open issue about it on GitHub.

I introduced some quick fixes that resolved the issue and was able to deploy the jobserver. I have submitted a pull request for it.
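For context, the kind of change involved is small but mechanical: in the Spark 2.0 Java API, DataFrame was removed (Dataset<Row> is the visible type), and collect()/take() are type-erased to Object from Java, so the typed collectAsList()/takeAsList() variants are the usual replacements. Below is a minimal sketch of what the three failing snippets could look like against Spark 2.0; the class name Spark2ApiSketch, the wrapper methods, and the use of SQLContext as a stand-in for the job's context are illustrative assumptions, not the actual pull request.

    import java.util.List;

    import org.apache.spark.sql.Dataset;    // DataFrame no longer exists as a Java class in 2.x
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SQLContext; // stand-in for the HiveContext used by the test jobs

    public class Spark2ApiSketch {

        // JHiveTestLoaderJob: DataFrame was a concrete class in 1.x;
        // in 2.x the Java-visible type is Dataset<Row>.
        static Dataset<Row> loadAddresses(SQLContext sc) {
            final Dataset<Row> addrRdd = sc.sql("SELECT * FROM default.test_addresses");
            return addrRdd;
        }

        // JHiveTestJob: collect() is erased to Object from Java, so it cannot be
        // assigned to Row[]; collectAsList() returns a typed List<Row> instead.
        static List<Row> runSql(SQLContext sc, String sql) {
            return sc.sql(sql).collectAsList();
        }

        // JSqlTestJob: take(1)[0] no longer compiles for the same erasure reason;
        // takeAsList(1).get(0) is the 2.x equivalent.
        static Row firstRow(SQLContext sc) {
            Row row = sc.sql("select 1+1").takeAsList(1).get(0);
            return row;
        }
    }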