Spark 1.3.1 compilation error (Hadoop)

I am trying to compile Spark 1.3.1 with the following flags:

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 \
  -Dscala-2.11 \
  -Phive -Phive-0.13.1 -Phive-thriftserver \
  -DskipTests clean package

The build fails with the error below. Any suggestions?

Thanks

[INFO] Reactor Summary:
[INFO] 
[INFO] Spark Project Parent POM ........................... SUCCESS [01:08 min]
[INFO] Spark Project Core ................................. SUCCESS [02:38 min]
[INFO] Spark Project Bagel ................................ SUCCESS [ 17.700 s]
[INFO] Spark Project GraphX ............................... SUCCESS [ 35.732 s]
[INFO] Spark Project ML Library ........................... SUCCESS [01:11 min]
[INFO] Spark Project Tools ................................ SUCCESS [  6.718 s]
[INFO] Spark Project Networking ........................... SUCCESS [  6.837 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  3.534 s]
[INFO] Spark Project Streaming ............................ SUCCESS [ 43.771 s]
[INFO] Spark Project Catalyst ............................. SUCCESS [ 48.411 s]
[INFO] Spark Project SQL .................................. SUCCESS [ 56.046 s]
[INFO] Spark Project Hive ................................. SUCCESS [01:01 min]
[INFO] Spark Project Assembly ............................. FAILURE [  6.365 s]
[INFO] Spark Project External Twitter ..................... SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External MQTT ........................ SKIPPED
[INFO] Spark Project External ZeroMQ ...................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project YARN ................................. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED
[INFO] Spark Project Hive Thrift Server ................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 09:45 min
[INFO] Finished at: 2015-05-15T11:25:53-06:00
[INFO] Final Memory: 77M/1176M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project spark-assembly_2.10: Could not resolve dependencies for project org.apache.spark:spark-assembly_2.10:pom:1.3.1: Could not find artifact org.apache.spark:spark-hive-thriftserver_2.11:jar:1.3.1 in central (https://repo1.maven.org/maven2) -> [Help 1]

org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal on project spark-assembly_2.10: Could not resolve dependencies for project org.apache.spark:spark-assembly_2.10:pom:1.3.1: Could not find artifact org.apache.spark:spark-hive-thriftserver_2.11:jar:1.3.1 in central (https://repo1.maven.org/maven2)

It looks like the spark-hive-thriftserver_2.11 dependency does not exist in the Maven repository. I also searched for it manually in the Maven repo and found nothing. I believe the Thrift server is not yet ready for Scala 2.11.
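
A quick way to verify this (my own sketch, not part of the original answer): Maven Central lays artifacts out as groupId/artifactId/version, so the two Scala variants can be probed directly. The _2.10 artifact should be present, while the _2.11 one should come back 404:

# Scala 2.11 thrift server artifact for 1.3.1 - expected to be missing (404)
curl -sI https://repo1.maven.org/maven2/org/apache/spark/spark-hive-thriftserver_2.11/1.3.1/spark-hive-thriftserver_2.11-1.3.1.jar | head -1

# Scala 2.10 thrift server artifact for 1.3.1 - expected to exist (200)
curl -sI https://repo1.maven.org/maven2/org/apache/spark/spark-hive-thriftserver_2.10/1.3.1/spark-hive-thriftserver_2.10-1.3.1.jar | head -1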

Thanks! I switched to Scala 2.10.5 and the build went through without a problem.
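
For reference, a sketch of the adjusted command, assuming a stock Spark 1.3.1 source tree (which builds against Scala 2.10 by default): dropping the -Dscala-2.11 flag keeps the other profiles from the original command, including the thrift server, and lets Maven resolve the published _2.10 artifacts:

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 \
  -Phive -Phive-0.13.1 -Phive-thriftserver \
  -DskipTests clean package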