Hadoop Spark 1.2.1 failed to build the assembly project

Tags: hadoop, apache-spark, hive, apache-spark-1.2

I just downloaded Spark 1.2.1, and compilation fails in the assembly project with the following error:

The requested profile "hadoop-2.6" could not be activated because it does not exist.
[ERROR] Failed to execute goal on project spark-assembly_2.10: Could not resolve dependencies for project org.apache.spark:spark-assembly_2.10:pom:1.2.1: Failure to find org.apache.spark:spark-hive-thriftserver_2.11:jar:1.2.1
Here is the environment:

  • Hadoop 2.6.0, installed via brew
  • Hive 0.14.0, installed via brew
  • Spark 1.2.1 downloaded as a tgz, because brew complained about the beeline binary conflicting
  • Scala 2.11, installed via brew
  • sbt 0.13.7, installed via brew
  • I compile the Spark distribution with the following arguments: mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -Dscala-2.11 -DskipTests clean package

    Reactor Summary:
    [INFO] 
    [INFO] Spark Project Parent POM .......................... SUCCESS [  3.525 s]
    [INFO] Spark Project Core ................................ SUCCESS [02:56 min]
    [INFO] Spark Project Bagel ............................... SUCCESS [ 17.102 s]
    [INFO] Spark Project GraphX .............................. SUCCESS [ 45.246 s]
    [INFO] Spark Project ML Library .......................... SUCCESS [01:22 min]
    [INFO] Spark Project Tools ............................... SUCCESS [ 11.457 s]
    [INFO] Spark Project Networking .......................... SUCCESS [  6.121 s]
    [INFO] Spark Project Shuffle Streaming Service ........... SUCCESS [  5.642 s]
    [INFO] Spark Project Streaming ........................... SUCCESS [01:19 min]
    [INFO] Spark Project Catalyst ............................ SUCCESS [01:27 min]
    [INFO] Spark Project SQL ................................. SUCCESS [01:19 min]
    [INFO] Spark Project Hive ................................ SUCCESS [01:20 min]
    [INFO] Spark Project Assembly ............................ FAILURE [  0.396 s]
    [INFO] Spark Project External Twitter .................... SKIPPED
    [INFO] Spark Project External Flume ...................... SKIPPED
    [INFO] Spark Project External Flume Sink ................. SKIPPED
    [INFO] Spark Project External MQTT ....................... SKIPPED
    [INFO] Spark Project External ZeroMQ ..................... SKIPPED
    [INFO] Spark Project Examples ............................ SKIPPED
    [INFO] Spark Project REPL ................................ SKIPPED
    [INFO] Spark Project YARN Parent POM ..................... SKIPPED
    [INFO] Spark Project YARN Stable API ..................... SKIPPED
    [INFO] Spark Project YARN Shuffle Service ................ SKIPPED
    [INFO] Spark Project Hive Thrift Server .................. SKIPPED
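One way to check which Maven profiles a Spark source tree actually defines (the warning above says `hadoop-2.6` "does not exist") is the standard Maven `help` plugin. This is a generic diagnostic, not something from the original post:

```shell
# Run from the root of the unpacked Spark 1.2.1 source tree:
# list every profile defined in the POM hierarchy.
mvn help:all-profiles

# In the Spark 1.2.x POMs the Hadoop profiles stop at hadoop-2.4,
# which is why -Phadoop-2.6 triggers the "could not be activated"
# warning (the -Dhadoop.version property still controls the actual
# Hadoop dependency version).
```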
    
    Am I missing something? I don't want to install Apache Spark via brew, because I would have to unlink Hive, which I also want to use.


    Thanks

    Try using the hadoop-2.4 profile, but keep hadoop.version at 2.6.0:

    mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -Dscala-2.11 -DskipTests clean package
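The `_2.11` suffix in the missing artifact points at the Scala 2.11 cross-build. In Spark 1.2, building for Scala 2.11 requires first running a version-change script, and the build docs note that the JDBC/Thrift-server component is not supported on 2.11. A hedged sketch of that workflow, assuming a Spark 1.2.1 source checkout:

```shell
# Switch the build's artifact suffixes from _2.10 to _2.11
# (script shipped in the Spark 1.2.x source tree):
dev/change-version-to-2.11.sh

# Build without -Phive-thriftserver: the Thrift server module was
# not supported (or published) for Scala 2.11 in the 1.2.x line.
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive \
    -Dscala-2.11 -DskipTests clean package
```

Alternatively, keeping the default Scala 2.10 build (dropping `-Dscala-2.11`) should let `-Phive-thriftserver` resolve, since the `_2.10` artifacts exist.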
    

    Source:

    Thanks, but that didn't work. Same error; the problem seems to be a missing dependency: Failure to find org.apache.spark:spark-hive-thriftserver_2.11:jar:1.2.1
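To confirm that the missing artifact was never published to Maven Central (rather than this being a corrupted local repository), the dependency plugin's `get` goal can try to resolve it directly. This is a generic diagnostic, not part of the original thread:

```shell
# Attempt to resolve the artifact straight from the configured
# remote repositories; for the _2.11 Thrift server in the 1.2.x
# line this is expected to fail.
mvn dependency:get \
    -Dartifact=org.apache.spark:spark-hive-thriftserver_2.11:1.2.1

# The Scala 2.10 variant should resolve, consistent with 2.11
# support in Spark 1.2 excluding the Thrift server module.
mvn dependency:get \
    -Dartifact=org.apache.spark:spark-hive-thriftserver_2.10:1.2.1
```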