Apache Spark: how do I correctly build Spark 2.0 from source so that it includes PySpark?


I just built Spark 2.0 on an Ubuntu host using "sbt assembly". Everything went fine, but when I try to submit a PySpark job:

bin/spark-submit --master spark://localhost:7077 examples/src/main/python/pi.py 1000
I get this error:

Failed to find Spark jars directory (/home/ubuntu/spark/spark-2.0.0/assembly/target/scala-2.10/jars).
You need to build Spark with the target "package" before running this program.
What do I need to do to rebuild Spark 2.0 so that it includes PySpark?

Try:

  • Build with the "package" target, as the error message suggests, instead of "assembly":

    git clone https://github.com/apache/spark.git
    cd spark
    git checkout v2.0.0
    sbt package
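
After the build completes, a quick way to check that PySpark is included is to look for the jars directory that the launcher complained about and then re-run the original submission command. A minimal sketch, assuming the commands are run from the spark checkout above (the scala-2.x part of the path depends on the Scala version the build used):

    # check that the jars directory the launcher was looking for now exists
    ls assembly/target/scala-2.*/jars

    # re-run the original job to confirm PySpark works
    bin/spark-submit --master spark://localhost:7077 examples/src/main/python/pi.py 1000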