
How to build Spark 1.2 with Maven (gives java.io.IOException: Cannot run program "javac")?

Tags: maven, apache-spark, pyspark

I am trying to build Spark 1.2 with Maven. My goal is to use PySpark with YARN on Hadoop 2.2.

I found that this can only be achieved by building Spark with Maven. First of all, is that true?

If it is, what is the problem in the log below, and how do I fix it?

C:\Spark\spark-1.2.0>mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests
clean package
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] Spark Project Parent POM
[INFO] Spark Project Networking
[INFO] Spark Project Shuffle Streaming Service
[INFO] Spark Project Core
[INFO] Spark Project Bagel
[INFO] Spark Project GraphX
[INFO] Spark Project Streaming
[INFO] Spark Project Catalyst
[INFO] Spark Project SQL
[INFO] Spark Project ML Library
[INFO] Spark Project Tools
[INFO] Spark Project Hive
[INFO] Spark Project REPL
[INFO] Spark Project YARN Parent POM
[INFO] Spark Project YARN Stable API
[INFO] Spark Project Assembly
[INFO] Spark Project External Twitter
[INFO] Spark Project External Flume Sink
[INFO] Spark Project External Flume
[INFO] Spark Project External MQTT
[INFO] Spark Project External ZeroMQ
[INFO] Spark Project External Kafka
[INFO] Spark Project Examples
[INFO] Spark Project YARN Shuffle Service
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project Parent POM 1.2.0
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ spark-parent ---
[INFO] Deleting C:\Spark\spark-1.2.0\target
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-versions) @ spark-parent
 ---
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-scala-sources) @ spark-
parent ---
[INFO] Source directory: C:\Spark\spark-1.2.0\src\main\scala added.
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-parent --
-
[INFO]
[INFO] --- scala-maven-plugin:3.2.0:compile (scala-compile-first) @ spark-parent
 ---
[INFO] No sources to compile
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-test-source (add-scala-test-sources
) @ spark-parent ---
[INFO] Test Source directory: C:\Spark\spark-1.2.0\src\test\scala added.
[INFO]
[INFO] --- scala-maven-plugin:3.2.0:testCompile (scala-test-compile-first) @ spa
rk-parent ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-dependency-plugin:2.9:build-classpath (default) @ spark-parent
---
[INFO] Wrote classpath file 'C:\Spark\spark-1.2.0\target\spark-test-classpath.tx
t'.
[INFO]
[INFO] --- gmavenplus-plugin:1.2:execute (default) @ spark-parent ---
[INFO] Using Groovy 2.3.7 to perform execute.
[INFO]
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ spark-p
arent ---
[INFO]
[INFO] --- maven-shade-plugin:2.2:shade (default) @ spark-parent ---
[INFO] Including org.spark-project.spark:unused:jar:1.0.0 in the shaded jar.
[INFO] Replacing original artifact with shaded artifact.
[INFO]
[INFO] --- maven-source-plugin:2.2.1:jar-no-fork (create-source-jar) @ spark-par
ent ---
[INFO]
[INFO] --- scalastyle-maven-plugin:0.4.0:check (default) @ spark-parent ---
[WARNING] sourceDirectory is not specified or does not exist value=C:\Spark\spar
k-1.2.0\src\main\scala
Saving to outputFile=C:\Spark\spark-1.2.0\scalastyle-output.xml
Processed 0 file(s)
Found 0 errors
Found 0 warnings
Found 0 infos
Finished in 32 ms
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project Networking 1.2.0
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ spark-network-common_2
.10 ---
[INFO] Deleting C:\Spark\spark-1.2.0\network\common\target
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-versions) @ spark-networ
k-common_2.10 ---
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-scala-sources) @ spark-
network-common_2.10 ---
[INFO] Source directory: C:\Spark\spark-1.2.0\network\common\src\main\scala adde
d.
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-network-c
ommon_2.10 ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-netw
ork-common_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory C:\Spark\spark-1.2.0\network\common\s
rc\main\resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.0:compile (scala-compile-first) @ spark-networ
k-common_2.10 ---
[WARNING] Zinc server is not available at port 3030 - reverting to normal increm
ental compile
[INFO] Using incremental compilation
[INFO] compiler plugin: BasicArtifact(org.scalamacros,paradise_2.10.4,2.0.1,null
)
[INFO] Compiling 42 Java sources to C:\Spark\spark-1.2.0\network\common\target\s
cala-2.10\classes...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [  5.267 s]
[INFO] Spark Project Networking ........................... FAILURE [  1.922 s]
[INFO] Spark Project Shuffle Streaming Service ............ SKIPPED
[INFO] Spark Project Core ................................. SKIPPED
[INFO] Spark Project Bagel ................................ SKIPPED
[INFO] Spark Project GraphX ............................... SKIPPED
[INFO] Spark Project Streaming ............................ SKIPPED
[INFO] Spark Project Catalyst ............................. SKIPPED
[INFO] Spark Project SQL .................................. SKIPPED
[INFO] Spark Project ML Library ........................... SKIPPED
[INFO] Spark Project Tools ................................ SKIPPED
[INFO] Spark Project Hive ................................. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project YARN Parent POM ...................... SKIPPED
[INFO] Spark Project YARN Stable API ...................... SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Twitter ..................... SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External MQTT ........................ SKIPPED
[INFO] Spark Project External ZeroMQ ...................... SKIPPED
[INFO] Spark Project External Kafka ....................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 8.048 s
[INFO] Finished at: 2015-02-09T10:17:47+08:00
[INFO] Final Memory: 49M/331M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compi
le (scala-compile-first) on project spark-network-common_2.10: wrap: java.io.IOE
xception: Cannot run program "javac": CreateProcess error=2, The system cannot f
ind the file specified -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e swit
ch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please rea
d the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionE
xception
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command

[ERROR]   mvn <goals> -rf :spark-network-common_2.10
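
The failing goal is the Java compile step, so javac has to be reachable from the prompt that runs mvn; "CreateProcess error=2, The system cannot find the file specified" means Windows could not find a javac executable at all. Below is a minimal check from cmd.exe, assuming the JDK is installed somewhere like C:\Program Files\Java\jdk1.7.0_71 (that path is only a placeholder, not taken from the log):

REM Is javac on the PATH, and where does JAVA_HOME point?
C:\Spark\spark-1.2.0>where javac
C:\Spark\spark-1.2.0>echo %JAVA_HOME%

REM If javac is not found, point JAVA_HOME at a full JDK (not a JRE)
REM and put its bin directory on the PATH for this session.
REM The jdk1.7.0_71 path is only an example; adjust to the installed JDK.
C:\Spark\spark-1.2.0>set JAVA_HOME=C:\Program Files\Java\jdk1.7.0_71
C:\Spark\spark-1.2.0>set PATH=%JAVA_HOME%\bin;%PATH%
C:\Spark\spark-1.2.0>javac -version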