TensorFlow on Hadoop/Spark with TensorFlowOnSpark


I am new to Spark and TensorFlow and am trying to run a simple example on a cluster. I am using Spark 2.1.0 and Hadoop 2.7.4 (the Spark build for Hadoop 2.7).

I installed Hadoop/Spark on Google Cloud, on 3 machines, and it seems to handle other tasks just fine.

I am trying to execute:

${SPARK_HOME}/bin/spark-submit --class org.apache.spark.examples.SparkPi \
  --master yarn --deploy-mode cluster \
  --driver-memory 1g lib/spark-examples*.jar 10
and I get the following error:

Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.hadoop.yarn.util.Apps.crossPlatformify(Ljava/lang/String;)Ljava/lang/String;
  at org.apache.hadoop.mapreduce.MRJobConfig.<clinit>(MRJobConfig.java:695)
  at sun.misc.Unsafe.ensureClassInitialized(Native Method)
  at sun.reflect.UnsafeFieldAccessorFactory.newFieldAccessor(UnsafeFieldAccessorFactory.java:43)
  at sun.reflect.ReflectionFactory.newFieldAccessor(ReflectionFactory.java:156)
  at java.lang.reflect.Field.acquireFieldAccessor(Field.java:1088)
  at java.lang.reflect.Field.getFieldAccessor(Field.java:1069)
  at java.lang.reflect.Field.get(Field.java:393)
  at org.apache.spark.deploy.yarn.Client$$anonfun$24.apply(Client.scala:1305)
  at org.apache.spark.deploy.yarn.Client$$anonfun$24.apply(Client.scala:1303)
  at scala.util.Try$.apply(Try.scala:192)
  at org.apache.spark.deploy.yarn.Client$.getDefaultMRApplicationClasspath(Client.scala:1303)
  at org.apache.spark.deploy.yarn.Client$.getMRAppClasspath(Client.scala:1280)
  at org.apache.spark.deploy.yarn.Client$.populateHadoopClasspath(Client.scala:1265)
  at org.apache.spark.deploy.yarn.Client$.populateClasspath(Client.scala:1380)
  at org.apache.spark.deploy.yarn.Client.setupLaunchEnv(Client.scala:755)
  at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:867)
  at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:170)
  at org.apache.spark.deploy.yarn.Client.run(Client.scala:1154)
  at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1213)
  at org.apache.spark.deploy.yarn.Client.main(Client.scala)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
  at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
  at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I know the missing method can be found in hadoop-common-2.7.3.jar, which is present on every machine at /opt/spark-2.1.0-bin-hadoop2.7/jars/hadoop-common-2.7.3.jar and /opt/hadoop-2.7.4/share/hadoop/common/hadoop-common-2.7.4.jar.
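To double-check which jars on a node actually bundle the class in question, a quick scan can help. A minimal sketch, assuming a POSIX shell with unzip and a JDK (for javap) available; the search roots are the install locations mentioned above:

```shell
# Sketch: list every jar under the given roots that bundles the YARN Apps
# class, then check whether the crossPlatformify method is present in it.
# SEARCH_ROOTS holds the install paths from this question; adjust as needed.
SEARCH_ROOTS="/opt/spark-2.1.0-bin-hadoop2.7/jars /opt/hadoop-2.7.4"
for jar in $(find $SEARCH_ROOTS -name '*.jar' 2>/dev/null); do
  if unzip -l "$jar" 2>/dev/null | grep -q 'org/apache/hadoop/yarn/util/Apps.class'; then
    echo "Apps.class found in: $jar"
    # javap prints the class's public members; a match means the method exists here
    javap -classpath "$jar" org.apache.hadoop.yarn.util.Apps | grep crossPlatformify
  fi
done
```

A stray older Hadoop jar that also bundles this class, and sits earlier on the classpath than the 2.7.x jars, would explain a NoSuchMethodError even though the method exists in the jars listed above.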

I also tried passing the path to the jar with --jars and with --driver-library-path, but it made no difference.
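For reference, the attempted variants would look roughly like this (an illustrative reconstruction, not the exact command from the question; the jar path is the bundled hadoop-common mentioned above):

```shell
${SPARK_HOME}/bin/spark-submit --class org.apache.spark.examples.SparkPi \
  --master yarn --deploy-mode cluster --driver-memory 1g \
  --jars /opt/spark-2.1.0-bin-hadoop2.7/jars/hadoop-common-2.7.3.jar \
  lib/spark-examples*.jar 10
```

Note that --jars only distributes the jar to executors and adds it to their classpath; it does not change which hadoop-common the YARN client itself loads first, which may be why it had no effect here.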

Executing the command above with --master client (instead of yarn) produces no error (it runs fine).

I suppose this is a configuration problem, but I am not sure where to look.
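Since the failure happens while the YARN client builds the container classpath (populateHadoopClasspath in the trace above), one setting worth checking is which Hadoop jars Spark puts on that classpath. A hedged sketch of a spark-defaults.conf entry that pins Spark-on-YARN to the Spark-bundled 2.7.3 jars (spark.yarn.jars is a standard Spark 2.x property; the path is the install location from this question, and the local: prefix means the jars already exist on every node):

```
# spark-defaults.conf (illustrative)
spark.yarn.jars  local:/opt/spark-2.1.0-bin-hadoop2.7/jars/*
```

If a different or older Hadoop version is referenced there, or via SPARK_DIST_CLASSPATH / yarn.application.classpath, that mismatch would produce exactly this kind of NoSuchMethodError in cluster mode while client-mode runs stay unaffected.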