Apache Spark on YARN

I have a Spark job, but when I try to run it on YARN it throws an exception, and I don't know how to fix it or where I need to make changes. The error is as follows:

2017-11-30 10:28:49,952 [main] INFO  org.apache.spark.deploy.yarn.Client  - Source and destination file systems are the same. Not copying hdfs://servername1.domain.net:8020/user/oozie/share/lib/lib_20171123121217/spark2/spark-yarn_2.11-2.2.0.cloudera1.jar
2017-11-30 10:28:49,972 [main] INFO  org.apache.spark.deploy.yarn.Client  - Deleted staging directory hdfs://servername1:8020/user/pe3016/.sparkStaging/application_1511521415490_0216
2017-11-30 10:28:49,974 [main] ERROR org.apache.spark.SparkContext  - Error initializing SparkContext.
java.lang.IllegalArgumentException: Wrong FS: hdfs://servername.domain.net:8020/user/oozie/share/lib/lib_20171123121217/spark2/spark-yarn_2.11-2.2.0.cloudera1.jar, expected: hdfs://servername:8020

I am using Cloudera 5.12.1.

Check the fs.default.name property in core-site.xml. What value does it have?
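
For reference, here is a minimal sketch of that check. The "Wrong FS ... expected: hdfs://servername:8020" line suggests the client resolves the default filesystem to the short hostname while the Oozie sharelib path uses the fully qualified name (hdfs://servername.domain.net:8020); the two authorities need to agree. The hostnames below are copied from the error message, and the exact value on your cluster may differ:

# Print the default filesystem the client actually resolves
hdfs getconf -confKey fs.defaultFS

<!-- core-site.xml: fs.defaultFS (fs.default.name is the older, deprecated name)
     should use the same authority as the sharelib path, e.g. the fully
     qualified hostname -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://servername.domain.net:8020</value>
</property>

If you change this value, redeploy the client configuration and restart the affected services so the Spark/Oozie clients pick up the new setting.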