
Apache Spark: Error running Spark 3.1.1 as Hive 3.1.2's engine (java.lang.NoClassDefFoundError: org/apache/spark/unsafe/array/ByteArrayMethods)

Tags: apache-spark, hadoop, hive, yarn

I am running Spark on Ubuntu 20.04. Cluster versions:

  • Hadoop 3.2.2
  • Hive 3.1.2
  • Spark 3.1.1
I have created symbolic links from Spark's jars into Hive's lib directory as follows:

sudo ln -s $SPARK_HOME/jars/spark-network-common_2.12-3.1.1.jar $HIVE_HOME/lib/spark-network-common_2.12-3.1.1.jar
sudo ln -s $SPARK_HOME/jars/spark-core_2.12-3.1.1.jar $HIVE_HOME/lib/spark-core_2.12-3.1.1.jar
sudo ln -s $SPARK_HOME/jars/scala-library-2.12.10.jar $HIVE_HOME/lib/scala-library-2.12.10.jar
When I run Hive with Spark set as the execution engine, I get the following error:

Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create Spark client for Spark session 57f08f6b-02b7-4c3d-bf8c-4ec351a5fd34)'
2021-05-31T12:31:58,949 ERROR [a69d446a-f1a0-45d9-8dbc-c0fccbf718b3 main] spark.SparkTask: Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create Spark client for Spark session 57f08f6b-02b7-4c3d-bf8c-4ec351a5fd34)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create Spark client for Spark session 57f08f6b-02b7-4c3d-bf8c-4ec351a5fd34
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.getHiveException(SparkSessionImpl.java:221)
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:92)
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115)
        at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136)
        at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:115)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703)
        at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157)
        at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/unsafe/array/ByteArrayMethods
        at org.apache.spark.internal.config.package$.<init>(package.scala:1095)
        at org.apache.spark.internal.config.package$.<clinit>(package.scala)
        at org.apache.spark.SparkConf$.<init>(SparkConf.scala:654)
        at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
        at org.apache.spark.SparkConf.set(SparkConf.scala:94)
        at org.apache.spark.SparkConf.set(SparkConf.scala:83)
        at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.generateSparkConf(HiveSparkClientFactory.java:265)
        at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:98)
        at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76)
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87)
        ... 24 more
Caused by: java.lang.ClassNotFoundException: org.apache.spark.unsafe.array.ByteArrayMethods
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        ... 34 more

2021-05-31T12:31:58,950 ERROR [a69d446a-f1a0-45d9-8dbc-c0fccbf718b3 main] spark.SparkTask: Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create Spark client for Spark session 57f08f6b-02b7-4c3d-bf8c-4ec351a5fd34)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create Spark client for Spark session 57f08f6b-02b7-4c3d-bf8c-4ec351a5fd34
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.getHiveException(SparkSessionImpl.java:221) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:92) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:115) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239) ~[hive-cli-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188) ~[hive-cli-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402) ~[hive-cli-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) ~[hive-cli-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) ~[hive-cli-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683) ~[hive-cli-3.1.2.jar:3.1.2]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_292]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_292]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_292]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_292]
        at org.apache.hadoop.util.RunJar.run(RunJar.java:323) ~[hadoop-common-3.2.2.jar:?]
        at org.apache.hadoop.util.RunJar.main(RunJar.java:236) ~[hadoop-common-3.2.2.jar:?]
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/unsafe/array/ByteArrayMethods
        at org.apache.spark.internal.config.package$.<init>(package.scala:1095) ~[spark-core_2.12-3.1.1.jar:3.1.1]
        at org.apache.spark.internal.config.package$.<clinit>(package.scala) ~[spark-core_2.12-3.1.1.jar:3.1.1]
        at org.apache.spark.SparkConf$.<init>(SparkConf.scala:654) ~[spark-core_2.12-3.1.1.jar:3.1.1]
        at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala) ~[spark-core_2.12-3.1.1.jar:3.1.1]
        at org.apache.spark.SparkConf.set(SparkConf.scala:94) ~[spark-core_2.12-3.1.1.jar:3.1.1]
        at org.apache.spark.SparkConf.set(SparkConf.scala:83) ~[spark-core_2.12-3.1.1.jar:3.1.1]
        at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.generateSparkConf(HiveSparkClientFactory.java:265) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:98) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76) ~[hive-exec-3.1.2.jar:3.1.2]
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87) ~[hive-exec-3.1.2.jar:3.1.2]
        ... 24 more
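The missing class, org.apache.spark.unsafe.array.ByteArrayMethods, is packaged in Spark's spark-unsafe module, which is not among the jars linked into Hive's lib directory above. If you are pulling Spark in through a Maven build, declaring that module as an explicit dependency puts the class on the classpath. The coordinates below are adjusted to match the question's Scala 2.12 / Spark 3.1.1 setup (the original snippet referenced spark-unsafe_2.11 version 2.3.0):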
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-unsafe_2.12</artifactId>
    <version>3.1.1</version>
    <scope>compile</scope>
</dependency>
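
For a cluster setup like the one described, where Hive picks up Spark's jars via symlinks rather than through a build, the equivalent fix is to link the spark-unsafe jar into Hive's lib directory alongside the others. This is a sketch assuming the standard Spark 3.1.1 binary distribution layout, where the jar is named spark-unsafe_2.12-3.1.1.jar:

# Link Spark's unsafe module into Hive's lib so ByteArrayMethods resolves at runtime
sudo ln -s $SPARK_HOME/jars/spark-unsafe_2.12-3.1.1.jar $HIVE_HOME/lib/spark-unsafe_2.12-3.1.1.jar

Be aware that Hive 3.1.2's Hive-on-Spark integration was developed against Spark 2.3.x, so even with the classpath fixed, running Spark 3.1.1 as the engine may surface further incompatibilities.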