Hadoop org.apache.hive.com.esotericsoftware.kryo.KryoException: Encountered unregistered class ID: 21

I have a YARN cluster with Spark (1.6.1), HDFS, and Hive (2.1). Until today my workflows had been running fine for several months, with no changes to the code or the environment. Then I started getting errors like the following:

org.apache.hive.com.esotericsoftware.kryo.KryoException: Encountered unregistered class ID: 21
Serialization trace:
outputFileFormatClass (org.apache.hadoop.hive.ql.plan.PartitionDesc)
aliasToPartnInfo (org.apache.hadoop.hive.ql.plan.MapWork)
invertedWorkGraph (org.apache.hadoop.hive.ql.plan.SparkWork)
    at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:119)
    at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:656)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:238)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:226)
    at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObjectOrNull(Kryo.java:745)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:113)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
    at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:139)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:17)
    at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
    at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:131)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:17)
    at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
    at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:672)
    at org.apache.hadoop.hive.ql.exec.spark.KryoSerializer.deserialize(KryoSerializer.java:49)
    at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient$JobStatusJob.call(RemoteHiveSparkClient.java:318)
    at org.apache.hive.spark.client.RemoteDriver$JobWrapper.call(RemoteDriver.java:366)
    at org.apache.hive.spark.client.RemoteDriver$JobWrapper.call(RemoteDriver.java:335)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
With Hive I can still run simple selects, but every operation that needs Spark ends in the console with the error:

Error while processing statement: FAILED: Execution Error, return code 3 from org.apache.hadoop.hive.ql.exec.spark.SparkTask (state=08S01, code=3)

and with the error above in the logs. Right now every one of my Hive databases is paralyzed (I don't have many). I have spent the whole day trying to fix this, but nothing has helped (restarting Hive, restarting the YARN nodes, replacing the YARN master node).
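A minimal sketch of how the failure shows up, assuming beeline is used as the client; the JDBC URL and the table name my_table are placeholders, not details from the original question:

    # A simple select (served by a fetch task, no Spark job) still works:
    beeline -u jdbc:hive2://localhost:10000 -e 'SELECT * FROM my_table LIMIT 10;'

    # Anything that launches a Spark job (aggregation, join, ...) fails with
    # "return code 3 from org.apache.hadoop.hive.ql.exec.spark.SparkTask":
    beeline -u jdbc:hive2://localhost:10000 -e 'SELECT COUNT(*) FROM my_table;'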

What do you think is causing the problem, and how can it be fixed?

I found the answer.


For a short while after restarting hive-server2, instead of the error:

org.apache.hive.com.esotericsoftware.kryo.KryoException: Encountered unregistered class ID: 26

I was getting the error:

org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find class: org.apache.hadoop.hive.ql.io.RCFileOutputFormat

From that second form it is obvious that the Spark running on the nodes was missing some JARs on its classpath. I don't know why Spark suddenly stopped being able to load those JARs, but after manually copying them into the lib folder on every node and restarting the nodes, everything went back to normal.
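A minimal sketch of that kind of fix, for illustration only. It assumes the missing class lives in hive-exec-*.jar under /opt/hive/lib, that Spark on each node picks up extra JARs from /usr/lib/spark/lib, and that the worker nodes are node1..node3 with a spark-worker service; all of these paths, hostnames, and service names are assumptions, not details from the original answer:

    # Find the JAR that actually contains the missing class on the Hive host.
    for jar in /opt/hive/lib/*.jar; do
        if unzip -l "$jar" 2>/dev/null | grep -q 'org/apache/hadoop/hive/ql/io/RCFileOutputFormat.class'; then
            echo "class found in $jar"
        fi
    done

    # Copy that JAR into the Spark lib folder on every node, then restart the node services.
    for node in node1 node2 node3; do
        scp /opt/hive/lib/hive-exec-2.1.1.jar "$node":/usr/lib/spark/lib/   # assumed JAR name and target path
        ssh "$node" 'sudo systemctl restart spark-worker'                   # assumed restart command
    done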

Open source, brother. I feel your pain.