Java Spark 2.3: LEAK: ByteBuf.release() was not called before it's garbage-collected
I recently updated my join logic from a left_outer join to a full_outer join, and I started seeing this error:
ERROR util.ResourceLeakDetector: LEAK: ByteBuf.release() was not called before it's garbage-collected. See http://netty.io/wiki/reference-counted-objects.html for more information.
Recent access records:
Exception in thread "main" org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
Exchange SinglePartition
+- *(7) HashAggregate(keys=[], functions=[partial_count(1)], output=[count#945L])
I found a similar old bug report, but I would assume it has been fixed by now:
I have no idea how to approach this problem.
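For context on what the message itself means: Netty's ByteBuf objects are reference-counted, and the ResourceLeakDetector reports a leak when a buffer is garbage-collected while its reference count was never dropped to zero via release(). A minimal plain-Java sketch of that contract (the class and method names here are illustrative, not Netty's actual implementation):

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch of a reference-counted buffer, mirroring the
// retain()/release() contract that Netty's ByteBuf follows.
public class RefCountedBuf {
    private final AtomicInteger refCnt = new AtomicInteger(1); // created with one reference

    public RefCountedBuf retain() {
        refCnt.incrementAndGet();   // take an additional reference
        return this;
    }

    public boolean release() {
        // drop one reference; returns true when the buffer is actually freed
        return refCnt.decrementAndGet() == 0;
    }

    public boolean leaked() {
        // the leak detector's view at GC time: a buffer collected while
        // its count is still above zero was never fully released
        return refCnt.get() > 0;
    }

    public static void main(String[] args) {
        RefCountedBuf buf = new RefCountedBuf();
        buf.retain();    // refCnt = 2
        buf.release();   // refCnt = 1
        // The final release() is missing here; if buf were garbage-collected
        // now, a detector would report exactly the kind of LEAK in the log.
        System.out.println("leaked=" + buf.leaked());
    }
}
```

In Spark this bookkeeping happens inside the shuffle machinery rather than in user code, which is why a query-plan change (such as switching the join type and thereby changing the Exchange) can surface the warning without any buffer handling on your side.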