Java: Not sure how to resolve a GC problem on a Spark cluster. Can someone explain how to deal with ParOldGen and PSYoungGen?

Tags: java, apache-spark, garbage-collection

I have checked all the shuffle-related settings (spark.default.parallelism, spark.sql.shuffle.partitions) as well as all the required memory options, such as executor memory and driver memory. I have plenty of memory available, around 64 GB, but I don't understand why the error still occurs.

I'd like to know whether this can be fixed through memory configuration alone. The job completes all earlier tasks but fails on task 31. The query is very large; smaller queries run fine.

    logger.debug(String.format("Executing SQL %s", taskExec));
    Dataset<Row> dfTmp = null;
    dfTmp = sqlContext.sql(taskExec);
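For reference, the settings mentioned above are usually passed on the command line. The sketch below shows the shape of such a call; every number is an illustrative placeholder (not a recommendation), and `your-app.jar` is a stand-in for the actual application:

```shell
# Sketch only -- all values below are placeholders to be tuned per cluster.
spark-submit \
  --driver-memory 8g \
  --executor-memory 16g \
  --conf spark.default.parallelism=200 \
  --conf spark.sql.shuffle.partitions=200 \
  --conf spark.memory.fraction=0.6 \
  your-app.jar
```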


AdaptiveSizeStop: collection: 107 [PSYoungGen: 2917987K->2917987K(3547136K)] [ParOldGen: 8387375K->8387375K(8388608K)] 11305363K->11305363K(11935744K), [Metaspace: 72368K->72368K(1114112K)], 0.4457447 secs] [Times: user=2.27 sys=0.00, real=0.44 secs]
# java.lang.OutOfMemoryError: Java heap space
# -XX:OnOutOfMemoryError="kill %p"
#   Executing /bin/sh -c "kill 9085"...
282.474: [Full GC (Ergonomics) 282.535: [SoftReference, 0 refs, 0.0000665 secs]282.535: [WeakReference, 956 refs, 0.0001140 secs]282.535: [FinalReference, 1092 refs, 0.0000635 secs]282.535: [PhantomReference, 0 refs, 38 refs, 0.0000145 secs]282.536: [JNI Weak Reference, 0.0000145 secs]AdaptiveSizeStart: 283.597 collection: 108 
PSAdaptiveSizePolicy::compute_eden_space_size limits: desired_eden_size: 3086984786 old_eden_size: 3023044608 eden_limit: 3023044608 cur_eden: 2991587328 max_eden_size: 3023044608 avg_young_live: 2739392512
PSAdaptiveSizePolicy::compute_eden_space_size: gc time limit gc_cost: 1.000000  GCTimeLimit: 98
PSAdaptiveSizePolicy::compute_eden_space_size: costs minor_time: 0.144870 major_cost: 0.975053 mutator_cost: 0.000000 throughput_goal: 0.990000 live_space: 11575527424 free_space: 5652873216 old_eden_size: 3023044608 desired_eden_size: 3023044608
PSAdaptiveSizePolicy::compute_old_gen_free_space limits: desired_promo_size: 3143082161 promo_limit: 2629828608 free_in_old_gen: 20183040 max_old_gen_size: 8589934592 avg_old_live: 8569751552
PSAdaptiveSizePolicy::compute_old_gen_free_space: gc time limit gc_cost: 1.000000  GCTimeLimit: 98
PSAdaptiveSizePolicy::compute_old_gen_free_space: costs minor_time: 0.144870 major_cost: 0.975053 mutator_cost: 0.000000 throughput_goal: 0.990000 live_space: 11577579520 free_space: 5652873216 old_promo_size: 2629828608 desired_promo_size: 2629828608
AdaptiveSizeStop: collection: 108 
[PSYoungGen: 2921472K->778705K(3547136K)] [ParOldGen: 8387375K->8386929K(8388608K)] 11308847K->9165634K(11935744K), [Metaspace: 72370K->72370K(1114112K)], 1.1228849 secs] [Times: user=8.59 sys=0.74, real=1.12 secs] 
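Reading the numbers in collection 107 above: the old generation does not shrink at all across a Full GC while the young generation also stays put, which is the classic pattern behind a heap-space OOM (nothing left to promote into). A quick occupancy check, using only the values printed in the log:

```python
# Heap sizes in KiB, taken from the "AdaptiveSizeStop: collection: 107" line above.
par_old_used, par_old_total = 8387375, 8388608    # ParOldGen after GC / capacity
ps_young_used, ps_young_total = 2917987, 3547136  # PSYoungGen after GC / capacity

old_pct = round(100 * par_old_used / par_old_total, 2)
young_pct = round(100 * ps_young_used / ps_young_total, 2)

print(f"ParOldGen occupancy:  {old_pct}%")   # 99.99% -- old gen is effectively full
print(f"PSYoungGen occupancy: {young_pct}%") # 82.26% -- survivors have nowhere to go
```

This matches the adaptive-size-policy lines that follow (gc_cost: 1.000000 against GCTimeLimit: 98): the JVM is spending nearly all its time collecting and reclaiming almost nothing.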
10:51:46.868 [Executor task launch worker for task 9593] ERROR org.apache.spark.executor.Executor - Exception in task 30.0 in stage 144.0 (TID 9593)
java.lang.OutOfMemoryError: Java heap space
    at java.util.Arrays.copyOf(Arrays.java:3332)
    at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:124)
    at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:448)
    at java.lang.StringBuilder.append(StringBuilder.java:136)
    at scala.collection.mutable.StringBuilder.append(StringBuilder.scala:200)
    at org.apache.spark.sql.catalyst.util.package$$anonfun$sideBySide$1.apply(package.scala:113)
    at org.apache.spark.sql.catalyst.util.package$$anonfun$sideBySide$1.apply(package.scala:112)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
    at scala.collection.AbstractTraversable.map(Traversable.scala:104)
    at org.apache.spark.sql.catalyst.util.package$.sideBySide(package.scala:112)
    at org.apache.spark.sql.catalyst.util.package$.sideBySide(package.scala:104)
    at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$5.apply(RuleExecutor.scala:137)
    at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$5.apply(RuleExecutor.scala:138)
    at org.apache.spark.internal.Logging$class.logDebug(Logging.scala:58)
    at org.apache.spark.sql.catalyst.rules.RuleExecutor.logDebug(RuleExecutor.scala:40)
    at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:134)
    at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:76)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:76)
    at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$$anonfun$canonicalize$1.apply(GenerateUnsafeProjection.scala:354)
    at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$$anonfun$canonicalize$1.apply(GenerateUnsafeProjection.scala:354)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
    at scala.collection.immutable.List.map(List.scala:285)
    at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.canonicalize(GenerateUnsafeProjection.scala:354)
    at org.apache.spark.sql.catalyst.expressions.codegen.GenerateUnsafeProjection$.generate(GenerateUnsafeProjection.scala:362)
10:51:46.900 [SIGTERM handler] ERROR org.apache.spark.executor.CoarseGrainedExecutorBackend - RECEIVED SIGNAL TERM
10:51:46.918 [Thread-2] INFO org.apache.spark.storage.DiskBlockManager - Shutdown hook called
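One detail worth noting in the stack trace: the OOM fires inside `sideBySide`, which Catalyst's `RuleExecutor` only invokes from `logDebug` to render a side-by-side before/after diff of the query plan. With a very large plan, building that string alone can exhaust the heap. Besides adding memory, a possible mitigation is to keep that logger out of DEBUG. A hedged sketch for a log4j 1.x properties file (appender name and exact file layout are assumptions, not taken from the question):

```properties
# Assumed log4j.properties fragment: keep Catalyst's rule executor at INFO
# so sideBySide plan diffs are never built for huge query plans.
log4j.rootCategory=INFO, console
log4j.logger.org.apache.spark.sql.catalyst.rules=INFO
```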