Hadoop Spark runtime error when creating the Spark object
Tags: hadoop, apache-spark

I get the following error when running my program on my local system. My machine has 3 GB of memory; I need a solution:
Exception in thread "main" java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:216)
at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:198)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:330)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:174)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:432)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
at SparkCore.cartesianTransformation$.main(cartesianTransformation.scala:11)
at SparkCore.cartesianTransformation.main(cartesianTransformation.scala)
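The two numbers in the message are not arbitrary: in Spark 2.x, `UnifiedMemoryManager` reserves 300 MB of system memory and requires the JVM heap to be at least 1.5 times that reserve, which is exactly the 471859200-byte minimum in the error. Your driver's heap (259522560 bytes, roughly 247 MB, a typical default when no `-Xmx` is set) falls below it. A quick check of the arithmetic:

```java
// Reproduces the minimum-heap check from Spark 2.x's
// UnifiedMemoryManager: 300 MB reserved, heap must be >= 1.5x that.
public class MinSparkDriverHeap {
    static final long RESERVED_SYSTEM_MEMORY = 300L * 1024 * 1024;          // 314572800 bytes
    static final long MIN_SYSTEM_MEMORY = (long) (RESERVED_SYSTEM_MEMORY * 1.5);

    public static void main(String[] args) {
        System.out.println(MIN_SYSTEM_MEMORY);   // 471859200, matching the error message
        System.out.println(259522560L < MIN_SYSTEM_MEMORY);  // true: the default heap is too small
    }
}
```

So any driver heap of at least 450 MB (e.g. `-Xmx512m` or more) clears this particular check.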
It looks like your Spark driver is running with too little memory; try increasing the driver memory. You can give the driver a larger heap with

--driver-memory 4g
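A sketch of how that flag might be passed at submit time, using the main class from the stack trace; the jar path is a placeholder for your build's output:

```shell
# Give the driver a 4 GB heap at submit time.
# target/your-app.jar is a placeholder; SparkCore.cartesianTransformation
# is the main class shown in the stack trace.
spark-submit \
  --class SparkCore.cartesianTransformation \
  --driver-memory 4g \
  target/your-app.jar
```

Equivalently, `spark.driver.memory 4g` can be set in `conf/spark-defaults.conf` so every submission picks it up without the flag.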
Hope this helps.

How are we supposed to help if you don't post any code? Can you see in the web UI how much memory is in use? Also try adding -Xmx1024m -Xms512m to your VM arguments.