
Apache Spark: how to get all configuration parameters


I am trying to find out which configuration parameters my Spark application uses when it executes. Is there a way to get all of the parameters, including the defaults?

For example, executing "set;" on the Hive console lists the complete Hive configuration. I am looking for a similar action/command for Spark.

Update: I tried the solution proposed by karthik manchala and got the results below. As far as I can tell, these are not all of the parameters; spark.shuffle.memoryFraction (and more) is missing.

scala> println(sc.getConf.getAll.deep.mkString("\n"));
(spark.eventLog.enabled,true)
(spark.dynamicAllocation.minExecutors,1)
(spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_HOSTS,...)
(spark.repl.class.uri,http://...:54157)
(spark.tachyonStore.folderName,spark-46d43c17-b0b3-4b61-a017-a186075849ca)
(spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_URI_BASES,http://...)
(spark.driver.host,...l)
(spark.yarn.jar,local:/opt/cloudera/parcels/CDH-5.4.7-1.cdh5.4.7.p0.3/lib/spark/lib/spark-assembly.jar)
(spark.yarn.historyServer.address,http://...:18088)
(spark.dynamicAllocation.executorIdleTimeout,60)
(spark.serializer,org.apache.spark.serializer.KryoSerializer)
(spark.authenticate,false)
(spark.fileserver.uri,http://...:33681)
(spark.app.name,Spark shell)
(spark.dynamicAllocation.maxExecutors,30)
(spark.dynamicAllocation.initialExecutors,3)
(spark.ui.filters,org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter)
(spark.driver.port,46781)
(spark.shuffle.service.enabled,true)
(spark.master,yarn-client)
(spark.eventLog.dir,hdfs://.../user/spark/applicationHistory)
(spark.app.id,application_1449242356422_80431)
(spark.driver.appUIAddress,http://...:4040)
(spark.driver.extraLibraryPath,/opt/cloudera/parcels/CDH-5.4.7-1.cdh5.4.7.p0.3/lib/hadoop/lib/native)
(spark.dynamicAllocation.schedulerBacklogTimeout,1)
(spark.shuffle.service.port,7337)
(spark.executor.id,<driver>)
(spark.jars,)
(spark.dynamicAllocation.enabled,true)
(spark.executor.extraLibraryPath,/opt/cloudera/parcels/CDH-5.4.7-1.cdh5.4.7.p0.3/lib/hadoop/lib/native)
(spark.yarn.am.extraLibraryPath,/opt/cloudera/parcels/CDH-5.4.7-1.cdh5.4.7.p0.3/lib/hadoop/lib/native)
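
A quick way to confirm that a particular key is genuinely unset rather than merely hidden from the list (a minimal sketch against the same shell session; the key name is the one mentioned in the update above):

// SparkConf only carries explicitly set properties, so an unset key is simply absent.
println(sc.getConf.contains("spark.shuffle.memoryFraction"))   // false
println(sc.getConf.getOption("spark.shuffle.memoryFraction"))  // None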

You can do the following:

sparkContext.getConf().getAll();

This only gets the properties for which a value has been explicitly specified. It does not get the properties that are using their default values.
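
As a minimal sketch of the usual workarounds (assuming a Spark 1.x shell where sc and sqlContext are already defined; the 0.2 fallback is the documented Spark 1.x default for spark.shuffle.memoryFraction, and how much "SET -v" lists depends on the Spark version):

// Explicitly set properties only (the same list as getAll above).
sc.getConf.getAll.sorted.foreach { case (k, v) => println(s"$k=$v") }

// For a key that was never set, supply the documented default yourself.
println(sc.getConf.get("spark.shuffle.memoryFraction", "0.2"))

// Spark SQL's "SET -v" lists SQL properties together with their defaults,
// similar to Hive's "set;" (availability and coverage vary by version).
sqlContext.sql("SET -v").collect().foreach(println)

For the core (non-SQL) properties there is no single call in these Spark versions that dumps every default; the defaults live in the Spark configuration documentation and only show up in getAll once they have been set explicitly. The Environment tab of the web UI likewise shows only the explicitly set Spark properties.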