
Java Spark Executor: Invalid initial heap size: -Xms0M


I have configured Spark to run queries against Hive tables.

The Thrift JDBC/ODBC server is started with the following command:

cd $SPARK_HOME
./sbin/start-thriftserver.sh --master spark://myhost:7077 \
  --hiveconf hive.server2.thrift.bind.host=myhost \
  --hiveconf hive.server2.thrift.port=9999
Checking the Spark worker UI afterwards, the executors fail to launch with the error below; the JVM refuses to start because of the -Xms value:

Invalid initial heap size: -Xms0M
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
Here is conf/spark-env.sh:

export SPARK_JAVA_OPTS="-Dspark.executor.memory=512M"
export SPARK_EXECUTOR_MEMORY=1G
export SPARK_DRIVER_MEMORY=512M
export SPARK_WORKER_MEMORY=2G
export SPARK_WORKER_INSTANCES=1
exec "$FWDIR"/sbin/spark-daemon.sh spark-submit $CLASS 1 --executor-memory 512M "$@"
I really don't know where this -Xms0M value comes from or how it is derived. Please help me understand the problem and how to change this value.

It is working now.

The Thrift server was not picking up the executor memory from spark-env.sh, so I added it explicitly to the Thrift server startup script:

/sbin/start-thriftserver.sh

export SPARK_JAVA_OPTS="-Dspark.executor.memory=512M"
export SPARK_EXECUTOR_MEMORY=1G
export SPARK_DRIVER_MEMORY=512M
export SPARK_WORKER_MEMORY=2G
export SPARK_WORKER_INSTANCES=1
exec "$FWDIR"/sbin/spark-daemon.sh spark-submit $CLASS 1 --executor-memory 512M "$@"
With this, the executors now start with a valid heap size and the JDBC queries return results.
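
To confirm the fix end to end, one option (not part of the original answer) is to connect to the Thrift server with the beeline client that ships with Spark, using the port configured above, and run a simple query:

# connect with the bundled beeline client (port taken from the --hiveconf setting above)
./bin/beeline -u jdbc:hive2://myhost:9999 -e "SHOW TABLES;"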

conf/spark-env.sh (the Thrift server was not picking up the executor memory configuration from it)
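
As a side note not drawn from the original post: instead of hard-coding the exports inside the startup script, the same setting can be supplied through Spark's standard configuration paths. A minimal sketch, assuming the same host and port as above; start-thriftserver.sh accepts spark-submit command-line options, and conf/spark-defaults.conf is read by spark-submit:

# Option 1: pass the memory as a spark-submit option to start-thriftserver.sh
./sbin/start-thriftserver.sh --master spark://myhost:7077 \
  --executor-memory 512M \
  --hiveconf hive.server2.thrift.bind.host=myhost \
  --hiveconf hive.server2.thrift.port=9999

# Option 2: set it once in conf/spark-defaults.conf
echo "spark.executor.memory 512m" >> conf/spark-defaults.conf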