Scala: 'new HiveContext' wants an X11 display? com.trend.iwss.jscan?


Spark 1.6.2 (YARN master)

Package name: com.example.spark.Main


Basic SparkSQL code:

val conf = new SparkConf()
conf.setAppName("SparkSQL w/ Hive")
val sc = new SparkContext(conf)

val hiveContext = new HiveContext(sc)
import hiveContext.implicits._

// val rdd = <some RDD making>
val df = rdd.toDF()
df.write.saveAsTable("example")
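For reference, here is a minimal self-contained version of the snippet above. The RDD construction is hypothetical (the original code elides it), and it assumes a Spark 1.6 cluster with Hive support, so there is no expected output to assert here:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object Main {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("SparkSQL w/ Hive")
    val sc = new SparkContext(conf)

    // new HiveContext(sc) is the line where the X11 error is thrown.
    val hiveContext = new HiveContext(sc)
    import hiveContext.implicits._

    // Hypothetical RDD, standing in for "<some RDD making>" above.
    val rdd = sc.parallelize(Seq(("a", 1), ("b", 2)))
    val df = rdd.toDF("key", "value")
    df.write.saveAsTable("example")
  }
}
```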
And the stack trace:

No X11 DISPLAY variable was set, but this program performed an operation which requires it.
         at java.awt.GraphicsEnvironment.checkHeadless(GraphicsEnvironment.java:204)
         at java.awt.Window.<init>(Window.java:536)
         at java.awt.Frame.<init>(Frame.java:420)
         at java.awt.Frame.<init>(Frame.java:385)
         at com.trend.iwss.jscan.runtime.BaseDialog.getActiveFrame(BaseDialog.java:75)
         at com.trend.iwss.jscan.runtime.AllowDialog.make(AllowDialog.java:32)
         at com.trend.iwss.jscan.runtime.PolicyRuntime.showAllowDialog(PolicyRuntime.java:325)
         at com.trend.iwss.jscan.runtime.PolicyRuntime.stopActionInner(PolicyRuntime.java:240)
         at com.trend.iwss.jscan.runtime.PolicyRuntime.stopAction(PolicyRuntime.java:172)
         at com.trend.iwss.jscan.runtime.PolicyRuntime.stopAction(PolicyRuntime.java:165)
         at com.trend.iwss.jscan.runtime.NetworkPolicyRuntime.checkURL(NetworkPolicyRuntime.java:284)
         at com.trend.iwss.jscan.runtime.NetworkPolicyRuntime._preFilter(NetworkPolicyRuntime.java:164)
         at com.trend.iwss.jscan.runtime.PolicyRuntime.preFilter(PolicyRuntime.java:132)
         at com.trend.iwss.jscan.runtime.NetworkPolicyRuntime.preFilter(NetworkPolicyRuntime.java:108)
         at org.apache.commons.logging.LogFactory$5.run(LogFactory.java:1346)
         at java.security.AccessController.doPrivileged(Native Method)
         at org.apache.commons.logging.LogFactory.getProperties(LogFactory.java:1376)
         at org.apache.commons.logging.LogFactory.getConfigurationFile(LogFactory.java:1412)
         at org.apache.commons.logging.LogFactory.getFactory(LogFactory.java:455)
         at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:657)
         at org.apache.hadoop.hive.shims.HadoopShimsSecure.<clinit>(HadoopShimsSecure.java:60)
         at java.lang.Class.forName0(Native Method)
         at java.lang.Class.forName(Class.java:264)
         at org.apache.hadoop.hive.shims.ShimLoader.createShim(ShimLoader.java:146)
         at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:141)
         at org.apache.hadoop.hive.shims.ShimLoader.getHadoopShims(ShimLoader.java:100)
         at org.apache.spark.sql.hive.client.ClientWrapper.overrideHadoopShims(ClientWrapper.scala:116)
         at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:69)
         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
         at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:249)
         at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:345)
         at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:255)
         at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:459)
         at org.apache.spark.sql.hive.HiveContext.defaultOverrides(HiveContext.scala:233)
         at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:236)
         at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
         at com.example.spark.Main1$.main(Main.scala:52)
         at com.example.spark.Main.main(Main.scala)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
         at java.lang.reflect.Method.invoke(Method.java:498)
         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
 ivysettings.xml file not found in HIVE_HOME or HIVE_CONF_DIR,/etc/hive/2.5.3.0-37/0/ivysettings.xml will be used

This code was running fine a week ago on a fresh HDP cluster, and in the sandbox. The only thing I remember doing is trying to change things around the JAVA_HOME variable, but I'm fairly sure I undid those changes.
I'm at a loss; I don't know how to start tracking down this issue.

The cluster is headless, so of course it has no X11 display, but what piece of new HiveContext even needs to pop up any JFrame?

Based on the logs, I would say it's a Java configuration issue I messed up: something in org.apache.hadoop.hive.shims.HadoopShimsSecure.<clinit>(HadoopShimsSecure.java:60) got triggered, therefore a Java security dialog appeared, but I don't know why.

X11 forwarding isn't possible, and I tried running
export SPARK_OPTS="-Djava.awt.headless=true"
before spark-submit, but it didn't help.
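Since the stack trace shows the dialog being raised in the driver JVM (inside the HiveContext constructor), one workaround sketch is to force headless mode programmatically at the very top of main, before any Spark or AWT class is touched. The java.awt.headless property is standard AWT; whether this actually suppresses the injected com.trend.iwss.jscan dialog is an assumption:

```scala
import java.awt.GraphicsEnvironment

object HeadlessMain {
  def main(args: Array[String]): Unit = {
    // Set the property before any AWT class is initialized,
    // so code that later tries to open a Frame fails fast
    // (HeadlessException) instead of looking for an X11 display.
    System.setProperty("java.awt.headless", "true")

    // Sanity check: AWT now reports headless mode.
    println(GraphicsEnvironment.isHeadless) // prints "true"

    // ...then build the SparkConf / SparkContext / HiveContext as usual.
  }
}
```

Alternatively, the flag can be passed to the driver JVM via spark-submit (--driver-java-options "-Djava.awt.headless=true", or the spark.driver.extraJavaOptions config key), which may be what the SPARK_OPTS attempt above was missing, since SPARK_OPTS is not a variable spark-submit consults.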

Tried those suggestions, but again, forwarding isn't possible and there is no display.

The error seems reproducible on both Spark clients.
I only tried changing JAVA_HOME on one machine.

Did an Ambari Hive service check. That didn't fix it.
Can c
val conf = new SparkConf()
conf.set("spark.executor.extraJavaOptions", "-Djava.awt.headless=true")