Java Spark temporary table not visible in Beeline
I have a Spark cluster on AWS EMR and am trying to start the Thrift server from the following code:
...
// Create a HiveContext on top of the existing SparkContext
JavaSparkContext jsc = new JavaSparkContext(SparkContext.getOrCreate());
HiveContext hiveContext = new HiveContext(jsc);

// Parse each line of people.txt into a Person bean
JavaRDD<Person> people = jsc.textFile("people.txt").map(
    new Function<String, Person>() {
        public Person call(String line) throws Exception {
            ...
        }
    });

// Register both a temporary table and a persistent Hive table
DataFrame schemaPeople = hiveContext.createDataFrame(people, Person.class);
schemaPeople.registerTempTable("people_temp");
schemaPeople.saveAsTable("people");

// Expose this context through the Thrift server
HiveThriftServer2.startWithContext(hiveContext);
...
I expected to see a temporary table people_temp. Why does people_temp not exist?

In recent Spark 1.6.* releases I found that you need to explicitly put the Thrift server into single-session mode for it to work with temporary tables:

spark.sql.hive.thriftServer.singleSession=true

See the migration guide.

Hope this helps,
Rod
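A minimal sketch of how that flag might be passed at launch time, assuming the application is started with spark-submit (the class and jar names below are placeholders, not from the question; only the --conf flag is the actual fix):

```shell
# Hypothetical submit command: singleSession=true makes the Thrift server
# share the session that registered the temp table, so a Beeline/JDBC
# client connecting to it can see people_temp.
spark-submit \
  --conf spark.sql.hive.thriftServer.singleSession=true \
  --class com.example.ThriftApp \
  app.jar
```

The same property can instead be set in spark-defaults.conf or on the SparkConf before the context is created.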
For reference, show tables in Beeline returns only the persistent table:

+--------------+--------------+--+
|  tableName   | isTemporary  |
+--------------+--------------+--+
| people       | false        |
+--------------+--------------+--+