Apache Spark: deleting from a Hive table using Spark
I am using Hive 1.2.1 and Spark 1.6, and the problem is that I cannot run a simple delete against a Hive table from the Spark shell. Since Hive has supported ACID transactions since 0.14, I expected this to be allowed in Spark as well.
16/01/19 12:44:24 INFO hive.metastore: Connected to metastore.
scala> hiveContext.sql("delete from testdb.test where id=2");
16/01/19 12:44:51 INFO parse.ParseDriver: Parsing command: delete from
testdb.test where id=2
16/01/19 12:44:52 INFO parse.ParseDriver: Parse Completed
org.apache.spark.sql.AnalysisException:
Unsupported language features in query: delete from testdb.test where id=2
TOK_DELETE_FROM 1, 0,12, 12
TOK_TABNAME 1, 4,6, 12
testdb 1, 4,4, 12
test 1, 6,6, 19
......
scala.NotImplementedError: No parse rules for TOK_DELETE_FROM:
TOK_DELETE_FROM 1, 0,12, 12
TOK_TABNAME 1, 4,6, 12
testdb 1, 4,4, 12
......
You can run Hive via the command line from inside Scala:
import scala.sys.process._
val cmd = "hive -e \"delete from testdb.test where id=2\"" // Your command
val output = cmd.!! // Captures the output
See also. The command you provided is correct and works from the command line, but running it from the spark-shell raises a ParseException: FAILED: ParseException line 1:3 cannot recognize input near '' in switch database statement, followed by java.lang.RuntimeException: Nonzero exit value: 64.
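One likely explanation for that ParseException is how scala.sys.process handles a plain command string: the string is tokenized on whitespace without shell-style quote handling, so the embedded quotes are not stripped and the SQL reaches the hive CLI as several broken arguments, after which !! throws because of the non-zero exit code. A minimal sketch of a variant that avoids this, assuming the hive CLI is on the PATH, passes the statement as a single argument via a Seq:

import scala.sys.process._

// Passing the command as a Seq keeps the SQL statement as one argument;
// a plain String would be split on whitespace with no shell-style quoting.
val cmd = Seq("hive", "-e", "delete from testdb.test where id=2")
val output = cmd.!! // runs the hive CLI and captures stdout; throws on a non-zero exit
println(output)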