Apache Spark: I want to write my Spark Dataset to a Phoenix table. Can anyone help me?
I have written the following code, but it does not work:
df.write()
  .format("org.apache.phoenix.spark")
  .mode(SaveMode.Overwrite)
  .options(ImmutableMap.of(
      "zkUrl", clientProp.getProperty("zookeeper.url"),
      "table", "mdr_rec.kafka_offsets"))
  .save();
I get the following error:
at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:46)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
Could you include the entire stack trace? That would be very helpful.
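In the meantime, here is a minimal sketch of how a write through the `org.apache.phoenix.spark` connector is usually structured, assuming a Phoenix-enabled HBase cluster reachable at the given ZooKeeper quorum and a target table that already exists (the connector issues UPSERTs; it does not create the table). The quorum address and the surrounding class are hypothetical placeholders, not taken from your setup:

```java
import com.google.common.collect.ImmutableMap;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;

public final class PhoenixWriteSketch {

    /**
     * Writes the given Dataset into an existing Phoenix table.
     * Assumes: the table mdr_rec.kafka_offsets exists in Phoenix, and the
     * Dataset's column names match the Phoenix column names exactly.
     * Phoenix upper-cases unquoted identifiers, so a Dataset column named
     * "topic" will not match a Phoenix column created as TOPIC unless the
     * Dataset column is renamed to upper case first.
     */
    static void writeToPhoenix(Dataset<Row> df, String zkUrl) {
        df.write()
          .format("org.apache.phoenix.spark")
          // The v1 phoenix-spark connector only accepts SaveMode.Overwrite,
          // which it executes as an UPSERT (existing rows are not truncated).
          .mode(SaveMode.Overwrite)
          .options(ImmutableMap.of(
              "zkUrl", zkUrl,                      // e.g. "zk-host:2181" (hypothetical)
              "table", "mdr_rec.kafka_offsets"))
          .save();
    }
}
```

A mismatch between Dataset column names and the (upper-cased) Phoenix column names is a common cause of failures at exactly the `SaveIntoDataSourceCommand` frame shown in your trace, but the full stack trace is needed to confirm the root cause.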