Apache Spark cannot save a DataFrame to Phoenix
When I try the following code in Spark:
jdbcDF.write.mode(SaveMode.Overwrite)
  .options(Map("table" -> "DC_PATIENT", "zkUrl" -> "hadoop001:2181"))
  .format("org.apache.phoenix.spark").save()
it throws the following exception:

error: not found: value SaveMode

Am I missing a jar? jdbcDF is a valid DataFrame.
Environment:
- Scala 2.11.8
- Spark 2.2.0
- Phoenix apache-Phoenix-4.11.0-HBase-1.1
import org.apache.spark.sql.SaveMode
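With that import in place, the write should compile. A minimal sketch of the whole flow, assuming a SparkSession is already available, the phoenix-spark and Phoenix client jars are on the classpath, and jdbcDF has been loaded from some source (the JDBC URL and source table below are hypothetical placeholders):

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

val spark = SparkSession.builder()
  .appName("phoenix-write")
  .getOrCreate()

// jdbcDF is assumed to come from elsewhere; this JDBC source is
// purely illustrative, not from the original question.
val jdbcDF = spark.read
  .format("jdbc")
  .options(Map(
    "url"     -> "jdbc:mysql://example-host:3306/db", // hypothetical
    "dbtable" -> "patients"))                         // hypothetical
  .load()

// The write from the question: "DC_PATIENT" and the zkUrl are the
// target Phoenix table and ZooKeeper quorum.
jdbcDF.write
  .mode(SaveMode.Overwrite)
  .format("org.apache.phoenix.spark")
  .options(Map("table" -> "DC_PATIENT", "zkUrl" -> "hadoop001:2181"))
  .save()
```

Note that the phoenix-spark DataFrame writer only supports SaveMode.Overwrite, which maps to a Phoenix UPSERT and does not truncate the table.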
Hey, I tried your suggestion but it still doesn't work. I have already imported org.apache.spark.SparkContext, org.apache.spark.sql._, and org.apache.phoenix.spark._, and now it shows "save is not a member of org.apache.spark.sql.DataFrame".
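That second error usually means save() is being called on the DataFrame itself, which was the Spark 1.x API. In Spark 2.x, save() is a method of the DataFrameWriter returned by .write, so the call chain must go through .write first. A sketch of the difference (the table and zkUrl are taken from the question):

```scala
// Spark 1.x style: df.save(...) was removed in Spark 2.0, so this
// fails to compile on 2.2 with "save is not a member of DataFrame":
// jdbcDF.save("org.apache.phoenix.spark", SaveMode.Overwrite, ...)

// Spark 2.x style: .write returns a DataFrameWriter, which has save()
jdbcDF.write                   // DataFrameWriter[Row]
  .mode(SaveMode.Overwrite)
  .format("org.apache.phoenix.spark")
  .options(Map("table" -> "DC_PATIENT", "zkUrl" -> "hadoop001:2181"))
  .save()
```

As an aside, importing org.apache.phoenix.spark._ also adds an implicit saveToPhoenix helper on DataFrames in Phoenix 4.x; check the phoenix-spark documentation for its exact signature in 4.11.0.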