Apache Spark: unable to save DataFrame to Phoenix


When I try the following code in Spark:

jdbcDF.write.mode(SaveMode.Overwrite)
      .options(Map("table" -> "DC_PATIENT", "zkUrl" -> "hadoop001:2181"))
      .format("org.apache.phoenix.spark").save()
It throws the following error:

 error: not found: value SaveMode
Am I missing a jar? jdbcDF is a valid DataFrame.

Environment:

  • Scala 2.11.8
  • Spark 2.2.0
  • Phoenix apache-Phoenix-4.11.0-HBase-1.1

import org.apache.spark.sql.SaveMode
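The error "not found: value SaveMode" is a missing import, not a missing jar: SaveMode lives in org.apache.spark.sql and must be brought into scope before SaveMode.Overwrite can be referenced. A minimal sketch of the full write, assuming the phoenix-spark jar is on the classpath and using the table name and zkUrl from the question:

```scala
import org.apache.spark.sql.SaveMode  // brings SaveMode.Overwrite into scope

// jdbcDF is the DataFrame from the question. The phoenix-spark jar must be
// on both driver and executor classpaths for the
// "org.apache.phoenix.spark" data source format to resolve.
jdbcDF.write
  .format("org.apache.phoenix.spark")
  .mode(SaveMode.Overwrite)
  .options(Map("table" -> "DC_PATIENT", "zkUrl" -> "hadoop001:2181"))
  .save()
```

This requires a running HBase/Phoenix cluster reachable at the given zkUrl, so it cannot be run standalone.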
Hey, I tried yours but it still doesn't work. I have already imported org.apache.spark.SparkContext, org.apache.spark.sql._ and org.apache.phoenix.spark._, and it shows "save is not a member of org.apache.spark.sql.DataFrame".
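The follow-up error ("save is not a member of org.apache.spark.sql.DataFrame") usually means save was called on the DataFrame itself rather than on the DataFrameWriter returned by .write, since DataFrame has no save method in Spark 2.x. With org.apache.phoenix.spark._ already imported, an alternative is the saveToPhoenix method that the Phoenix-Spark module adds to DataFrame implicitly; a sketch, again assuming the table and zkUrl from the question:

```scala
import org.apache.phoenix.spark._  // adds the saveToPhoenix implicit to DataFrame

// Writes jdbcDF to the Phoenix table DC_PATIENT via the Phoenix-Spark
// integration, bypassing the DataFrameWriter path entirely.
jdbcDF.saveToPhoenix("DC_PATIENT", zkUrl = Some("hadoop001:2181"))
```

Either form works; the important point is that save() belongs to jdbcDF.write, while saveToPhoenix belongs to jdbcDF via the phoenix-spark implicit conversion.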