Scala SaveMode not working in Spark SQL


I am running a Spark SQL example with the SaveMode option, but I get the following error:

val df = sqlContext.read.format("json").load("/user/root/spark/data/people.json")
df.select("name","age").write.format("json").save("Output",SaveMode.ErrorIfExist)


<console>:35: error: overloaded method value save with alternatives:
  ()Unit <and>
  (path: String)Unit
 cannot be applied to (String, org.apache.spark.sql.SaveMode)
              df.select("name", "age").write.format("json").save("Output",SaveMode.ErrorIfExists
I looked at the documentation and it says SaveMode is deprecated. How can I fix this?


Any suggestions?

You can use the
DataFrameWriter.mode
method:

df.write.mode("error").save(...)


Thanks for the help. I am developing in Scala, and when I use SaveMode.ErrorIfExists it does not work, but with the mode string "error" it works fine. The Apache Spark SQL documentation says Scala/Java accepts SaveMode.ErrorIfExists, but that does not seem to happen. Any ideas? @Shashi Use import org.apache.spark.sql.SaveMode, then try df.write.mode(SaveMode.ErrorIfExists).save(...)
df.write.mode(SaveMode.ErrorIfExists).save(...)
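
Putting it together, a minimal sketch of the corrected snippet might look like the following (the input path and output directory are taken from the question; it assumes a working sqlContext from a Spark shell or SQLContext instance, and that the enum constant is SaveMode.ErrorIfExists, not SaveMode.ErrorIfExist as typed in the original code):

```scala
import org.apache.spark.sql.SaveMode

// Read the JSON input used in the question
val df = sqlContext.read.format("json").load("/user/root/spark/data/people.json")

// SaveMode is passed through the .mode(...) builder method,
// not as a second argument to .save(path)
df.select("name", "age")
  .write
  .format("json")
  .mode(SaveMode.ErrorIfExists)
  .save("Output")
```

The key point is that `save` only accepts a path (or no arguments); the save mode is configured earlier in the builder chain via `mode`, which takes either a `SaveMode` enum value or its string equivalent such as "error".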