How do I output values without brackets in Spark (Java/Scala)?


I want to store a DataFrame as plain values, but what I get is bracketed values. Code:

val df = sqlContext.read.format("orc").load(filename)
// intermediate processing omitted; shown only as an example
df.rdd.saveAsTextFile(outputPath)
The resulting data looks like this:

[40fc4ab12a174bf4]
[5572a277df472931]
[5fbce7c5c854996b]
[b4283abd92ea904]
[2f486994064f6875]
What I want instead is:

40fc4ab12a174bf4
5572a277df472931
5fbce7c5c854996b
b4283abd92ea904
2f486994064f6875
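
A minimal sketch of why the brackets appear (assuming Spark's `Row` API): `saveAsTextFile` calls `toString` on each element of the RDD, and `Row.toString` renders the fields wrapped in square brackets:

```scala
import org.apache.spark.sql.Row

// Row.toString formats the fields as "[field1,field2,...]",
// which is exactly the bracketed output shown above.
val row = Row("40fc4ab12a174bf4")
println(row.toString)  // prints "[40fc4ab12a174bf4]"
```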

Use spark-csv to write the data:

df.write
    .format("com.databricks.spark.csv")
    .option("header", "false")
    .save(outputPath)
Or, sticking with the RDD, just extract the first value from each Row:

df.rdd.map(l => l.get(0)).saveAsTextFile(outputPath)
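
For a DataFrame with a single string column, a sketch using the built-in text data source (available since Spark 1.6) avoids both the RDD round-trip and the external spark-csv package. This assumes the column is of string type; cast it first if it is not:

```scala
// Write the first column as plain text, one value per line.
// Assumes the column is a StringType; .write.text rejects
// non-string or multi-column input.
df.select(df.columns.head)
  .write
  .text(outputPath)
```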