Scala Spark: error when compressing and saving to a text file
I have a Scala Spark job. I want to compress the output with Gzip and then save it with saveAsTextFile:
compressedEvents.saveAsTextFile(outputDirectory, org.apache.hadoop.io.compress.GzipCodec)
But I get the following error:
[error] /var/lib/jenkins/workspace/producer-data-test/producer-data-test-build/src/main/scala/IpFromLogs.scala:46: object org.apache.hadoop.io.compress.GzipCodec is not a value
[error] compressedEvents.saveAsTextFile(outputDirectory, org.apache.hadoop.io.compress.GzipCodec)
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
I have tried different variants of the same call, but none of them work. Please help.

The correct way to save is
import org.apache.hadoop.io.compress.GzipCodec

compressedEvents.saveAsTextFile(outputDirectory, classOf[GzipCodec])
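The compile error happens because this `saveAsTextFile` overload takes a `Class[_ <: CompressionCodec]` argument, so you must pass `classOf[GzipCodec]` rather than a bare reference to the `GzipCodec` type. A minimal self-contained sketch (the app name, output path, and sample data are placeholders):

```scala
import org.apache.hadoop.io.compress.GzipCodec
import org.apache.spark.{SparkConf, SparkContext}

object GzipSaveExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("gzip-save").setMaster("local[*]"))
    val compressedEvents = sc.parallelize(Seq("event1", "event2", "event3"))
    // saveAsTextFile(path, codec) expects a Class object, which is why
    // classOf[GzipCodec] compiles while a plain object reference does not
    compressedEvents.saveAsTextFile("/tmp/events-gz", classOf[GzipCodec])
    sc.stop()
  }
}
```

Each output part file will be written as `part-00000.gz` and so on, gzip-compressed by the codec.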
or, alternatively, set the Hadoop configuration before saving:
import org.apache.hadoop.io.compress.{CompressionCodec, GzipCodec}
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat

sc.hadoopConfiguration.setClass(FileOutputFormat.COMPRESS_CODEC, classOf[GzipCodec], classOf[CompressionCodec])
and then save it with
compressedEvents.saveAsTextFile(outputDirectory)
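Put together, a sketch of the configuration route looks like the following. Note the assumption that output compression must also be switched on (`FileOutputFormat.COMPRESS`) for the codec setting to take effect; `FileOutputFormat` here is the one from `org.apache.hadoop.mapreduce.lib.output`:

```scala
import org.apache.hadoop.io.compress.{CompressionCodec, GzipCodec}
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat
import org.apache.spark.{SparkConf, SparkContext}

object GzipConfigExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("gzip-config").setMaster("local[*]"))
    // Enable output compression and select the codec via the Hadoop configuration
    sc.hadoopConfiguration.setBoolean(FileOutputFormat.COMPRESS, true)
    sc.hadoopConfiguration.setClass(
      FileOutputFormat.COMPRESS_CODEC, classOf[GzipCodec], classOf[CompressionCodec])
    val compressedEvents = sc.parallelize(Seq("event1", "event2", "event3"))
    // No codec argument needed now; the configuration applies to all saves on this context
    compressedEvents.saveAsTextFile("/tmp/events-conf-gz")
    sc.stop()
  }
}
```

The configuration approach is useful when many RDDs are saved in the same job, since the codec only has to be declared once on the SparkContext instead of at every call site.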