Scala error: type parameters

I want to use HIPI to process images on Spark, so I used hadoopFile to create an RDD:

import org.apache.spark.{SparkConf, SparkContext}
import org.hipi.image.{FloatImage, HipiImageHeader}
import org.hipi.imagebundle.mapreduce.HibInputFormat
val conf = new SparkConf().setAppName("BundleTest")
val sc = new SparkContext(conf)
val bundle0 = sc.hadoopFile[HipiImageHeader,FloatImage,HibInputFormat]("hdfs://192.168.199.11:8020/Hdfs/Image/image.hib.dat",1000)
But I get this error:

Error:(39, 22) type arguments [org.hipi.image.HipiImageHeader,org.hipi.image.FloatImage,org.hipi.imagebundle.mapreduce.HibInputFormat] conform to the bounds of none of the overloaded alternatives of value hadoopFile: [K, V, F <: org.apache.hadoop.mapred.InputFormat[K,V]](path: String)(implicit km: scala.reflect.ClassTag[K], implicit vm: scala.reflect.ClassTag[V], implicit fm: scala.reflect.ClassTag[F])org.apache.spark.rdd.RDD[(K, V)] <and> [K, V, F <: org.apache.hadoop.mapred.InputFormat[K,V]](path: String, minPartitions: Int)(implicit km: scala.reflect.ClassTag[K], implicit vm: scala.reflect.ClassTag[V], implicit fm: scala.reflect.ClassTag[F])org.apache.spark.rdd.RDD[(K, V)]
val bundle0 = sc.hadoopFile[HipiImageHeader,FloatImage,HibInputFormat]("hdfs://192.168.199.11:8020/Hdfs/Image/image.hib",1000)
                 ^

HibInputFormat extends FileInputFormat[HipiImageHeader,HipiImage], not FileInputFormat[HipiImageHeader,FloatImage]. Therefore, hadoopFile[HipiImageHeader,HipiImage,HibInputFormat] should work.
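Spelled out, the suggested change would look like the following sketch, keeping the path and partition count from the question (it assumes org.hipi.image.HipiImage is imported alongside the other HIPI classes):

// Use HipiImage, the value type HibInputFormat declares, instead of FloatImage.
val bundle0 = sc.hadoopFile[HipiImageHeader,HipiImage,HibInputFormat]("hdfs://192.168.199.11:8020/Hdfs/Image/image.hib.dat",1000)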

Thanks for your help. I have already tried that, but it does not work.
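A possible reason the suggestion still fails: both hadoopFile overloads listed in the error bound F to org.apache.hadoop.mapred.InputFormat (the old Hadoop API), while HibInputFormat lives in org.hipi.imagebundle.mapreduce, i.e. the new API. Assuming HibInputFormat is a concrete new-API input format, as its package suggests, sc.newAPIHadoopFile would be the matching Spark entry point; a minimal sketch:

import org.hipi.image.{HipiImage, HipiImageHeader}
import org.hipi.imagebundle.mapreduce.HibInputFormat

// newAPIHadoopFile expects an org.apache.hadoop.mapreduce.InputFormat subclass.
// Note: this overload takes no minPartitions argument.
val bundle = sc.newAPIHadoopFile[HipiImageHeader, HipiImage, HibInputFormat](
  "hdfs://192.168.199.11:8020/Hdfs/Image/image.hib.dat")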