Java: runtime type error in RDD when reading from Avro with a custom serializer

I am trying to use Kryo to read data from an Avro file into an RDD. My code compiles fine, but at runtime I get a `ClassCastException`. Here is what my code does:

SparkConf conf = new SparkConf()...
conf.set("spark.serializer", KryoSerializer.class.getCanonicalName());
conf.set("spark.kryo.registrator", MyKryoRegistrator.class.getName());
JavaSparkContext sc = new JavaSparkContext(conf);
where `MyKryoRegistrator` registers a serializer for `MyCustomClass`:

public void registerClasses(Kryo kryo) {
    kryo.register(MyCustomClass.class, new MyCustomClassSerializer());
}
Then I read my data file:

JavaPairRDD<MyCustomClass, NullWritable> records =
                sc.newAPIHadoopFile("file:/path/to/datafile.avro",
                AvroKeyInputFormat.class, MyCustomClass.class, NullWritable.class,
                sc.hadoopConfiguration());
Tuple2<MyCustomClass, NullWritable> first = records.first();
and I get an exception:

java.lang.ClassCastException: org.apache.avro.mapred.AvroKey cannot be cast to my.package.containing.MyCustomClass

Am I doing something wrong? And even so, shouldn't this be a compile-time error rather than a runtime error?

*************EDIT**************

I managed to work out what happens when loading custom objects from the avro file. It turns out that if the avro library cannot load the data into the custom class, it returns `GenericData$Record` objects instead. In that case the Spark Java API does not check the assignment to the custom class, which is why you only get the `ClassCastException` when you try to access the `AvroKey`'s datum. This violates the type-safety guarantee.
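The deferred failure is ordinary Java type-erasure behavior: an unchecked generic cast compiles and even executes cleanly, and the JVM's `checkcast` only fires when the mismatched element is actually read out. A minimal JDK-only sketch of the same effect (no Spark or Avro involved; the names here are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    public static void main(String[] args) {
        List<Object> raw = new ArrayList<>();
        raw.add(new Object()); // element that is NOT a String

        // The cast itself succeeds at runtime: generic type arguments are
        // erased, so nothing verifies that the element is really a String.
        @SuppressWarnings("unchecked")
        List<String> strings = (List<String>) (List<?>) raw;
        System.out.println("cast succeeded");

        try {
            // The implicit checkcast on the element fails only here,
            // at the point of access -- just like reading the AvroKey datum.
            String s = strings.get(0);
            System.out.println("unreachable: " + s);
        } catch (ClassCastException e) {
            System.out.println("ClassCastException on access");
        }
    }
}
```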


*************EDIT**************

For anyone else trying to do this: I have a workaround for this problem, but it is not the right solution. I created a class that reads `GenericData.Record` objects from the avro file:

public class GenericRecordFileInputFormat extends FileInputFormat<GenericData.Record, NullWritable> {
    private static final Logger LOG = LoggerFactory.getLogger(GenericRecordFileInputFormat.class);

    /**
     * {@inheritDoc}
     */
    @Override
    public RecordReader<GenericData.Record, NullWritable> createRecordReader(
            InputSplit split, TaskAttemptContext context) throws IOException, InterruptedException {
        Schema readerSchema = AvroJob.getInputKeySchema(context.getConfiguration());
        if (null == readerSchema) {
            LOG.warn("Reader schema was not set. Use AvroJob.setInputKeySchema() if desired.");
            LOG.info("Using a reader schema equal to the writer schema.");
        }
        return new GenericDataRecordReader(readerSchema);
    }


    public static class GenericDataRecordReader extends RecordReader<GenericData.Record, NullWritable> {

        AvroKeyRecordReader<GenericData.Record> avroReader;

        public GenericDataRecordReader(Schema readerSchema) {
            super();
            avroReader = new AvroKeyRecordReader<>(readerSchema);
        }

        @Override
        public void initialize(InputSplit inputSplit, TaskAttemptContext taskAttemptContext) throws IOException, InterruptedException {
            avroReader.initialize(inputSplit, taskAttemptContext);
        }

        @Override
        public boolean nextKeyValue() throws IOException, InterruptedException {
            return avroReader.nextKeyValue();
        }

        @Override
        public GenericData.Record getCurrentKey() throws IOException, InterruptedException {
            AvroKey<GenericData.Record> currentKey = avroReader.getCurrentKey();
            return currentKey.datum();
        }

        @Override
        public NullWritable getCurrentValue() throws IOException, InterruptedException {
            return avroReader.getCurrentValue();
        }

        @Override
        public float getProgress() throws IOException, InterruptedException {
            return avroReader.getProgress();
        }

        @Override
        public void close() throws IOException {
            avroReader.close();
        }
    }
}
Then I load the records:

JavaRDD<GenericData.Record> records = sc.newAPIHadoopFile("file:/path/to/datafile.avro",
                GenericRecordFileInputFormat.class, GenericData.Record.class, NullWritable.class,
                sc.hadoopConfiguration()).keys();
Then I convert the records to my custom class using a constructor that accepts a `GenericData.Record`.

Again: not pretty, but it works.
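The conversion step itself is just a `map` over the RDD, e.g. `records.map(MyCustomClass::new)`. Since `GenericData.Record` is essentially a schema-backed field container, the constructor boils down to pulling fields out by name and casting them. A runnable sketch with a plain `Map` standing in for the Avro record (the field names and the shape of `MyCustomClass` here are hypothetical, not from the question):

```java
import java.util.Map;

public class ConvertDemo {
    // Hypothetical custom class; in the real code its constructor would
    // take an org.apache.avro.generic.GenericData.Record and call
    // record.get("fieldName") instead of Map.get.
    static class MyCustomClass {
        final String name;
        final int count;

        MyCustomClass(Map<String, Object> record) {
            this.name = (String) record.get("name");
            this.count = (Integer) record.get("count");
        }
    }

    public static void main(String[] args) {
        // Stand-in for one GenericData.Record read from the file.
        Map<String, Object> record = Map.of("name", "foo", "count", 3);
        MyCustomClass obj = new MyCustomClass(record);
        System.out.println(obj.name + ":" + obj.count);
    }
}
```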
