Java Avro: adding a record compression level using Hadoop
I have the following working code that converts JSON to Avro using a provided schema:
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(outputStream, null);
DatumWriter<Object> writer = new GenericDatumWriter<>(schema);
writer.write(jsonAsBinaryData, encoder);
encoder.flush();
return outputStream.toByteArray();
It works fine, though I would still welcome any feedback. I now want to compress the output at a configurable level; here is my attempt using Hadoop's DefaultCodec:
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
DefaultCodec codec = ReflectionUtils.newInstance(DefaultCodec.class, new org.apache.hadoop.conf.Configuration());
OutputStream compressedOutputStream = codec.createOutputStream(outputStream);
BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(compressedOutputStream, null);
DatumWriter<Object> writer = new GenericDatumWriter<>(schema);
writer.write(jsonAsBinaryData, encoder);
encoder.flush();
compressedOutputStream.close(); // finish the codec stream, otherwise the trailing compressed bytes never reach outputStream
return outputStream.toByteArray();
An alternative using Avro's own DeflateCodec, which takes the compression level directly and avoids the Hadoop dependency:

DeflateCodec codec = new DeflateCodec(5); // deflate level 0-9
var out = codec.compress(ByteBuffer.wrap(outputStream.toByteArray()));
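Under the hood, deflate codecs like these wrap `java.util.zip.Deflater`, so if the goal is simply to deflate-compress the serialized Avro bytes at a chosen level, the JDK alone is enough. A minimal, dependency-free sketch (class and method names here are illustrative, not part of any library; the exact byte stream may differ from Avro's `DeflateCodec`, which can use raw/no-wrap deflate):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.Deflater;
import java.util.zip.DeflaterOutputStream;
import java.util.zip.InflaterInputStream;

public class DeflateDemo {

    // Compress bytes at the given level (0-9), the same knob DeflateCodec(5) exposes.
    static byte[] compress(byte[] data, int level) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        Deflater deflater = new Deflater(level);
        // try-with-resources closes the stream, which finishes the deflate
        // trailer -- the same step the Hadoop attempt above was missing.
        try (DeflaterOutputStream dos = new DeflaterOutputStream(bos, deflater)) {
            dos.write(data);
        }
        deflater.end();
        return bos.toByteArray();
    }

    // Inverse operation, for round-trip verification.
    static byte[] decompress(byte[] data) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (InflaterInputStream iis =
                 new InflaterInputStream(new ByteArrayInputStream(data))) {
            iis.transferTo(bos);
        }
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] payload = "json-to-avro encoded bytes ".repeat(64).getBytes();
        byte[] packed = DeflateDemo.compress(payload, 5);
        byte[] unpacked = DeflateDemo.decompress(packed);
        System.out.println(packed.length < payload.length);            // repetitive input shrinks
        System.out.println(java.util.Arrays.equals(payload, unpacked)); // lossless round trip
    }
}
```

Level 5 trades speed for ratio roughly in the middle; levels 1 and 9 sit at the fast and compact extremes, exactly as with `Deflater.BEST_SPEED` and `Deflater.BEST_COMPRESSION`.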