Java Avro: add a record compression level using Hadoop
Tags: java, hadoop, compression, avro


I have the following working code that converts JSON to Avro using a provided schema:

 ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
 BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(outputStream, null);
 DatumWriter<Object> writer = new GenericDatumWriter<>(schema);

 writer.write(jsonAsBinaryData, encoder);
 encoder.flush();

 return outputStream.toByteArray();
It works fine, though I would still welcome any feedback on it. I then tried to add compression using Hadoop's DefaultCodec:

 ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
 DefaultCodec codec = ReflectionUtils.newInstance(DefaultCodec.class, new org.apache.hadoop.conf.Configuration());
 CompressionOutputStream compressedOutputStream = codec.createOutputStream(outputStream);
 BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(compressedOutputStream, null);
 DatumWriter<Object> writer = new GenericDatumWriter<>(schema);

 writer.write(jsonAsBinaryData, encoder);
 encoder.flush();
 // finish() is required: without it the codec's buffered data and trailer
 // never reach the underlying byte array
 compressedOutputStream.finish();

 return outputStream.toByteArray();
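Whichever codec is used, the compressed stream must be finished before the underlying byte array is read, since flushing only the Avro encoder leaves the codec's trailer unwritten. That behavior can be illustrated with the JDK's own DeflaterOutputStream, used here purely as a stand-in for the Hadoop codec stream (this sketch does not use the Hadoop classes):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.DeflaterOutputStream;
import java.util.zip.InflaterInputStream;

public class FinishDemo {
    // Compress bytes; the result is only complete once the deflater stream is finished.
    static byte[] compress(byte[] data) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        DeflaterOutputStream dos = new DeflaterOutputStream(bos);
        dos.write(data);
        dos.finish(); // flush() alone is not enough: finish() emits the deflate trailer
        return bos.toByteArray();
    }

    // Decompress for the round-trip check.
    static byte[] decompress(byte[] data) throws IOException {
        InflaterInputStream iis = new InflaterInputStream(new ByteArrayInputStream(data));
        return iis.readAllBytes();
    }

    public static void main(String[] args) throws IOException {
        byte[] original = "some avro binary payload".getBytes();
        byte[] roundTrip = decompress(compress(original));
        System.out.println(java.util.Arrays.equals(original, roundTrip)); // prints "true"
    }
}
```

Hadoop's CompressionOutputStream exposes the same idea through its finish() method.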
Alternatively, Avro's built-in DeflateCodec accepts a compression level directly, so the Hadoop codec machinery is not needed:

 DeflateCodec codec = new DeflateCodec(5);
 var out = codec.compress(ByteBuffer.wrap(outputStream.toByteArray()));
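The level argument passed to DeflateCodec(5) is a standard zlib compression level (0-9). Its effect can be sketched with the JDK's own Deflater; this is an approximation of what the codec does internally, not Avro's exact byte framing:

```java
import java.util.zip.Deflater;
import java.util.zip.Inflater;

public class DeflateLevelDemo {
    // Compress with an explicit level (0-9), analogous to DeflateCodec's constructor argument.
    static byte[] deflate(byte[] data, int level) {
        Deflater deflater = new Deflater(level);
        deflater.setInput(data);
        deflater.finish();
        byte[] buf = new byte[data.length * 2 + 64]; // ample for one-shot deflate of small input
        int n = deflater.deflate(buf);
        deflater.end();
        return java.util.Arrays.copyOf(buf, n);
    }

    // Inflate back, given the known original length.
    static byte[] inflate(byte[] data, int originalLength) throws Exception {
        Inflater inflater = new Inflater();
        inflater.setInput(data);
        byte[] out = new byte[originalLength];
        int n = inflater.inflate(out);
        inflater.end();
        return java.util.Arrays.copyOf(out, n);
    }

    public static void main(String[] args) throws Exception {
        byte[] payload = "payload payload payload ".repeat(10).getBytes();
        byte[] compressed = deflate(payload, 5);
        System.out.println(compressed.length < payload.length); // prints "true"
        System.out.println(java.util.Arrays.equals(payload, inflate(compressed, payload.length)));
    }
}
```

Higher levels trade CPU time for a smaller output; level 5 is a common middle ground.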