Using Flume to transfer CSV files to HDFS and convert them to Avro


I'm new to big data. My task is to transfer CSV files to HDFS using Flume, but it should also convert those CSVs to Avro. I tried to do that with the following Flume configuration:

a1.channels = dataChannel
a1.sources = dataSource
a1.sinks = dataSink

a1.channels.dataChannel.type = memory
a1.channels.dataChannel.capacity = 1000000
a1.channels.dataChannel.transactionCapacity = 10000

a1.sources.dataSource.type = spooldir
a1.sources.dataSource.spoolDir = {spool_dir}
a1.sources.dataSource.fileHeader = true
a1.sources.dataSource.fileHeaderKey = file
a1.sources.dataSource.basenameHeader = true
a1.sources.dataSource.basenameHeaderKey = basename
a1.sources.dataSource.interceptors.attach-schema.type = static
a1.sources.dataSource.interceptors.attach-schema.key = flume.avro.schema.url
a1.sources.dataSource.interceptors.attach-schema.value = {path_to_schema_in_hdfs}

a1.sinks.dataSink.type = hdfs
a1.sinks.dataSink.hdfs.path = {sink_path}
a1.sinks.dataSink.hdfs.format = text
a1.sinks.dataSink.hdfs.inUsePrefix = .
a1.sinks.dataSink.hdfs.filePrefix = drone
a1.sinks.dataSink.hdfs.fileSuffix = .avro
a1.sinks.dataSink.hdfs.rollSize = 180000000
a1.sinks.dataSink.hdfs.rollCount = 100000
a1.sinks.dataSink.hdfs.rollInterval = 120
a1.sinks.dataSink.hdfs.idleTimeout = 3600
a1.sinks.dataSink.hdfs.fileType = DataStream
a1.sinks.dataSink.serializer = avro_event
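Two notes on this config, as a sketch rather than a verified fix. First, `serializer = avro_event` always wraps events in Flume's built-in `flume_event` schema, which is why the output uses the default schema; the `flume.avro.schema.url` header set by the interceptor is only honored by the schema-aware serializer from the HDFS sink module (class name may vary by Flume version, so verify it against your distribution):

```
a1.sinks.dataSink.serializer = org.apache.flume.sink.hdfs.AvroEventSerializer$Builder
```

Second, a schema file at `{path_to_schema_in_hdfs}` would need field names matching the columns parsed later in this thread. A possible `.avsc` sketch (the record name is an assumption; the field names and types are taken from the `write()` method shown below in the answer):

```json
{
  "type": "record",
  "name": "DroneCityMetrics",
  "fields": [
    {"name": "deviceID",      "type": "long"},
    {"name": "groupID",       "type": "long"},
    {"name": "timeCounter",   "type": "long"},
    {"name": "cityCityName",  "type": "string"},
    {"name": "cityStateCode", "type": "string"},
    {"name": "sessionCount",  "type": "long"},
    {"name": "errorCount",    "type": "long"}
  ]
}
```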
This outputs Avro files with Flume's default schema. I also tried using AvroEventSerializer, but I just got a lot of different errors. I solved all of them except this one:

ERROR hdfs.HDFSEventSink: process failed
java.lang.ExceptionInInitializerError
        at org.apache.hadoop.hdfs.DFSOutputStream.computePacketChunkSize(DFSOutputStream.java:1305)
        at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:1243)
        at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1266)
        at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1101)
        at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1059)
        at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:232)
        at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:75)

Thanks for your help.

Sorry for the errors in the configuration. I fixed them and found a way to convert CSV to Avro. I modified AvroEventSerializer slightly, like this:

public void write(Event event) throws IOException {
    if (dataFileWriter == null) {
        initialize(event);
    }
    // Parse one CSV line from the event body into the Avro record's fields
    String[] items = new String(event.getBody()).split(",");
    city.put("deviceID", Long.parseLong(items[0]));
    city.put("groupID", Long.parseLong(items[1]));
    city.put("timeCounter", Long.parseLong(items[2]));
    city.put("cityCityName", items[3]);
    city.put("cityStateCode", items[4]);
    city.put("sessionCount", Long.parseLong(items[5]));
    city.put("errorCount", Long.parseLong(items[6]));
    dataFileWriter.append(city);
}
and here is the definition of city:

private GenericRecord city = null;
If you know a better way, please reply.
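One weakness of the `write()` method above is that a short or malformed CSV line will throw `ArrayIndexOutOfBoundsException` or `NumberFormatException` in the middle of a sink transaction. A minimal sketch (plain Java, no Flume/Avro dependencies; `CsvLineParser` and `EXPECTED_FIELDS` are hypothetical names, and the column layout mirrors the record above) of validating a line before populating the record:

```java
import java.util.Optional;

public class CsvLineParser {
    // Seven columns, matching the record fields populated above
    public static final int EXPECTED_FIELDS = 7;

    // Indices of the columns that must parse as longs (deviceID, groupID,
    // timeCounter, sessionCount, errorCount)
    private static final int[] NUMERIC_COLUMNS = {0, 1, 2, 5, 6};

    /** Splits and validates one CSV line; returns empty on malformed input. */
    public static Optional<String[]> parse(String line) {
        // limit = -1 keeps trailing empty fields so the column count is exact
        String[] items = line.split(",", -1);
        if (items.length != EXPECTED_FIELDS) {
            return Optional.empty();
        }
        try {
            for (int i : NUMERIC_COLUMNS) {
                Long.parseLong(items[i].trim());
            }
        } catch (NumberFormatException e) {
            return Optional.empty();
        }
        return Optional.of(items);
    }

    public static void main(String[] args) {
        System.out.println(parse("1,2,3,Austin,TX,10,0").isPresent()); // true
        System.out.println(parse("1,2,3,Austin,TX,10").isPresent());   // false: too few fields
        System.out.println(parse("x,2,3,Austin,TX,10,0").isPresent()); // false: bad number
    }
}
```

With this, `write()` could skip (or route to an error directory) any event whose body fails validation instead of crashing the whole batch.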