How do I read a CSV file in Map/Reduce?


I have a large, comma-delimited CSV file, about 6 GB in size. Below is my mapper function:

@Override
public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
    String[] tokens = value.toString().split(",");

    String crimeType = tokens[5].trim();   // field index 5 of the CSV holds the crime type, which serves as the key
//  int year = Integer.parseInt(tokens[17].trim()); // the year the crime happened (field index 17)

    int year = 2010;

    CrimeTypeKey crimeTypeYearKey = new CrimeTypeKey(crimeType, year);

    context.write(crimeTypeYearKey, ONE);
}

As you can see, I use .split(",") to break each line into its fields (columns?). I would like to know how to use OpenCSV in this situation instead. Could you please show me an example? Thanks very much.
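
(For context, CrimeTypeKey is a custom composite key built from the crime type and the year; its implementation is not shown in the question. A minimal WritableComparable sketch of what such a key might look like, assuming it simply pairs the two fields, is:)

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.hadoop.io.WritableComparable;

// Composite map-output key: (crime type, year). Details may differ from the asker's real class.
public class CrimeTypeKey implements WritableComparable<CrimeTypeKey> {

    private String crimeType;
    private int year;

    public CrimeTypeKey() { }                      // no-arg constructor required by Hadoop for deserialization

    public CrimeTypeKey(String crimeType, int year) {
        this.crimeType = crimeType;
        this.year = year;
    }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeUTF(crimeType);
        out.writeInt(year);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        crimeType = in.readUTF();
        year = in.readInt();
    }

    @Override
    public int compareTo(CrimeTypeKey other) {
        int cmp = crimeType.compareTo(other.crimeType);
        return cmp != 0 ? cmp : Integer.compare(year, other.year);
    }

    @Override
    public int hashCode() {                        // used by the default HashPartitioner
        return crimeType.hashCode() * 31 + year;
    }

    @Override
    public boolean equals(Object o) {
        if (!(o instanceof CrimeTypeKey)) return false;
        CrimeTypeKey k = (CrimeTypeKey) o;
        return year == k.year && crimeType.equals(k.crimeType);
    }
}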

In an efficient way, probably not. Is there a particular reason you want to use OpenCSV?
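
That said, if the data contains quoted fields with embedded commas, a plain split(",") will break them apart, while OpenCSV's CSVParser.parseLine() handles the quoting and can be called on each record inside map(). A minimal sketch, assuming the OpenCSV jar is on the job's classpath and reusing the CrimeTypeKey from the question (the CrimeCountMapper class name is made up here):

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

import com.opencsv.CSVParser;   // recent OpenCSV releases; older ones use au.com.bytecode.opencsv

public class CrimeCountMapper extends Mapper<LongWritable, Text, CrimeTypeKey, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);

    // One parser per mapper instance; records are processed sequentially, so reuse is safe
    // and avoids constructing a parser for every input line.
    private final CSVParser parser = new CSVParser();   // defaults: comma separator, double-quote character

    @Override
    public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
        // parseLine() keeps quoted fields intact, e.g. "ROBBERY, ARMED" stays one token.
        String[] tokens = parser.parseLine(value.toString());

        String crimeType = tokens[5].trim();   // field index 5 holds the crime type
        int year = 2010;                       // placeholder year, as in the original snippet

        context.write(new CrimeTypeKey(crimeType, year), ONE);
    }
}

Note that TextInputFormat still hands the mapper one physical line at a time, so records whose quoted fields span multiple lines will not be parsed correctly by this approach.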