Apache Pig Avro: java.lang.RuntimeException: Unsupported type in record


Input: test.csv

100
101
102
Pig script:

REGISTER required jars are registered;

A = LOAD 'test.csv'  USING org.apache.pig.piggybank.storage.CSVExcelStorage() AS (code:chararray);

STORE A INTO 'test' USING org.apache.pig.piggybank.storage.avro.AvroStorage
    ('schema',
    '{"namespace":"com.pig.test.avro","type":"record","name":"Avro_Test","doc":"Avro Test Schema",
        "fields":[
            {"name":"code","type":["string","null"],"default":null}
            ]}'
    );
Error log:

ERROR org.apache.pig.tools.pigstats.SimplePigStats - ERROR 2997: Unable to recreate exception from backed error: org.apache.avro.file.DataFileWriter$AppendWriteException: java.lang.RuntimeException: Unsupported type in record:class java.lang.String
at org.apache.avro.file.DataFileWriter.append(DataFileWriter.java:263)
at org.apache.pig.piggybank.storage.avro.PigAvroRecordWriter.write(PigAvroRecordWriter.java:49)
at org.apache.pig.piggybank.storage.avro.AvroStorage.putNext(AvroStorage.java:722)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:139)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:98)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:558)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:85)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:106)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigMapOnly$Map.collect(PigMap
2015-06-02 23:06:03,934 [main] ERROR org.apache.pig.tools.pigstats.PigStatsUtil - 1 map reduce job(s) failed!  
2015-06-02 23:06:03,934 [main] INFO  org.apache.pig.tools.pigstats.SimplePigStats - Script Statistics: 
I am getting this runtime error at the STORE step. Any input on resolving this issue would be appreciated.


This looks like a bug:


If you can, try updating to Pig 0.14; according to the comments, this issue has been resolved there.
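If you do upgrade, recent Pig versions also ship a builtin `org.apache.pig.builtin.AvroStorage`, which can replace the piggybank class. A minimal sketch, reusing the schema from the question (the exact constructor arguments may vary by Pig version, so check the docs for your release):

```
A = LOAD 'test.csv' USING org.apache.pig.piggybank.storage.CSVExcelStorage() AS (code:chararray);

-- builtin AvroStorage accepts the Avro schema JSON directly as its first argument
STORE A INTO 'test' USING org.apache.pig.builtin.AvroStorage(
    '{"namespace":"com.pig.test.avro","type":"record","name":"Avro_Test","doc":"Avro Test Schema",
        "fields":[
            {"name":"code","type":["string","null"],"default":null}
            ]}'
    );
```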

Thanks for looking into this. As suggested in the comments, we tried introducing one more field in the input (instead of a single field). With that change, we no longer see this issue.
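The workaround described above — making the record contain more than one field — can be sketched as follows. The second column name `filler` is an assumption for illustration; any additional column in the input and a matching entry in the Avro schema should have the same effect:

```
A = LOAD 'test.csv' USING org.apache.pig.piggybank.storage.CSVExcelStorage()
        AS (code:chararray, filler:chararray);

-- schema now declares two fields, so the record is no longer single-field
STORE A INTO 'test' USING org.apache.pig.piggybank.storage.avro.AvroStorage
    ('schema',
    '{"namespace":"com.pig.test.avro","type":"record","name":"Avro_Test","doc":"Avro Test Schema",
        "fields":[
            {"name":"code","type":["string","null"],"default":null},
            {"name":"filler","type":["string","null"],"default":null}
            ]}'
    );
```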