Hadoop: "null of string in field x" when storing as Avro


My Pig job suddenly started failing with this error when I try to write the data out in Avro format. Looking at the data I want to write, every record has field x populated, and when I add a "filter by x is not null" clause to the script (sketched below) I still get the same error.
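This is roughly the filter I added; the relation names, load/store paths, and schema are simplified placeholders, but the actual script uses the piggybank AvroStorage that appears in the stack trace:

    -- minimal sketch; 'input' / 'output' and the relation names are placeholders
    records  = LOAD 'input' USING org.apache.pig.piggybank.storage.avro.AvroStorage();
    -- drop any records where field x is missing before storing as Avro
    filtered = FILTER records BY x IS NOT NULL;
    STORE filtered INTO 'output' USING org.apache.pig.piggybank.storage.avro.AvroStorage();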

Does anyone know what could be causing this?

java.io.IOException: org.apache.avro.file.DataFileWriter$AppendWriteException: java.lang.NullPointerException: null of string in field x
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapReduce$Reduce.runPipeline(PigGenericMapReduce.java:470)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapReduce$Reduce.processOnePackageOutput(PigGenericMapReduce.java:433)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapReduce$Reduce.reduce(PigGenericMapReduce.java:413)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapReduce$Reduce.reduce(PigGenericMapReduce.java:257)
    at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:177)
    at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:649)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:418)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1199)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: org.apache.avro.file.DataFileWriter$AppendWriteException: java.lang.NullPointerException: null of string in field itemId of info of info of array of array in field articleBag of articles of articles
    at org.apache.avro.file.DataFileWriter.append(DataFileWriter.java:263)
    at org.apache.pig.piggybank.storage.avro.PigAvroRecordWriter.write(PigAvroRecordWriter.java:49)
    at org.apache.pig.piggybank.storage.avro.AvroStorage.putNext(AvroStorage.java:638)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:139)
    at

Show your Pig script. @user1436111 Hey, I'm getting the same error. Were you able to solve it?