Java: cannot cast org.apache.hadoop.hive.ql.exec.vector.DecimalColumnVector to org.apache.hadoop.hive.ql.exec.vector.LongColumnVector
Tags: java, hadoop, hive, orc

I am trying to execute the code snippet below. lineArray is a String array, but the code fails with java.lang.ClassCastException: org.apache.hadoop.hive.ql.exec.vector.DecimalColumnVector cannot be cast to org.apache.hadoop.hive.ql.exec.vector.LongColumnVector. Please help.
// column: bigint
if (getTypes(column, struct).equalsIgnoreCase("bigint")) {
    if (!lineArray[column].isEmpty()) {
        try {
            ((LongColumnVector) batch.cols[column]).vector[row] = Long.parseLong(lineArray[column]);
        } catch (NumberFormatException e) {
            // Fallback: value is not a valid long, so store it as a decimal instead
            HiveDecimal hiveDecimal = HiveDecimal.create(lineArray[column]);
            batch.cols[column] = new DecimalColumnVector(lineArray[column].length(), 0);
            ((DecimalColumnVector) batch.cols[column]).vector[row] = new HiveDecimalWritable(hiveDecimal);
        } catch (Exception e) {
            e.printStackTrace();
        }
    } else {
        // Empty string: mark the row as null
        ((LongColumnVector) batch.cols[column]).noNulls = false;
        ((LongColumnVector) batch.cols[column]).isNull[row] = true;
        ((LongColumnVector) batch.cols[column]).vector[row] = LongColumnVector.NULL_VALUE;
        // ((CustomLongColumnVector) batch.cols[column]).fillWithNulls();
    }
}
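A likely cause: the catch (NumberFormatException) branch replaces batch.cols[column] with a new DecimalColumnVector. From that row on, the column vector no longer matches the bigint schema, so the next unconditional cast to LongColumnVector for the same column throws exactly the reported ClassCastException. The sketch below reproduces the mechanism with stub classes standing in for the real Hive ones (these stubs are illustrative only, not the Hive API):

```java
// Stub hierarchy standing in for the Hive column-vector classes.
class ColumnVector {}

class LongColumnVector extends ColumnVector {
    long[] vector = new long[1024];
}

class DecimalColumnVector extends ColumnVector {
    Object[] vector = new Object[1024];
}

public class CastDemo {
    public static void main(String[] args) {
        // The batch starts with a LongColumnVector, as the bigint schema dictates.
        ColumnVector[] cols = { new LongColumnVector() };

        // Row 0: "abc" is not a long, so the column vector gets swapped out,
        // mimicking the catch (NumberFormatException) branch in the question.
        try {
            ((LongColumnVector) cols[0]).vector[0] = Long.parseLong("abc");
        } catch (NumberFormatException e) {
            cols[0] = new DecimalColumnVector(); // schema and vector now disagree
            ((DecimalColumnVector) cols[0]).vector[0] = "1.23";
        }

        // Row 1: a valid long arrives, but cols[0] is now a DecimalColumnVector,
        // so the unconditional cast fails like the reported stack trace.
        try {
            ((LongColumnVector) cols[0]).vector[1] = Long.parseLong("42");
        } catch (ClassCastException e) {
            System.out.println("ClassCastException reproduced");
        }
    }
}
```

The usual remedy is to keep each column vector's type fixed by the schema for the whole batch: if the column is declared bigint, reject non-integral input (or mark the row null) instead of swapping in a decimal vector; if decimal values are expected, declare the column as decimal in the ORC schema so the writer creates a DecimalColumnVector for it from the start.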