Details
- Type: Bug
- Status: Reopened
- Priority: Minor
- Resolution: Unresolved
- Affects Version/s: 0.12.0, 0.12.1, 0.13.0, 0.13.1, 0.14.0
- Fix Version/s: None
- Labels: None
- Environment: Hive 0.12 CDH 5.1.0, Hadoop 2.3.0 CDH 5.1.0
- Component/s: Parquet
Description
When reading a Parquet file whose original Thrift schema contains a struct with an enum, Hive fails with the following error (full stack trace below):
java.lang.NoSuchFieldError: DECIMAL
Example Thrift Schema:
enum MyEnumType {
  EnumOne,
  EnumTwo,
  EnumThree
}

struct MyStruct {
  1: optional MyEnumType myEnumType;
  2: optional string field2;
  3: optional string field3;
}

struct outerStruct {
  1: optional list<MyStruct> myStructs
}
Hive Table:
CREATE EXTERNAL TABLE mytable (
  mystructs array<struct<myenumtype: string, field2: string, field3: string>>
)
ROW FORMAT SERDE 'parquet.hive.serde.ParquetHiveSerDe'
STORED AS
  INPUTFORMAT 'parquet.hive.DeprecatedParquetInputFormat'
  OUTPUTFORMAT 'parquet.hive.DeprecatedParquetOutputFormat';
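For reference, any query that reads the table exercises the Parquet record-reader path and reproduces the failure; a minimal example (assuming the table above is backed by the Thrift-generated Parquet files):

-- Hypothetical repro query: reading any row forces Hive to build the
-- Parquet converters, which is where NoSuchFieldError: DECIMAL is thrown.
SELECT mystructs FROM mytable LIMIT 1;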
Error stack trace (Hive 0.12):
Caused by: java.lang.NoSuchFieldError: DECIMAL
at org.apache.hadoop.hive.ql.io.parquet.convert.ETypeConverter.getNewConverter(ETypeConverter.java:146)
at org.apache.hadoop.hive.ql.io.parquet.convert.HiveGroupConverter.getConverterFromDescription(HiveGroupConverter.java:31)
at org.apache.hadoop.hive.ql.io.parquet.convert.ArrayWritableGroupConverter.<init>(ArrayWritableGroupConverter.java:45)
at org.apache.hadoop.hive.ql.io.parquet.convert.HiveGroupConverter.getConverterFromDescription(HiveGroupConverter.java:34)
at org.apache.hadoop.hive.ql.io.parquet.convert.DataWritableGroupConverter.<init>(DataWritableGroupConverter.java:64)
at org.apache.hadoop.hive.ql.io.parquet.convert.DataWritableGroupConverter.<init>(DataWritableGroupConverter.java:47)
at org.apache.hadoop.hive.ql.io.parquet.convert.HiveGroupConverter.getConverterFromDescription(HiveGroupConverter.java:36)
at org.apache.hadoop.hive.ql.io.parquet.convert.DataWritableGroupConverter.<init>(DataWritableGroupConverter.java:64)
at org.apache.hadoop.hive.ql.io.parquet.convert.DataWritableGroupConverter.<init>(DataWritableGroupConverter.java:40)
at org.apache.hadoop.hive.ql.io.parquet.convert.DataWritableRecordConverter.<init>(DataWritableRecordConverter.java:32)
at org.apache.hadoop.hive.ql.io.parquet.read.DataWritableReadSupport.prepareForRead(DataWritableReadSupport.java:128)
at parquet.hadoop.InternalParquetRecordReader.initialize(InternalParquetRecordReader.java:142)
at parquet.hadoop.ParquetRecordReader.initializeInternalReader(ParquetRecordReader.java:118)
at parquet.hadoop.ParquetRecordReader.initialize(ParquetRecordReader.java:107)
at org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>(ParquetRecordReaderWrapper.java:92)
at org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>(ParquetRecordReaderWrapper.java:66)
at org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat.getRecordReader(MapredParquetInputFormat.java:51)
at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.<init>(CombineHiveRecordReader.java:65)
... 16 more
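For context (not part of the original report): a NoSuchFieldError at run time typically means ETypeConverter was compiled against a parquet jar whose enum of logical types includes a DECIMAL constant, while the jar actually on the cluster classpath is older and lacks it. Below is a minimal, self-contained sketch of that failure mode, using a hypothetical stand-in enum rather than the real parquet classes:

```java
// Sketch of the NoSuchFieldError failure mode. OriginalTypeOld is a
// hypothetical stand-in for the enum shipped in an older parquet jar.
public class DecimalProbe {
    // Pretend this is the old jar on the cluster: no DECIMAL constant.
    enum OriginalTypeOld { UTF8, MAP, LIST }

    // Look the constant up by name instead of referencing
    // OriginalTypeOld.DECIMAL directly; a direct reference is resolved at
    // class-link time and throws NoSuchFieldError when the constant is absent.
    static boolean hasConstant(Class<? extends Enum<?>> e, String name) {
        for (Enum<?> c : e.getEnumConstants()) {
            if (c.name().equals(name)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // Hive's converter assumes DECIMAL exists; on the old jar it does not.
        System.out.println(hasConstant(OriginalTypeOld.class, "DECIMAL")); // prints false
    }
}
```

Upgrading the parquet jars on the classpath so they match the version Hive was built against avoids the mismatch.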
Attachments
Issue Links
- is related to HIVE-6367 Implement Decimal in ParquetSerde (Closed)