Details
Type: Bug
Status: Open
Priority: Major
Resolution: Unresolved
Description
Given this query:
CREATE TABLE map_of_doubles (id int, doubleMap map<double, double>) stored as orc;
insert overwrite table map_of_doubles
  SELECT 1, MAP(CAST(1.0 as DOUBLE), null, CAST(2.0 as DOUBLE), CAST(3.0 as DOUBLE));
select id, doubleMap from map_of_doubles;
select id, doubleMap[1] from map_of_doubles group by id, doubleMap[1]; -- this fails
the error is:
Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.ql.exec.vector.DoubleColumnVector cannot be cast to org.apache.hadoop.hive.ql.exec.vector.LongColumnVector
at org.apache.hadoop.hive.ql.exec.vector.expressions.VectorUDFMapIndexLongScalar.findScalarInMap(VectorUDFMapIndexLongScalar.java:67)
at org.apache.hadoop.hive.ql.exec.vector.expressions.VectorUDFMapIndexBaseScalar.evaluate(VectorUDFMapIndexBaseScalar.java:132)
at org.apache.hadoop.hive.ql.exec.vector.VectorSelectOperator.process(VectorSelectOperator.java:146)
... 23 more
I found this error while I was trying to write qtest cases for all data types in HIVE-23688, so this issue needs to be addressed first.
HIVE-23688 is Parquet-specific, while this one is not: it can be reproduced with both ORC and Parquet.
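The stack trace suggests that VectorUDFMapIndexLongScalar.findScalarInMap downcasts the map's key vector to LongColumnVector without checking its runtime type, while a map<double, double> column carries its keys in a DoubleColumnVector. The following is a minimal sketch with simplified stand-in classes (not Hive's actual API) that reproduces the failing cast pattern and shows a guarded variant that dispatches on the actual vector type:

```java
// Simplified stand-ins for Hive's column vector classes, for illustration only.
class ColumnVector {}

class LongColumnVector extends ColumnVector {
  long[] vector = new long[8];
}

class DoubleColumnVector extends ColumnVector {
  double[] vector = new double[8];
}

public class MapIndexCastSketch {
  // Mirrors the failing pattern: an unconditional downcast of the key vector,
  // which throws ClassCastException when the map keys are doubles.
  static long unsafeLookup(ColumnVector keys, int i) {
    return ((LongColumnVector) keys).vector[i];
  }

  // A guarded variant (hypothetical fix shape): dispatch on the runtime type
  // of the key vector instead of assuming long keys.
  static double safeLookup(ColumnVector keys, int i) {
    if (keys instanceof LongColumnVector) {
      return ((LongColumnVector) keys).vector[i];
    } else if (keys instanceof DoubleColumnVector) {
      return ((DoubleColumnVector) keys).vector[i];
    }
    throw new IllegalStateException("unsupported key vector: " + keys.getClass());
  }

  public static void main(String[] args) {
    ColumnVector doubleKeys = new DoubleColumnVector();
    try {
      unsafeLookup(doubleKeys, 0);
      System.out.println("no exception");
    } catch (ClassCastException e) {
      System.out.println("ClassCastException"); // the failure seen in the report
    }
    System.out.println(safeLookup(doubleKeys, 0)); // 0.0
  }
}
```

This is only a sketch of the failure mode; the real fix would live in the vectorized map-index expressions and would need to cover all supported key types, not just long and double.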
Issue Links
- relates to: HIVE-23688 Vectorization: IndexArrayOutOfBoundsException For map type column which includes null value (Closed)