Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Invalid
- Affects Version/s: 1.6.0
- Fix Version/s: None
- Component/s: None
- Labels: None
Description
I backported ORC-189 to my own branch and ran the tests in Hive. After applying ORC-189, I am getting the following exception in a test related to schema evolution from double to timestamp:
Caused by: java.io.IOException: Error reading file: file:/Users/jcamachorodriguez/src/workspaces/hive/itests/qtest/target/localfs/warehouse/part_change_various_various_timestamp_n6/part=1/000000_0
	at org.apache.orc.impl.RecordReaderImpl.nextBatch(RecordReaderImpl.java:1289)
	at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.ensureBatch(RecordReaderImpl.java:87)
	at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.hasNext(RecordReaderImpl.java:103)
	at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$OrcRecordReader.next(OrcInputFormat.java:252)
	at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$OrcRecordReader.next(OrcInputFormat.java:227)
	at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.doNext(HiveContextAwareRecordReader.java:361)
	... 23 more
Caused by: java.io.EOFException: Read past EOF for compressed stream Stream for column 7 kind DATA position: 15 length: 15 range: 0 offset: 122 limit: 122 range 0 = 0 to 15 uncompressed: 12 to 12
	at org.apache.orc.impl.SerializationUtils.readFully(SerializationUtils.java:125)
	at org.apache.orc.impl.SerializationUtils.readLongLE(SerializationUtils.java:108)
	at org.apache.orc.impl.SerializationUtils.readDouble(SerializationUtils.java:104)
	at org.apache.orc.impl.TreeReaderFactory$DoubleTreeReader.nextVector(TreeReaderFactory.java:783)
	at org.apache.orc.impl.ConvertTreeReaderFactory$TimestampFromDoubleTreeReader.nextVector(ConvertTreeReaderFactory.java:1883)
	at org.apache.orc.impl.TreeReaderFactory$StructTreeReader.nextBatch(TreeReaderFactory.java:2012)
	at org.apache.orc.impl.RecordReaderImpl.nextBatch(RecordReaderImpl.java:1282)
	... 28 more
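For context, the failing path is ORC's schema-evolution conversion from DOUBLE to TIMESTAMP: ConvertTreeReaderFactory$TimestampFromDoubleTreeReader delegates to DoubleTreeReader, which then reads past the end of the compressed DATA stream. The following is a minimal standalone sketch of how that conversion path is exercised with the core ORC API, not the Hive qtest that produced the trace above; the file path, schema, and values are hypothetical.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hive.ql.exec.vector.DoubleColumnVector;
import org.apache.hadoop.hive.ql.exec.vector.TimestampColumnVector;
import org.apache.hadoop.hive.ql.exec.vector.VectorizedRowBatch;
import org.apache.orc.OrcFile;
import org.apache.orc.Reader;
import org.apache.orc.RecordReader;
import org.apache.orc.TypeDescription;
import org.apache.orc.Writer;

public class DoubleToTimestampEvolution {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Path path = new Path("/tmp/double_to_timestamp.orc");  // hypothetical location

    // Write a file whose column 'c' is DOUBLE (seconds since epoch, fractional part allowed).
    TypeDescription fileSchema = TypeDescription.fromString("struct<c:double>");
    Writer writer = OrcFile.createWriter(path,
        OrcFile.writerOptions(conf).setSchema(fileSchema));
    VectorizedRowBatch writeBatch = fileSchema.createRowBatch();
    DoubleColumnVector doubles = (DoubleColumnVector) writeBatch.cols[0];
    doubles.vector[writeBatch.size++] = 1500000000.25d;
    writer.addRowBatch(writeBatch);
    writer.close();

    // Read it back with a TIMESTAMP reader schema, which routes the column through
    // ConvertTreeReaderFactory$TimestampFromDoubleTreeReader (the frame in the stack trace).
    TypeDescription readSchema = TypeDescription.fromString("struct<c:timestamp>");
    Reader reader = OrcFile.createReader(path, OrcFile.readerOptions(conf));
    RecordReader rows = reader.rows(reader.options().schema(readSchema));
    VectorizedRowBatch readBatch = readSchema.createRowBatch();
    while (rows.nextBatch(readBatch)) {
      TimestampColumnVector ts = (TimestampColumnVector) readBatch.cols[0];
      for (int r = 0; r < readBatch.size; ++r) {
        System.out.println(ts.asScratchTimestamp(r));
      }
    }
    rows.close();
    reader.close();
  }
}
```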
Attachments
Issue Links
- relates to: ORC-189 Add timestamp with local timezone (Closed)