Apache Drill / DRILL-4410

ListVector causes OversizedAllocationException


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.6.0
    • Component/s: Server
    • Labels: None

    Description

      Reading a large data set containing an array/list column causes the following error. This happens when the union type is enabled.

      (org.apache.drill.exec.exception.OversizedAllocationException) Unable to expand the buffer. Max allowed buffer size is reached.
      org.apache.drill.exec.vector.UInt1Vector.reAlloc():214
      org.apache.drill.exec.vector.UInt1Vector$Mutator.setSafe():406
      org.apache.drill.exec.vector.complex.ListVector$Mutator.setNotNull():298
      org.apache.drill.exec.vector.complex.ListVector$Mutator.startNewValue():307
      org.apache.drill.exec.vector.complex.impl.UnionListWriter.startList():563
      org.apache.drill.exec.vector.complex.impl.ComplexCopier.writeValue():115
      org.apache.drill.exec.vector.complex.impl.ComplexCopier.copy():100
      org.apache.drill.exec.vector.complex.ListVector.copyFrom():97
      org.apache.drill.exec.vector.complex.ListVector.copyFromSafe():89
      org.apache.drill.exec.test.generated.HashJoinProbeGen197.projectBuildRecord():356
      org.apache.drill.exec.test.generated.HashJoinProbeGen197.executeProbePhase():173
      org.apache.drill.exec.test.generated.HashJoinProbeGen197.probeAndProject():223
      org.apache.drill.exec.physical.impl.join.HashJoinBatch.innerNext():233
      org.apache.drill.exec.record.AbstractRecordBatch.next():162
      org.apache.drill.exec.record.AbstractRecordBatch.next():119
      org.apache.drill.exec.record.AbstractRecordBatch.next():109
      org.apache.drill.exec.record.AbstractSingleRecordBatch.innerNext():51
      org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():129
      org.apache.drill.exec.record.AbstractRecordBatch.next():162
      org.apache.drill.exec.record.AbstractRecordBatch.next():119
      org.apache.drill.exec.record.AbstractRecordBatch.next():109
      org.apache.drill.exec.record.AbstractSingleRecordBatch.innerNext():51
      org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():129
      org.apache.drill.exec.record.AbstractRecordBatch.next():162
      org.apache.drill.exec.physical.impl.BaseRootExec.next():104
      org.apache.drill.exec.physical.impl.SingleSenderCreator$SingleSenderRootExec.innerNext():92
      org.apache.drill.exec.physical.impl.BaseRootExec.next():94
      org.apache.drill.exec.work.fragment.FragmentExecutor$1.run():257
      org.apache.drill.exec.work.fragment.FragmentExecutor$1.run():251
      java.security.AccessController.doPrivileged():-2
      javax.security.auth.Subject.doAs():422
      org.apache.hadoop.security.UserGroupInformation.doAs():1657
      org.apache.drill.exec.work.fragment.FragmentExecutor.run():251
      org.apache.drill.common.SelfCleaningRunnable.run():38
      java.util.concurrent.ThreadPoolExecutor.runWorker():1142
      java.util.concurrent.ThreadPoolExecutor$Worker.run():617
      java.lang.Thread.run():745 (state=,code=0)
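
      The stack trace suggests that the ListVector's per-record bookkeeping buffer (the UInt1Vector touched by setNotNull/setSafe) keeps doubling as the generated HashJoinProbe copies records into the outgoing batch, until the allocator's hard buffer cap is hit. The sketch below only illustrates that doubling-until-cap pattern; it is not Drill's actual vector code, and the cap value, initial size, class name, and record count are assumptions for illustration.

      // Illustrative sketch only; not Drill's UInt1Vector implementation.
      // It mimics a power-of-two reallocation strategy with a hard upper
      // bound: each reAlloc() doubles the buffer, and once the next doubling
      // would exceed the cap, expansion fails the way the error above does.
      public class ReallocSketch {

        // Assumed hard limit, standing in for the "max allowed buffer size"
        // named in the OversizedAllocationException message.
        private static final long MAX_BUFFER_SIZE = 1L << 30;

        private long capacityBytes = 4096; // assumed initial allocation

        /** Double the buffer, failing once the cap would be exceeded. */
        void reAlloc() {
          long newCapacity = capacityBytes * 2;
          if (newCapacity > MAX_BUFFER_SIZE) {
            throw new IllegalStateException(
                "Unable to expand the buffer. Max allowed buffer size is reached.");
          }
          capacityBytes = newCapacity;
        }

        public static void main(String[] args) {
          ReallocSketch bits = new ReallocSketch();
          // One byte of bookkeeping per copied record: with enough records
          // accumulated in a single outgoing batch, the buffer doubles
          // repeatedly and finally fails.
          long records = MAX_BUFFER_SIZE + 1;
          for (long i = 0; i < records; i++) {
            if (i >= bits.capacityBytes) {
              bits.reAlloc(); // eventually throws, as in the stack trace above
            }
          }
        }
      }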

          People

            Assignee: MinJi Kim
            Reporter: MinJi Kim