Hadoop Map/Reduce / MAPREDUCE-7446

NegativeArraySizeException when running MR jobs with large data size


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.4.0
    • Fix Version/s: 3.4.0
    • Component/s: mrv1
    • Hadoop Flags: Reviewed

    Description

      IFile's nextRawValue method doubles its backing byte array with a bit shift when storing value bytes. With a large dataset, the shift can easily carry into the sign bit while the new array size is being computed, producing a negative size and causing a NegativeArraySizeException.

      It would be safer to grow the backing array by a factor of 1.5, and to check that the new size does not exceed Integer's max value, as sketched below.
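      A minimal sketch of the two growth policies, assuming a standalone example rather than the actual IFile code (the class name BufferGrowthSketch, the method names, and the MAX_BUFFER_SIZE cap are hypothetical): doubling with a left shift overflows into the sign bit once the current length passes 2^30, while a 1.5x growth computed in a long and capped just below Integer.MAX_VALUE stays positive.

{code:java}
// Illustrative sketch only -- not the actual IFile implementation.
public class BufferGrowthSketch {

  // Doubling via a left shift: once the current length exceeds 2^30,
  // length << 1 overflows into the sign bit and becomes negative,
  // so "new byte[newLength]" would throw NegativeArraySizeException.
  static int doubleByShift(int length) {
    return length << 1;
  }

  // Hypothetical cap just below Integer.MAX_VALUE; most JVMs refuse to
  // allocate arrays of exactly Integer.MAX_VALUE elements.
  static final int MAX_BUFFER_SIZE = Integer.MAX_VALUE - 8;

  // Safer policy: grow by ~1.5x, compute the new size in a long to avoid
  // int overflow, and never exceed the cap.
  static int growByOnePointFive(int length, int required) {
    long newSize = (long) length + (length >> 1);
    if (newSize < required) {
      newSize = required;
    }
    if (newSize > MAX_BUFFER_SIZE) {
      newSize = MAX_BUFFER_SIZE;
    }
    return (int) newSize;
  }

  public static void main(String[] args) {
    int big = (1 << 30) + 1;                           // just over 1 GiB of bytes
    System.out.println(doubleByShift(big));            // negative: the reported failure mode
    System.out.println(growByOnePointFive(big, big));  // positive, capped below Integer.MAX_VALUE
  }
}
{code}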

Attachments

Activity

People

    Assignee: Peter Szucs (pszucs)
    Reporter: Peter Szucs (pszucs)
    Votes: 0
    Watchers: 4

Dates

    Created:
    Updated:
    Resolved: