Spark / SPARK-39060

Typo in error messages of decimal overflow


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.2.1
    • Fix Version/s: 3.1.3, 3.0.4, 3.3.0, 3.2.2, 3.4.0
    • Component/s: SQL
    • Labels: None

    Description

         org.apache.spark.SparkArithmeticException

         Decimal(expanded,10000000000000000000000000000000000000.1,39,1}) cannot be represented as Decimal(38, 1). If necessary set spark.sql.ansi.enabled to false to bypass this error.

      As shown in decimalArithmeticOperations.sql.out

      Note the stray "}" before the closing parenthesis, just ahead of "cannot".
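As a rough illustration (not the actual Spark source), a typo like this typically comes from a stray brace left inside a format or interpolation string. The sketch below is hypothetical Java; the method names and the use of String.format are illustrative only:

```java
// Hypothetical sketch of how a stray '}' in a format string produces
// the malformed "Decimal(...,39,1})" message reported in this issue.
// Not the actual Spark implementation.
public class DecimalDebugString {
    // Buggy variant: note the stray '}' before the closing parenthesis.
    static String buggy(String value, int precision, int scale) {
        return String.format("Decimal(expanded,%s,%d,%d})", value, precision, scale);
    }

    // Fixed variant: the stray '}' removed.
    static String fixed(String value, int precision, int scale) {
        return String.format("Decimal(expanded,%s,%d,%d)", value, precision, scale);
    }

    public static void main(String[] args) {
        String v = "10000000000000000000000000000000000000.1";
        System.out.println(buggy(v, 39, 1)); // ends in ",39,1})" — the reported typo
        System.out.println(fixed(v, 39, 1)); // ends in ",39,1)"
    }
}
```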



          People

            Assignee: Vitalii Li (vli-databricks)
            Reporter: Vitalii Li (vli-databricks)
            Votes: 0
            Watchers: 4
