Apache Avro / AVRO-2198

BigDecimal (logical type=Decimal / type = bytes) GenericRecord to SpecificRecord Conversion Issue


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 1.8.2
    • Fix Version/s: 1.8.2
    • Component/s: java
    • Labels: None
    • Important

    Description

      There seems to be an issue with the conversion from a byte array to a BigDecimal when converting a GenericRecord to a SpecificRecord.

      Below is a simple Avro definition with "amount" defined as logical type decimal and type bytes. The avroTrans specific class has been generated with enableDecimalLogicalType = true.

      See the example below: an amount value of 20000000.11 is converted to a BigDecimal value of 606738534879530359915932.65.

      
      import java.io.ByteArrayOutputStream;

      import org.apache.avro.Schema;
      import org.apache.avro.generic.GenericDatumWriter;
      import org.apache.avro.generic.GenericRecord;
      import org.apache.avro.io.Decoder;
      import org.apache.avro.io.DecoderFactory;
      import org.apache.avro.io.Encoder;
      import org.apache.avro.io.EncoderFactory;
      import org.apache.avro.specific.SpecificDatumReader;

      // avroTrans is the SpecificRecord class generated from the schema below;
      // Utils.jsonToAvro is a local helper that decodes a JSON string into a GenericRecord.
      String schema = "{\"type\":\"record\",\"name\":\"avroTrans\",\"namespace\":\"com.demo.KafkaStream\",\"fields\":[{\"name\":\"amount\",\"type\":{\"type\":\"bytes\",\"logicalType\":\"decimal\",\"precision\":5,\"scale\":2}}]}";
      String json = "{\"amount\": \"20000000.11\"}";

      Schema avroSchema = new Schema.Parser().parse(schema);
      GenericRecord obj = Utils.jsonToAvro(json, avroSchema);
      System.out.println("GenericRecord Object Value ->" + obj);

      // Serialize the GenericRecord to Avro binary.
      GenericDatumWriter<GenericRecord> writer = new GenericDatumWriter<GenericRecord>(avroTrans.getClassSchema());
      ByteArrayOutputStream out = new ByteArrayOutputStream();
      Encoder encoder = EncoderFactory.get().binaryEncoder(out, null);
      writer.write(obj, encoder);
      encoder.flush();
      byte[] avroData = out.toByteArray();
      out.close();

      // Deserialize the same bytes into the generated SpecificRecord class.
      SpecificDatumReader<avroTrans> reader2 = new SpecificDatumReader<avroTrans>(avroTrans.class);
      Decoder decoder2 = DecoderFactory.get().binaryDecoder(avroData, null);
      avroTrans customRecord = reader2.read(null, decoder2);
      System.out.println("SpecificRecord Object Value -> " + customRecord);
      

      Output:

      GenericRecord Object Value ->{"amount": {"bytes": "20000000.11"}}
      SpecificRecord Object Value -> {"amount": 606738534879530359915932.65}
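
      The GenericRecord output above shows that the "bytes" field literally contains the text "20000000.11". For comparison, here is a minimal, self-contained sketch (not from the original report; the class name is just for illustration) of how the same value looks when it is encoded through Avro's Conversions.DecimalConversion, which writes the two's-complement bytes of the unscaled value. The sketch declares precision 10, since 20000000.11 has ten significant digits and does not fit the precision of 5 declared above.

      import java.math.BigDecimal;
      import java.nio.ByteBuffer;

      import org.apache.avro.Conversions;
      import org.apache.avro.LogicalType;
      import org.apache.avro.LogicalTypes;
      import org.apache.avro.Schema;

      public class DecimalToBytesSketch {
        public static void main(String[] args) {
          // Same "amount" field as above, but declared with precision 10.
          Schema amountSchema = new Schema.Parser().parse(
              "{\"type\":\"bytes\",\"logicalType\":\"decimal\",\"precision\":10,\"scale\":2}");
          LogicalType decimalType = LogicalTypes.fromSchema(amountSchema);

          // DecimalConversion writes the two's-complement bytes of the unscaled value
          // (2000000011), not the characters of the string "20000000.11".
          Conversions.DecimalConversion conversion = new Conversions.DecimalConversion();
          ByteBuffer encoded = conversion.toBytes(new BigDecimal("20000000.11"), amountSchema, decimalType);

          // Reading those bytes back through fromBytes recovers the original value.
          BigDecimal decoded = conversion.fromBytes(encoded, amountSchema, decimalType);
          System.out.println(decoded); // prints 20000000.11
        }
      }

      A record whose amount bytes are produced this way round-trips cleanly; the problem case above appears to carry the raw string bytes into the decoding step discussed next.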

      Within org.apache.avro.Conversions.DecimalConversion there is a fromBytes conversion method which takes a ByteBuffer as input (see below).

       

          @Override
          public BigDecimal fromBytes(ByteBuffer value, Schema schema, LogicalType type) {
            int scale = ((LogicalTypes.Decimal) type).getScale();
            // always copy the bytes out because BigInteger has no offset/length ctor
            byte[] bytes = new byte[value.remaining()];
            value.get(bytes);
            return new BigDecimal(new BigInteger(bytes), scale);
          }
      

      The BigInteger(byte[]) constructor used here "translates a byte array containing the two's-complement binary representation of a BigInteger into a BigInteger" (per its Javadoc).

      Could the use of BigInteger(byte[]) be causing the incorrect conversion to this huge number?
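
      One way to check that (a minimal sketch, not part of the original report; the class name is just for illustration): if the bytes reaching fromBytes are simply the characters of the JSON string "20000000.11", then applying the same two's-complement interpretation with scale 2 should reproduce the reported value exactly.

      import java.math.BigDecimal;
      import java.math.BigInteger;
      import java.nio.charset.StandardCharsets;

      public class AsciiBytesAsDecimal {
        public static void main(String[] args) {
          // Bytes of the literal JSON string, which is what the GenericRecord output
          // above shows inside the "bytes" field.
          byte[] ascii = "20000000.11".getBytes(StandardCharsets.US_ASCII);

          // Same operation as fromBytes: treat the bytes as a two's-complement
          // unscaled value and apply the schema's scale of 2.
          BigDecimal decoded = new BigDecimal(new BigInteger(ascii), 2);

          System.out.println(decoded); // prints 606738534879530359915932.65
        }
      }

      This matches the reported 606738534879530359915932.65 digit for digit, which suggests the bytes handed to fromBytes are the raw string characters rather than a two's-complement encoding of the unscaled value 2000000011.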


          People

            Assignee: Unassigned
            Reporter: ABourg
            Votes: 0
            Watchers: 1
