Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Fixed
- Affects Version: 2.0.0
- Fix Version: None
Description
I was testing HOP pipelines in a local Kubernetes environment. I generated a HOP fat jar and deployed it on our flink-kubernetes-operator. As a schema registry we are using apicurio-registry-sql (https://github.com/Apicurio/apicurio-registry), which is fully compatible with the Confluent Schema Registry API.
There was an exception in the job manager while using the Beam Kafka Produce transform to write Avro messages to Kafka with a schema registry.
Please see the attached example-kafka-avro-produce.hpl.
The exception that I got in the job manager:
Caused by: java.lang.NullPointerException
	at org.apache.beam.sdk.coders.AvroCoder$AvroDeterminismChecker.checkIndexedRecord(AvroCoder.java:629)
	at org.apache.beam.sdk.coders.AvroCoder$AvroDeterminismChecker.recurse(AvroCoder.java:497)
	at org.apache.beam.sdk.coders.AvroCoder$AvroDeterminismChecker.check(AvroCoder.java:476)
	at org.apache.beam.sdk.coders.AvroCoder.<init>(AvroCoder.java:316)
	at org.apache.beam.sdk.coders.AvroCoder.<init>(AvroCoder.java:308)
	at org.apache.beam.sdk.coders.AvroGenericCoder.<init>(AvroGenericCoder.java:26)
	at org.apache.beam.sdk.coders.AvroGenericCoder.of(AvroGenericCoder.java:30)
	at org.apache.beam.sdk.coders.AvroCoder.of(AvroCoder.java:151)
	at org.apache.hop.beam.transforms.kafka.BeamProduceMeta.handleTransform(BeamProduceMeta.java:149)
	at org.apache.hop.beam.pipeline.HopPipelineMetaToBeamPipelineConverter.handleBeamOutputTransforms(HopPipelineMetaToBeamPipelineConverter.java:344)
	at org.apache.hop.beam.pipeline.HopPipelineMetaToBeamPipelineConverter.createPipeline(HopPipelineMetaToBeamPipelineConverter.java:219)
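For context on where this fails, below is a minimal, self-contained sketch (the class name and record schema are hypothetical, not part of Hop) of the call the stack trace points to: BeamProduceMeta.handleTransform building an AvroGenericCoder via AvroCoder.of(schema). With a fully resolved schema this call succeeds; my assumption is that in this pipeline the schema is null or only partially resolved because it lives in the schema registry, so the determinism checker that AvroCoder runs in its constructor dereferences it and throws the NullPointerException above.

import org.apache.avro.Schema;
import org.apache.beam.sdk.coders.AvroCoder;
import org.apache.beam.sdk.coders.AvroGenericCoder;

public class AvroCoderSketch { // hypothetical class name, for illustration only
  public static void main(String[] args) {
    // Hypothetical record schema; in the real pipeline the schema for the
    // topic is registered in apicurio-registry.
    String schemaJson =
        "{\"type\":\"record\",\"name\":\"Example\",\"fields\":["
            + "{\"name\":\"id\",\"type\":\"long\"},"
            + "{\"name\":\"value\",\"type\":\"string\"}]}";
    Schema schema = new Schema.Parser().parse(schemaJson);

    // The same call BeamProduceMeta.handleTransform makes (AvroCoder.of at
    // line 149 in the trace). The coder checks schema determinism eagerly in
    // its constructor, which is where the NPE surfaces when the schema is
    // missing or not fully resolved.
    AvroGenericCoder coder = AvroCoder.of(schema);
    System.out.println(coder);
  }
}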
Attachments
Issue Links
- is related to: HOP-4111 Beam: exception reading Kafka Avro with Schema registry (Closed)