Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 2.0.0
Description
I was testing HOP pipelines in a local environment with Kubernetes. I generated a HOP fat jar and deployed it on our flink-kubernetes-operator. As a schema registry we are using apicurio-registry-sql (https://github.com/Apicurio/apicurio-registry), which is fully compatible with the Confluent Schema Registry API.
The task manager threw an exception while using Beam Kafka Consume to read Avro messages from Kafka via the schema registry.
Please see the attached example-kafka-avro-consume.hpl. I ran it with the real IP of apicurio-registry and verified that the URL worked.
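For context, the pipeline does roughly the equivalent of the following plain Kafka consumer. This is only a sketch, not what HOP/Beam actually generates: the broker, topic, group id and the Apicurio ccompat path are placeholders, and the property names are the standard Confluent client ones. The Confluent deserializer relies on AbstractKafkaSchemaSerDe, the class that is missing from the fat jar in the stack trace below.

import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class AvroRegistryConsumerSketch {
  public static void main(String[] args) {
    Properties props = new Properties();
    props.put("bootstrap.servers", "kafka:9092");                  // placeholder broker
    props.put("group.id", "hop-avro-test");                        // placeholder group id
    props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    // The Confluent Avro deserializer is what pulls in AbstractKafkaSchemaSerDe at runtime.
    props.put("value.deserializer", "io.confluent.kafka.serializers.KafkaAvroDeserializer");
    // Apicurio exposes its Confluent-compatible API under the ccompat endpoint (path is a placeholder).
    props.put("schema.registry.url", "http://apicurio-registry:8080/apis/ccompat/v6");

    try (KafkaConsumer<String, Object> consumer = new KafkaConsumer<>(props)) {
      consumer.subscribe(List.of("avro-topic"));                    // placeholder topic
      ConsumerRecords<String, Object> records = consumer.poll(Duration.ofSeconds(5));
      records.forEach(r -> System.out.println(r.value()));
    }
  }
}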
The exception that I got in the task manager:
Caused by: java.lang.ClassNotFoundException: io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe
    at java.base/java.net.URLClassLoader.findClass(Unknown Source)
    at java.base/java.lang.ClassLoader.loadClass(Unknown Source)
    at org.apache.flink.util.FlinkUserCodeClassLoader.loadClassWithoutExceptionHandling(FlinkUserCodeClassLoader.java:64)
    at org.apache.flink.util.ChildFirstClassLoader.loadClassWithoutExceptionHandling(ChildFirstClassLoader.java:74)
    at org.apache.flink.util.FlinkUserCodeClassLoader.loadClass(FlinkUserCodeClassLoader.java:48)
    at java.base/java.lang.ClassLoader.loadClass(Unknown Source)
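The missing class suggests the Confluent schema-registry serde classes are not bundled into the generated fat jar. A quick way to verify this (a sketch; the jar name is a placeholder) is to run a small classpath check against the fat jar:

// Compile this class and run it with the generated fat jar on the classpath, e.g.
//   java -cp hop-fatjar.jar:. CheckConfluentSerde     (hop-fatjar.jar is a placeholder name)
// If the Confluent schema-registry classes were bundled, this prints the class name;
// otherwise it fails with the same ClassNotFoundException as the task manager.
public class CheckConfluentSerde {
  public static void main(String[] args) throws Exception {
    Class<?> c = Class.forName("io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe");
    System.out.println("Found " + c.getName());
  }
}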
Attachments
Issue Links
- relates to: HOP-4112 Beam: exception writing Kafka Avro messages with Schema registry (Closed)