Details
Type: Improvement
Status: Open
Priority: Blocker
Resolution: Unresolved
Description
Create a common SchemaProvider and RecordPayloads for Spark, Flink, etc.
- Currently, the class org.apache.hudi.utilities.schema.SchemaProvider takes a JavaSparkContext as input and is therefore specific to the Spark engine, so a separate SchemaProvider was created for Flink. For Kafka Connect we can use neither, since it is neither Spark nor Flink. Implement a common class that uses HoodieEngineContext instead (a rough sketch follows below).
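To make the idea concrete, here is a minimal sketch of what an engine-agnostic provider could look like, depending only on HoodieEngineContext and TypedProperties rather than JavaSparkContext. The class name CommonSchemaProvider, its package, and the exact import paths are assumptions for illustration (they may differ across Hudi versions); this is not the existing Hudi API.

```java
// Hypothetical sketch only: an engine-agnostic schema provider.
// Class/package names are illustrative, not the actual Hudi classes.
package org.apache.hudi.schema;

import java.io.Serializable;

import org.apache.avro.Schema;
import org.apache.hudi.common.config.TypedProperties;
import org.apache.hudi.common.engine.HoodieEngineContext;

public abstract class CommonSchemaProvider implements Serializable {

  // Provider configuration (e.g. schema registry URL or schema file path).
  protected final TypedProperties config;

  // Engine-agnostic context, so the same provider can back Spark DeltaStreamer,
  // the Flink pipeline, or the Kafka Connect sink.
  protected final HoodieEngineContext context;

  protected CommonSchemaProvider(TypedProperties config, HoodieEngineContext context) {
    this.config = config;
    this.context = context;
  }

  // Schema of the incoming source records.
  public abstract Schema getSourceSchema();

  // Schema of records written to the Hudi table; defaults to the source schema.
  public Schema getTargetSchema() {
    return getSourceSchema();
  }
}
```

Engine-specific subclasses (or thin adapters around the existing Spark and Flink providers) could then supply the concrete schema lookup, while Kafka Connect instantiates the common base directly with a local HoodieEngineContext.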