Details
- Type: Improvement
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Fix Version/s: connectors-6.0.0
- Component/s: None
- Labels: None
Description
We need a standardized way to set configurations for the Spark driver and workers when reading and writing data.
Quoting from my offline discussion with twdsilva@gmail.com:
"if we use phoenixTableAsRDD to create the rdd then we can pass in a config
if we use the standard df.write.format() we can't pass in a conf
maybe the best was to do this is to put config in the option map when we call read/write .format and then set those in the config"
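For context, a minimal sketch (in Scala) of the two code paths discussed above, under the assumption that the phoenix-spark connector API is used: phoenixTableAsRDD accepts a Hadoop Configuration directly, while the DataFrame read/write path only exposes the option map, which is where this issue proposes carrying such settings. The option keys ("table", "zkUrl") follow the older connector, and the pass-through property shown is illustrative only; exact names and signatures may differ across connector versions.

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.spark.sql.{SaveMode, SparkSession}
import org.apache.phoenix.spark._ // adds phoenixTableAsRDD to SparkContext

object PhoenixConfigSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("phoenix-config-sketch")
      .getOrCreate()

    // RDD path: phoenixTableAsRDD takes a Hadoop Configuration,
    // so driver/worker settings can be passed in here.
    val conf = new Configuration()
    conf.set("hbase.zookeeper.quorum", "zk-host:2181")
    val rdd = spark.sparkContext.phoenixTableAsRDD(
      "MY_TABLE", Seq("ID", "COL1"), conf = conf)

    // DataFrame path: read/write.format("phoenix") has no Configuration
    // parameter. The proposal is to carry such settings through the option
    // map and have the connector copy them into its internal Configuration.
    val df = spark.read.format("phoenix")
      .option("table", "MY_TABLE")
      .option("zkUrl", "zk-host:2181")
      // illustrative pass-through setting, as proposed in this issue
      .option("phoenix.upsert.batch.size", "1000")
      .load()

    df.write.format("phoenix")
      .option("table", "MY_TABLE_COPY")
      .option("zkUrl", "zk-host:2181")
      .mode(SaveMode.Overwrite)
      .save()
  }
}
```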
Issue Links
- causes
  - PHOENIX-6566 shaded phoenix connectors include restrictive log4j config files (Resolved)
- links to