Details
- Type: Improvement
- Status: Closed
- Priority: Major
- Resolution: Won't Fix
- Affects Version/s: 2.1.0
- Fix Version/s: None
- Component/s: None
Description
I want to be able to integrate non-Hive catalogs into Spark. The API already seems close to supporting this; this issue proposes opening up the existing interfaces for external implementation.
For ExternalCatalog there is already an abstract class, and the SessionState javadoc suggests it should remain a Scala class to avoid initialization-order problems.
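The proposal above can be sketched as a plugin pattern: resolve a catalog implementation by class name, the way `spark.sql.catalogImplementation` could if it accepted arbitrary class names instead of only `hive` and `in-memory`. This is a hypothetical illustration, not Spark's actual API; `CatalogPlugin`, `InMemoryCatalogPlugin`, and `CatalogLoaderDemo` are invented names.

```java
import java.util.List;
import java.util.Set;
import java.util.TreeSet;

// Hypothetical stand-in for the ExternalCatalog surface the issue wants opened up.
interface CatalogPlugin {
    List<String> listDatabases();
    boolean databaseExists(String db);
}

// An external implementation a user could ship on the classpath.
class InMemoryCatalogPlugin implements CatalogPlugin {
    private final Set<String> dbs = new TreeSet<>(Set.of("default"));
    public List<String> listDatabases() { return List.copyOf(dbs); }
    public boolean databaseExists(String db) { return dbs.contains(db); }
}

public class CatalogLoaderDemo {
    // Reflectively instantiate whatever class name the configuration supplies,
    // rather than restricting it to a fixed enumeration of implementations.
    static CatalogPlugin load(String className) throws Exception {
        return (CatalogPlugin) Class.forName(className)
                .getDeclaredConstructor()
                .newInstance();
    }

    public static void main(String[] args) throws Exception {
        CatalogPlugin catalog = load("InMemoryCatalogPlugin");
        System.out.println(catalog.listDatabases());
        System.out.println(catalog.databaseExists("prod"));
    }
}
```

The key point is that the loader depends only on the interface, so any implementation with a no-arg constructor can be swapped in via configuration.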
Issue Links
- duplicates
  - SPARK-17767 Spark SQL ExternalCatalog API custom implementation support (Closed)
- is duplicated by
  - SPARK-30617 Is there any possible that spark no longer restrict enumerate types of spark.sql.catalogImplementation (Closed)