Details
- Type: Improvement
- Status: Resolved
- Priority: Major
- Resolution: Won't Fix
- Affects Version/s: 1.2.0
- Fix Version/s: None
- Component/s: None
Description
Spark supports dynamically adding a jar to the executor classpath through SparkContext::addJar(), but it does not support dynamically adding a jar to the driver classpath. In most cases (if not all), a user dynamically adds a jar with SparkContext::addJar() because classes from that jar will be referenced in an upcoming Spark job, which means those classes also need to be loadable on the driver side, e.g. during serialization. I think it makes sense to add an API that adds a jar to the driver classpath, or simply to make SparkContext::addJar() cover the driver as well. HIVE-9410 is a real case of this from Hive on Spark. A minimal sketch of the gap is shown below.
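A minimal Scala sketch of the behavior described above; the jar path and the class name com.example.MyUdf are hypothetical and used only for illustration:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object AddJarDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("addJar-demo"))

    // addJar() ships the jar to executors, so tasks can load its classes.
    sc.addJar("/tmp/my-udfs.jar")

    // The driver's own classloader is not updated, so resolving the class
    // here (e.g. while building or serializing a closure on the driver) can
    // throw ClassNotFoundException unless the jar was already on the driver
    // classpath at launch, e.g. via --driver-class-path or
    // spark.driver.extraClassPath.
    val cls = Class.forName("com.example.MyUdf") // may fail on the driver

    sc.stop()
  }
}
```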
Attachments
Issue Links
- is depended upon by:
  - SPARK-3145 Hive on Spark umbrella (Resolved)