Details
Type: Bug
Status: Open
Priority: Major
Resolution: Unresolved
Description
After following the dependency management documentation exactly to automatically add external libraries to the Spark cluster, the JARs are loaded into the Spark driver and are visible in local-repo, but they are not distributed to the executors after the job is submitted.
The job then fails in the SparkContext with java.lang.ClassNotFoundException.
The same setup works in Zeppelin 0.8.0 without any problems.
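As a possible workaround (not verified against this report), the dependency can be passed to Spark directly so that it is shipped to the executors, instead of relying on Zeppelin's dependency loader. This is only a sketch: the JAR path, Maven coordinates, and class name below are placeholders, not values from this report.
{code:scala}
// Sketch of a workaround, assuming the missing class lives in a JAR under
// local-repo; all paths, coordinates, and class names are placeholders.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("zeppelin-dependency-check")
  // spark.jars ships the listed JARs to every executor's classpath
  .config("spark.jars", "/path/to/local-repo/my-library.jar")
  // alternatively, spark.jars.packages resolves Maven coordinates at startup
  .config("spark.jars.packages", "com.example:my-library:1.0.0")
  .getOrCreate()

// If the library is still missing on an executor, this map fails with the
// java.lang.ClassNotFoundException described above.
spark.sparkContext.parallelize(1 to 4).map { i =>
  Class.forName("com.example.MyClass") // placeholder class name
  i
}.collect()
{code}
In Zeppelin, the same spark.* properties can also be set in the Spark interpreter settings, which passes them through to the underlying SparkConf.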