Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Duplicate
- Affects Version/s: 1.2.0
- Fix Version/s: None
- Component/s: None
Description
We disabled publishing of certain modules in SPARK-3452; one such module is spark-yarn. This breaks applications that submit Spark jobs programmatically with the master set to yarn-client, because SparkContext depends on classes from the spark-yarn module to submit the YARN application.
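For illustration, here is a minimal sketch of the kind of programmatic submission that breaks; the class and application names are hypothetical, not taken from this report:

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical driver that creates a SparkContext against YARN directly,
// instead of going through spark-submit.
object Server {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("yarn-client") // needs spark-yarn classes on the driver classpath
      .setAppName("my-app")
    // SparkContext initialization reaches SparkHadoopUtil, which reflectively
    // loads org.apache.spark.deploy.yarn.YarnSparkHadoopUtil; without the
    // spark-yarn artifact this fails with the ClassNotFoundException below.
    val sc = new SparkContext(conf)
    sc.stop()
  }
}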
Here is the stack trace you get if you submit a Spark job without the spark-yarn dependency:
2015-01-07 14:39:22,799 [pool-10-thread-13] [info] o.a.s.s.MemoryStore - MemoryStore started with capacity 731.7 MB
Exception in thread "pool-10-thread-13" java.lang.ExceptionInInitializerError
at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1784)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:105)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:180)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:292)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:232)
at com.myimpl.Server:23)
at scala.util.Success$$anonfun$map$1.apply(Try.scala:236)
at scala.util.Try$.apply(Try.scala:191)
at scala.util.Success.map(Try.scala:236)
at com.myimpl.FutureTry$$anonfun$1.apply(FutureTry.scala:23)
at com.myimpl.FutureTry$$anonfun$1.apply(FutureTry.scala:23)
at scala.util.Success$$anonfun$map$1.apply(Try.scala:236)
at scala.util.Try$.apply(Try.scala:191)
at scala.util.Success.map(Try.scala:236)
at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.spark.SparkException: Unable to load YARN support
at org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:199)
at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:194)
at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
... 27 more
Caused by: java.lang.ClassNotFoundException: org.apache.spark.deploy.yarn.YarnSparkHadoopUtil
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:190)
at org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:195)
... 29 more
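Until the artifacts are published (SPARK-5289 backports publishing of the repl and yarn modules into branch-1.2), the workaround is to put the spark-yarn classes on the application's classpath yourself. A minimal build.sbt sketch, assuming a release for which spark-yarn is actually published (the 1.2.1 version number here is an assumption):

// build.sbt -- illustrative only; spark-yarn is not published for 1.2.0,
// which is exactly what this issue reports.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.2.1",
  // Provides org.apache.spark.deploy.yarn.YarnSparkHadoopUtil
  "org.apache.spark" %% "spark-yarn" % "1.2.1"
)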
Issue Links
- duplicates SPARK-5289: Backport publishing of repl, yarn into branch-1.2 (Resolved)
- is broken by SPARK-3452: Maven build should skip publishing artifacts people shouldn't depend on (Resolved)