Details
- Type: Improvement
- Status: Open
- Priority: Minor
- Resolution: Unresolved
- Affects Version/s: 1.9.1
Description
While doing the release check of release-1.9.1-rc1, a ClassNotFoundException showed up when going through the WordCount example in the Local Setup Tutorial.
You can find the exception in the client log file `flink-xxx-client-MacBook-Pro-2.local.log`:
```
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/exceptions/YarnException
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:264)
	at org.apache.flink.client.cli.CliFrontend.loadCustomCommandLine(CliFrontend.java:1187)
	at org.apache.flink.client.cli.CliFrontend.loadCustomCommandLines(CliFrontend.java:1147)
	at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1072)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.yarn.exceptions.YarnException
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 5 more
```
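For context on how the two exceptions in the trace relate, the sketch below (an illustration, not Flink code; the YARN session CLI class name is the one `CliFrontend` loads reflectively) shows the difference: `ClassNotFoundException` means the requested class itself is absent, while `NoClassDefFoundError` means the class was found but one of the classes it references, here Hadoop's `YarnException`, is missing at link time.

```java
// Illustration only (not Flink code): why the log shows a NoClassDefFoundError
// caused by a ClassNotFoundException when loading the YARN session CLI.
public class YarnCliLoadingSketch {
    public static void main(String[] args) {
        try {
            // Triggers loading and linking of the YARN session CLI class.
            Class.forName("org.apache.flink.yarn.cli.FlinkYarnSessionCli");
        } catch (ClassNotFoundException e) {
            // Raised when the requested class itself is not on the classpath.
            System.err.println("YARN CLI class not found: " + e.getMessage());
        } catch (NoClassDefFoundError e) {
            // Raised when the class is found but a class it references, such as
            // org.apache.hadoop.yarn.exceptions.YarnException, cannot be resolved,
            // which is exactly the situation without a bundled Hadoop.
            System.err.println("Missing Hadoop dependency while linking: " + e.getMessage());
        }
    }
}
```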
We know that Hadoop is no longer pre-bundled with Flink, but it would be nice to avoid this exception so that it does not confuse new users when they run programs on a local cluster.
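One possible way to address this, shown as a minimal sketch below rather than a patch against the real `CliFrontend` (the method name and fallback behaviour are assumptions for illustration), is to treat the YARN command line as optional: if its Hadoop dependencies are missing, log a single informational line and continue with the default command line instead of surfacing a stack trace.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.logging.Logger;

// Minimal sketch of the suggested improvement, not the actual Flink implementation:
// swallow the classloading failure for the optional YARN CLI and log a short hint,
// so users of a local cluster never see the stack trace.
public class OptionalYarnCliLoader {

    private static final Logger LOG = Logger.getLogger(OptionalYarnCliLoader.class.getName());

    /** Returns the command-line classes that could actually be loaded. */
    public static List<Class<?>> loadCustomCommandLines() {
        List<Class<?>> commandLines = new ArrayList<>();
        try {
            // Optional: only available when flink-yarn and Hadoop are on the classpath.
            commandLines.add(Class.forName("org.apache.flink.yarn.cli.FlinkYarnSessionCli"));
        } catch (ClassNotFoundException | NoClassDefFoundError e) {
            // Hadoop/YARN is no longer bundled, so this is an expected situation for
            // local setups: log one line instead of dumping the stack trace.
            LOG.info("YARN command line could not be loaded (Hadoop is not on the "
                    + "classpath); continuing with the default command line only.");
        }
        // The default (non-YARN) command line would be added here.
        return commandLines;
    }
}
```

Catching `NoClassDefFoundError` alongside `ClassNotFoundException` is the key point of the sketch: since the YARN CLI is optional, a missing transitive Hadoop class should degrade gracefully rather than end up as an exception in the client log.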