Description
I hit a configuration error with two ways of setting things up: the first works, but the second fails.
1. val hConf = HBaseConfiguration.create()
   // ignore some config
   val hBaseRDD = sc.newAPIHadoopRDD(hConf,
     classOf[TableSnapshotInputFormat],
     classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable],
     classOf[org.apache.hadoop.hbase.client.Result])

2. val hConf = HBaseConfiguration.create()
   val job = Job.getInstance(hConf)
   val hBaseRDD = sc.newAPIHadoopRDD(job.getConfiguration,
     classOf[TableSnapshotInputFormat],
     classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable],
     classOf[org.apache.hadoop.hbase.client.Result])

And the log:

Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.codec.prefixtree.PrefixTreeCodec
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at org.apache.hadoop.hbase.io.encoding.DataBlockEncoding.createEncoder(DataBlockEncoding.java:186)
    ... 21 more
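For completeness, here is a minimal, self-contained sketch of how I wire up the second (Job-based) setup end to end. The snapshot name "mySnapshot", the restore directory /tmp/snapshot_restore, and the Spark app name are placeholders of mine, not values from the snippets above; TableSnapshotInputFormat.setInput is the method from org.apache.hadoop.hbase.mapreduce that tells the input format which snapshot to read.

import org.apache.hadoop.fs.Path
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableSnapshotInputFormat
import org.apache.hadoop.mapreduce.Job
import org.apache.spark.{SparkConf, SparkContext}

// Build the HBase configuration and wrap it in a MapReduce Job so that
// TableSnapshotInputFormat can store its snapshot settings in it.
val hConf = HBaseConfiguration.create()
val job = Job.getInstance(hConf)

// Point the input format at the snapshot; the restore directory must live on
// the same filesystem as the HBase root dir. Both values here are placeholders.
TableSnapshotInputFormat.setInput(job, "mySnapshot", new Path("/tmp/snapshot_restore"))

// Placeholder SparkContext; in my job this is the existing sc from the snippets above.
val sc = new SparkContext(new SparkConf().setAppName("snapshot-read"))

// Read the snapshot as an RDD of (row key, Result) pairs.
val hBaseRDD = sc.newAPIHadoopRDD(
  job.getConfiguration,
  classOf[TableSnapshotInputFormat],
  classOf[ImmutableBytesWritable],
  classOf[Result])

println(hBaseRDD.count())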
I'm sure that I uploaded the jar with the Spark command's --jars option.
Is this a bug?