I am trying out the hbase-spark connector. To start with, I am running this code. My pom dependencies are:
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.0.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.0.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-spark</artifactId>
        <version>2.0.0-alpha4</version>
    </dependency>
</dependencies>
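(To keep the two Spark artifacts from drifting apart, the version could also be factored into a single Maven property; the spark.version property below is just an illustrative sketch, not part of my actual pom:)

<properties>
    <spark.version>2.0.0</spark.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>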
I am getting the following exception while running the code:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.hadoop.hbase.spark.JavaHBaseContext.<init>(JavaHBaseContext.scala:46)
    at com.myproj.poc.sparkhbaseneo4j.App.main(App.java:71)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 14 more
The frame com.myproj.poc.sparkhbaseneo4j.App.main(App.java:71) corresponds to line 67 in the GitHub code.
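For context, here is a minimal sketch of what my code around App.java:71 does, reconstructed from the stack trace (the App class name matches the trace, but the SparkConf settings are placeholders and the actual GitHub code may differ slightly):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.spark.JavaHBaseContext;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class App {
    public static void main(String[] args) {
        // Placeholder app name and master; the real values don't affect the error.
        SparkConf sparkConf = new SparkConf().setAppName("hbase-spark-poc").setMaster("local[*]");
        JavaSparkContext jsc = new JavaSparkContext(sparkConf);
        Configuration hbaseConf = HBaseConfiguration.create();

        // Per the stack trace, constructing JavaHBaseContext is what triggers
        // loading of org.apache.spark.Logging, which is not on the classpath.
        JavaHBaseContext hbaseContext = new JavaHBaseContext(jsc, hbaseConf);

        jsc.stop();
    }
}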
I checked this thread. It says that I should use the same versions of all the libraries. Earlier, I had the 2.3.0 versions of the Spark libraries in my pom, but then I realized that the latest available version of hbase-spark is 2.0.0-alpha4, so I downgraded all the Spark libraries to 2.0.0. But I am still getting the same exception.
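For what it's worth, the Spark versions that actually land on the classpath (including anything hbase-spark pulls in transitively) can be checked with Maven's dependency tree, filtered to the Spark group:

mvn dependency:tree -Dincludes=org.apache.spark

If hbase-spark drags in a different spark-core version transitively, it should show up there.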
Or do I have to stick to the 1.x.x Spark versions to use this, since this answer says the Logging class was removed after version 1.5.2?