I am trying to run a Spark program in Scala IDE for Eclipse using Maven, but I get a java.lang.NoClassDefFoundError on the line where I initialize SparkConf. I also tried adding a dependency on Guava 14.0.1, but that did not solve the problem either. The stack trace is:
Exception in thread "main" java.lang.NoClassDefFoundError: org/spark_project/guava/cache/CacheLoader
    at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:73)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:68)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:55)
    at oursparkapp2.SimpleApp$.main(SimpleApp.scala:8)
    at oursparkapp2.SimpleApp.main(SimpleApp.scala)
Caused by: java.lang.ClassNotFoundException: org.spark_project.guava.cache.CacheLoader
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 5 more
The Scala program (SimpleApp.scala) that I am trying to run is as follows:
package oursparkapp2

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Hlelo")
  }
}
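(For context, a minimal sketch of the same app that actually exercises a SparkContext once the error is fixed; the setMaster("local[*]") call and the small sum job are assumptions for running directly inside the IDE, not part of the original failing code:)

package oursparkapp2

import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  def main(args: Array[String]): Unit = {
    // setMaster("local[*]") is assumed here so the app can run inside the IDE;
    // with spark-submit the master would normally be supplied on the command line.
    val conf = new SparkConf().setAppName("Hlelo").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Trivial job just to confirm the context is usable.
    println(sc.parallelize(1 to 100).sum())

    sc.stop()
  }
}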
My pom.xml file is as follows:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>ez.spark</groupId>
  <artifactId>oursparkapp2</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <dependencies>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>2.8.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.2.0</version>
      <scope>provided</scope>
    </dependency>
  </dependencies>
</project>
Furthermore, Spark runs fine in my terminal when started with the spark-shell command.
I solved this problem by adding the dependency:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-network-common_2.11</artifactId>
  <version>2.1.0</version>
</dependency>
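For reference, the resulting dependencies section would look roughly like the one below. spark-network-common is the artifact that carries the shaded Guava classes under org.spark_project.guava, which appears to be why it resolves the NoClassDefFoundError; the comments and the suggestion to align its version with spark-core (2.2.0) are my own assumptions.

<dependencies>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.8.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.0</version>
    <scope>provided</scope>
  </dependency>
  <!-- Supplies org.spark_project.guava.* (the shaded Guava) at runtime.
       Matching the spark-core version (2.2.0) instead of 2.1.0 is probably safer,
       but 2.1.0 is what worked for me here. -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-network-common_2.11</artifactId>
    <version>2.1.0</version>
  </dependency>
</dependencies>

After updating the POM, running mvn dependency:tree should show spark-network-common_2.11 on the compile classpath.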