I have written a Java program for Spark, but I am not able to run it from the command line.
I followed the steps given in the Quick Start guide, but I am getting the following error. Please help me with this problem.
Here is the error:
hadoopnod@hadoopnod:~/spark-1.2.1/bin$ ./run-example "SimpleApp " --master local /home/hadoopnod/Spark_Java/target/simple-project-1.0.jar
java.lang.ClassNotFoundException: org.apache.spark.examples.SimpleApp
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:342)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Create a JAR file using the following command. You can find the SimpleApp.class file in the "target/classes" folder; cd to that directory.
jar cvf file.jar SimpleApp.class
Put this JAR file into your project's target directory. This JAR contains your compiled SimpleApp class, which is what you hand to Spark when submitting your job.
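A ClassNotFoundException like the one above often means the class file is missing from the jar, or sits at the wrong path inside it. As a small sketch (the helper class name is hypothetical; the file name file.jar and the entry paths are assumptions taken from the commands in this answer), you can list a jar's entries with the standard java.util.jar API to check:

```java
import java.util.Enumeration;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

// ListJarEntries.java -- hypothetical helper, not part of Spark.
// For --class SimpleApp the jar must contain "SimpleApp.class";
// for --class org.apache.spark.examples.SimpleApp it must contain
// "org/apache/spark/examples/SimpleApp.class".
public class ListJarEntries {

    // Returns true if the jar at jarPath contains the given entry name.
    static boolean containsEntry(String jarPath, String entryName) throws Exception {
        try (JarFile jar = new JarFile(jarPath)) {
            return jar.getJarEntry(entryName) != null;
        }
    }

    public static void main(String[] args) throws Exception {
        // Print every entry so you can see where your class actually is.
        try (JarFile jar = new JarFile(args[0])) {
            Enumeration<JarEntry> entries = jar.entries();
            while (entries.hasMoreElements()) {
                System.out.println(entries.nextElement().getName());
            }
        }
    }
}
```

Run it as `java ListJarEntries target/file.jar` and confirm that SimpleApp.class appears at the path matching the --class value you pass to spark-submit.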
cd to your Spark directory. I am using spark-1.4.0-bin-hadoop2.6. Your prompt looks like this:
spark-1.4.0-bin-hadoop2.6>
Submit your Spark program using spark-submit. If your class sits inside a package, with the structure Harsha has explained in another answer, then pass
--class org.apache.spark.examples.SimpleApp
otherwise
--class SimpleApp
Finally, submit your Spark program:
spark-1.4.0-bin-hadoop2.6>./bin/spark-submit --class SimpleApp --master local[2] /home/hadoopnod/Spark_Java/target/file.jar
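The value given to --class must be the fully qualified name of the class, i.e. its package plus the class name. A minimal sketch, assuming a SimpleApp.java with no package declaration (which is why plain --class SimpleApp works in the command above):

```java
// SimpleApp.java -- minimal naming sketch, assuming no package declaration.
// Because there is no package, the fully qualified name is just
// "SimpleApp". If the file instead began with
//   package org.apache.spark.examples;
// you would need --class org.apache.spark.examples.SimpleApp.
public class SimpleApp {
    public static void main(String[] args) {
        // The name printed here is exactly what spark-submit's
        // --class flag expects.
        System.out.println(SimpleApp.class.getName()); // prints "SimpleApp"
    }
}
```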