Run Spark locally with IntelliJ

I wrote this:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object ProcessingApp extends App {
  val sparkConf = new SparkConf()
    .setAppName("er")
    .setMaster("local")
  val sparkSession: SparkSession = SparkSession.builder().config(sparkConf).getOrCreate()

  val test = sparkSession.version

  println(test)

}

I want to run it locally in my IntelliJ IDE by right-clicking and choosing "Run ProcessingApp", but this doesn't work. My Spark dependencies are not marked provided in the build.sbt file. I am getting this error:

Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.SparkSession
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass
asked Oct 21 '25 by scalacode

1 Answer

Change the scope of all the Spark dependencies from provided to compile.
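For reference, a minimal build.sbt that keeps Spark on the run classpath might look like the sketch below. The Scala and Spark version numbers here are assumptions; use the ones your project actually targets. The key point is that the Spark dependencies carry no `% Provided` qualifier, so sbt's default compile scope applies and IntelliJ can find `SparkSession` at runtime:

```scala
// build.sbt -- a minimal sketch; version numbers are assumptions
ThisBuild / scalaVersion := "2.12.18"

libraryDependencies ++= Seq(
  // No "% Provided" here: compile scope keeps these jars on the
  // classpath when you run ProcessingApp from IntelliJ.
  "org.apache.spark" %% "spark-core" % "3.5.1",
  "org.apache.spark" %% "spark-sql"  % "3.5.1"
)
```

If you need the dependencies to stay provided for cluster deployment, recent IntelliJ versions also offer a checkbox in the run configuration (roughly "Include dependencies with 'Provided' scope") that achieves the same effect without touching build.sbt.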

answered Oct 24 '25 by Chitral Verma


