NoSuchMethodError: org.apache.spark.internal.Logging

I'm trying to run an application built with Spark Structured Streaming, reading its input from Kafka. The Spark version is 2.4.0 and the Scala version is 2.12.7. I'm building a fat JAR with sbt in a multi-module project. Building the JAR is not a problem, but when I spark-submit it, a NoSuchMethodError is thrown.
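For context, the failing call is the Kafka readStream setup at ThisIsMyClass.scala:28, per the stack trace below. Roughly this (a sketch; broker and topic names are placeholders):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("ThisIsMyClass").getOrCreate()

val df = spark.readStream
  .format("kafka")                                   // resolved through DataSourceRegister
  .option("kafka.bootstrap.servers", "broker1:9092") // placeholder
  .option("subscribe", "some-topic")                 // placeholder
  .load()                                            // lookupDataSource fails here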

What I have already tried:

  1. I removed the provided scope from spark-sql-kafka-0-10:

val sparkSqlKafka = "org.apache.spark" %% "spark-sql-kafka-0-10" % sparkVersion

  2. In assemblyMergeStrategy I added the line below (the full merge block is sketched after the list):

case "META-INF/services/org.apache.spark.sql.sources.DataSourceRegister" => MergeStrategy.concat

This is the whole error log.

2019-01-08 11:55:12 ERROR ApplicationMaster:91 - User class threw exception: java.util.ServiceConfigurationError: org.apache.spark.sql.sources.DataSourceRegister: Provider org.apache.spark.sql.kafka010.KafkaSourceProvider could not be instantiated
java.util.ServiceConfigurationError: org.apache.spark.sql.sources.DataSourceRegister: Provider org.apache.spark.sql.kafka010.KafkaSourceProvider could not be instantiated
    at java.util.ServiceLoader.fail(ServiceLoader.java:232)
    at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
    at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
    at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
    at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
    at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
    at scala.collection.Iterator$class.foreach(Iterator.scala:891)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at scala.collection.TraversableLike$class.filterImpl(TraversableLike.scala:247)
    at scala.collection.TraversableLike$class.filter(TraversableLike.scala:259)
    at scala.collection.AbstractTraversable.filter(Traversable.scala:104)
    at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:630)
    at org.apache.spark.sql.streaming.DataStreamReader.load(DataStreamReader.scala:161)
    at ThisIsMyClass$.main(ThisIsMyClass.scala:28)
    at ThisIsMyClass.main(ThisIsMyClass.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:678)
Caused by: java.lang.NoSuchMethodError: org.apache.spark.internal.Logging.$init$(Lorg/apache/spark/internal/Logging;)V
    at org.apache.spark.sql.kafka010.KafkaSourceProvider.<init>(KafkaSourceProvider.scala:44)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at java.lang.Class.newInstance(Class.java:442)
    at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
    ... 19 more


Edit 1.

All the dependencies are below.

val sparkVersion = "2.4.0"
val typesafeConfigVersion = "1.3.3"
val scalaTestVersion = "3.0.5"
val junitVersion = "4.12"
val sparkCore = "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
val sparkSql = "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"
val sparkMllib = "org.apache.spark" %% "spark-mllib" % sparkVersion % "provided"
val typesafeConfig = "com.typesafe" % "config" % typesafeConfigVersion
val scalaTest = "org.scalatest" %% "scalatest" % scalaTestVersion % Test
val junit = "junit" % "junit" % junitVersion % Test
val logback = "ch.qos.logback" % "logback-classic" % "1.2.3"
val scalaLogging = "com.typesafe.scala-logging" %% "scala-logging" % "3.9.0"
val sparkStreaming = "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"
val sparkSqlKafka = "org.apache.spark" %% "spark-sql-kafka-0-10" % sparkVersion
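These vals are wired into the module roughly like this (a sketch; the module name streamingApp is hypothetical):

lazy val streamingApp = (project in file("streaming-app"))
  .settings(
    libraryDependencies ++= Seq(
      sparkCore, sparkSql, sparkMllib, sparkStreaming, // "provided": the cluster supplies these
      sparkSqlKafka,                                   // not "provided": must ship inside the fat JAR
      typesafeConfig, logback, scalaLogging,
      scalaTest, junit
    )
  )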


Edit 2.

I found a dependency version conflict involving slf4j-api.

So I changed my build configuration to use only one version of slf4j-api, the one that matches the spark-core dependency, and excluded all the other copies, roughly as sketched below.
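A sketch of the pinning and exclusion in sbt (1.7.16 as the slf4j-api version that spark-core 2.4.0 pulls in is an assumption):

val slf4jApi = "org.slf4j" % "slf4j-api" % "1.7.16" // assumed: match whatever spark-core 2.4.0 resolves

libraryDependencies ++= Seq(
  slf4jApi,
  // strip transitive slf4j-api copies pulled in by other libraries
  scalaLogging excludeAll ExclusionRule(organization = "org.slf4j", name = "slf4j-api")
)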

And still the SAME error. :(


Edit 3.

I've added --packages org.apache.spark:spark-sql-kafka-0-10_2.12:2.4.0 to my spark-submit script, as shown below.
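The submit command now looks roughly like this (a sketch; the JAR path is a placeholder, and YARN cluster mode is inferred from the ApplicationMaster in the log):

spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --packages org.apache.spark:spark-sql-kafka-0-10_2.12:2.4.0 \
  --class ThisIsMyClass \
  path/to/my-app-assembly.jar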

And still the SAME error. 😢


1 Answer

Problem solved.

When I opened spark-shell on the cluster, I found that its Spark version is 2.4.0 but its Scala version is 2.11, while the scalaVersion in my build.sbt was 2.12. Because of the %% operator, the fat JAR bundled spark-sql-kafka-0-10_2.12, and Scala 2.12's trait encoding calls the static Logging.$init$ method, which does not exist in the cluster's Scala 2.11 build of Spark. That is exactly the NoSuchMethodError above.

The Scala version was the key!
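For anyone else hitting this, the fix is a one-line change in build.sbt (2.11.12 is an assumed 2.11.x patch release; any version matching the cluster's Scala works):

scalaVersion := "2.11.12" // assumed 2.11.x; must match the cluster's Spark build
// "%%" now resolves spark-sql-kafka-0-10_2.11 instead of _2.12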

Thanks to all of you.


