
H2O Scala code compile error: not found: object ai

I am trying to compile and run simple H2O Scala code, but when I do sbt package I get errors. Am I missing something in the sbt file?

This is my h2o scala code

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.sql._

import ai.h2o.automl.AutoML
import ai.h2o.automl.AutoMLBuildSpec

import org.apache.spark.h2o._

object H2oScalaEg1 {

  def main(args: Array[String]): Unit = {

  val sparkConf1 = new SparkConf().setMaster("local[2]").setAppName("H2oScalaEg1App")

  val sparkSession1 = SparkSession.builder.config(conf = sparkConf1).getOrCreate()

  val h2oContext = H2OContext.getOrCreate(sparkSession1.sparkContext)

  import h2oContext._

  import java.io.File

  import h2oContext.implicits._

  import water.Key

  }

}

And this is my sbt file.

name := "H2oScalaEg1Name"

version := "1.0"

scalaVersion := "2.11.12"

scalaSource in Compile := baseDirectory.value / ""

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.3"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"

libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.2.0"

libraryDependencies += "ai.h2o" % "h2o-core" % "3.22.1.3" % "runtime" pomOnly()

When I do sbt package I get these errors

[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:7:8: not found: object ai

[error] import ai.h2o.automl.AutoML

[error]        ^

[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:8:8: not found: object ai

[error] import ai.h2o.automl.AutoMLBuildSpec

[error]        ^
[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:10:25: object h2o is not a member of package org.apache.spark

[error] import org.apache.spark.h2o._
[error]                         ^

[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:20:20: not found: value H2OContext
[error]   val h2oContext = H2OContext.getOrCreate(sparkSession1.sparkContext)
[error]                    ^


[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:28:10: not found: value water
[error]   import water.Key
[error]          ^
[error] 5 errors found

How can I fix this problem?

My Spark version is spark-2.2.3-bin-hadoop2.7.

Thanks,

marrel

marrel asked Mar 19 '26 12:03


1 Answer

pomOnly() in build.sbt tells the dependency-management handlers not to load the jar libs/artifacts for this dependency and to look only at the metadata.

Try to use libraryDependencies += "ai.h2o" % "h2o-core" % "3.22.1.3" instead.

Edit 1: Additionally, I think you are missing (at least) one library dependency: libraryDependencies += "ai.h2o" % "h2o-automl" % "3.22.1.3"

see: https://search.maven.org/artifact/ai.h2o/h2o-automl/3.22.1.5/pom

Edit 2: The last dependency you are missing is sparkling-water-core: libraryDependencies += "ai.h2o" % "sparkling-water-core_2.11" % "2.4.6" should do the trick.
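Putting the three fixes together, the full build.sbt might look like the sketch below. The versions are copied verbatim from the question and the edits above; in particular, the sparkling-water-core version (2.4.6) is the one suggested in Edit 2, so treat it as an assumption and check it against your Spark install before relying on it.

```scala
// build.sbt -- consolidated sketch combining the fixes suggested above
// (versions taken from the question and this answer; adjust to your setup)
name := "H2oScalaEg1Name"

version := "1.0"

scalaVersion := "2.11.12"

scalaSource in Compile := baseDirectory.value / ""

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.3"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.2.0"

// pomOnly() removed so the h2o-core jar is actually fetched
libraryDependencies += "ai.h2o" % "h2o-core" % "3.22.1.3"

// provides ai.h2o.automl.AutoML and ai.h2o.automl.AutoMLBuildSpec
libraryDependencies += "ai.h2o" % "h2o-automl" % "3.22.1.3"

// provides org.apache.spark.h2o._, H2OContext, and water.Key
libraryDependencies += "ai.h2o" % "sparkling-water-core_2.11" % "2.4.6"
```

After changing build.sbt, run sbt package again so the new artifacts are resolved.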

Here is the GitHub source for sparkling-water/core/src/main/scala/org/apache/spark/h2o.

Florian Baierl answered Mar 21 '26 02:03


