Scala H2O implicit conversion causes a compile error


The code below throws an error on the line that assigns the frame, most likely because an implicit conversion is not being applied. The error is:

type mismatch;
 found   : org.apache.spark.h2o.RDD[Int] (which expands to) org.apache.spark.rdd.RDD[Int]
 required: org.apache.spark.h2o.H2OFrame (which expands to) water.fvec.H2OFrame

And the code:

import org.apache.spark.h2o._
import org.apache.spark._
import org.apache.spark.SparkContext._

object App1 extends App {

  val conf = new SparkConf()
  conf.setAppName("Test")
  conf.setMaster("local[1]")
  conf.set("spark.executor.memory", "1g")

  val sc = new SparkContext(conf)

  val rawData = sc.textFile("c:\\spark\\data.csv")
  val data = rawData.map(line => line.split(',').map(_.toDouble))
  val response: RDD[Int] = data.map(row => row(0).toInt)

  val h2oResponse: H2OFrame = response   // <-- this line throws the error

  sc.stop
}
All you are missing is the implicit conversion from the h2oContext.


Did you try importing h2oContext.implicits._? It implicitly converts an RDD to an H2OFrame. Ref:

I get the following error:

value toDF is not a member of org.apache.spark.h2o.RDD[Int]
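The mechanism behind that answer can be shown without Spark: an implicit conversion in scope lets an assignment between otherwise incompatible types compile. A minimal, self-contained sketch, where Dataset and Frame are hypothetical stand-ins for RDD[Int] and H2OFrame (not Spark or H2O types):

```scala
import scala.language.implicitConversions

// Hypothetical stand-ins for RDD[Int] and H2OFrame.
case class Dataset(values: Seq[Int])
case class Frame(columns: Seq[Int])

object Implicits {
  // Analogous to the conversion that h2oContext.implicits._ brings into scope.
  implicit def datasetToFrame(ds: Dataset): Frame = Frame(ds.values)
}

object ConversionDemo extends App {
  val response = Dataset(Seq(1, 2, 3))

  // Without the import, this assignment fails with the same shape of error
  // as in the question: "type mismatch; found: Dataset, required: Frame".
  import Implicits._
  val frame: Frame = response

  println(frame.columns.sum)   // prints 6
}
```

The compiler only searches for conversions that are in scope, which is why the bare assignment in the question fails even though Sparkling Water ships a suitable conversion.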
You need to import sqlContext.implicits._

Why sqlContext.implicits? Also, I don't have a sqlContext variable.

.toDF() is made available by sqlContext.implicits. You can create the sqlContext at the point where you create the sparkContext.

Now it compiles, but when I run it I get:
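The toDF error in the comments has the same root cause: toDF is not a method on RDD; sqlContext.implicits._ adds it through an implicit enrichment. The pattern can be sketched without Spark, where Frame and RichRows below are hypothetical, not the Spark API:

```scala
case class Frame(rows: Seq[Int])

object FrameImplicits {
  // Analogous to sqlContext.implicits._: toDF is not a member of Seq[Int];
  // the implicit class grafts it on while the import is in scope.
  implicit class RichRows(val rows: Seq[Int]) extends AnyVal {
    def toDF(): Frame = Frame(rows)
  }
}

object EnrichmentDemo extends App {
  // Without the import, Seq(1, 2, 3).toDF() fails with
  // "value toDF is not a member of Seq[Int]".
  import FrameImplicits._
  println(Seq(1, 2, 3).toDF().rows.length)   // prints 3
}
```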
When using Sparkling Water as a Spark package via the --packages option, the "no.priv.garshol.duke:duke:1.2" dependency has to be specified explicitly, due to a bug in Spark's dependency resolution.
import sqlContext.implicits._   // provides toDF on the RDD
import h2oContext.implicits._   // provides the implicit conversion to H2OFrame

val h2oResponse: H2OFrame = response.toDF()