Apache Spark: error inserting a Spark DataFrame into HBase


I have a DataFrame with this schema:

 |-- Name1: string (nullable = true)
 |-- Name2: string (nullable = true)
 |-- App: string (nullable = true)
...
 |-- Duration: float (nullable = false)
I want to insert it into an HBase table. I define the catalog:

def catalog = s"""{
       |"table":{"namespace":"default", "name":"otarie"},
       |"rowkey":"key",
       |"columns":{
         |"col0":{"cf":"rowkey", "col":"key", "type":"string"},
         |"col1":{"cf":"cf1", "col":"Name1", "type":"boolean"},
         |"col2":{"cf":"cf2", "col":"Name2", "type":"double"},
         |"col3":{"cf":"cf3", "col":"App", "type":"float"},
            ........
         |"co27":{"cf":"cf27", "col":"Duration", "type":"string"}
       |}
     |}""".stripMargin
Then I try to write my DataFrame:

Append_Ot.write
  .options(Map(HBaseTableCatalog.tableCatalog -> catalog, HBaseTableCatalog.newTable -> "5"))
  .format("org.apache.hadoop.hbase.spark")
  .save()
I am using spark-shell, and I get the following error:

 <console>:155: error: not found: value HBaseTableCatalog
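A `not found: value HBaseTableCatalog` error in spark-shell means the symbol has not been imported (and the connector jar must also be on the classpath, e.g. via `--jars` or `--packages` when launching spark-shell). Assuming the Apache hbase-spark connector is being used, since the question's `format("org.apache.hadoop.hbase.spark")` points at it, the missing import would look like this; note that the older Hortonworks shc connector uses a different package path:

```scala
// Required before HBaseTableCatalog can be referenced in write options.
// Package path assumes the Apache hbase-spark connector; with the
// Hortonworks shc connector the class lives at
// org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog instead.
import org.apache.hadoop.hbase.spark.datasources.HBaseTableCatalog
```

After the import, the `write.options(Map(HBaseTableCatalog.tableCatalog -> catalog, ...))` call from the question should compile, provided the connector jar was supplied when spark-shell was started.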