
Problem loading HBase from PySpark: Failed to load converter: org.apache.spark.examples.pythonconverters


When loading HBase from PySpark, I get the following error message:

"Failed to load converter: org.apache.spark.examples.pythonconverters.StringToImmutableBytesWritableConverter"

I am using Spark 2.0 and HBase 1.1.2.2.5.0.0-1245.

I load HBase with the following code:

# Build (key, value) pairs: key is the row key, value is [row key, column family, qualifier, value]
datamap = temp_rdd.map(lambda (x, y): (str(x), [str(x), "cf1", "a", y]))

host = 'xyz'
table = 'test'

# Hadoop/HBase output configuration for TableOutputFormat
conf = {"hbase.zookeeper.quorum": host,
        "hbase.mapred.outputtable": table,
        "mapreduce.outputformat.class": "org.apache.hadoop.hbase.mapreduce.TableOutputFormat",
        "mapreduce.job.output.key.class": "org.apache.hadoop.hbase.io.ImmutableBytesWritable",
        "mapreduce.job.output.value.class": "org.apache.hadoop.io.Writable"}

# Converters from the Spark examples package
keyConv = "org.apache.spark.examples.pythonconverters.StringToImmutableBytesWritableConverter"
valueConv = "org.apache.spark.examples.pythonconverters.StringListToPutConverter"

datamap.saveAsNewAPIHadoopDataset(conf=conf, keyConverter=keyConv, valueConverter=valueConv)

Can anyone help me with this?

That class only exists in Spark's examples jar.
To use it, you need to add spark-examples.jar to your spark.driver.extraClassPath and spark.executor.extraClassPath.
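For reference, a minimal sketch of putting the jar on both classpaths from inside the script; the jar path below is an assumption based on a typical HDP 2.5 layout, so adjust it to your installation. Note that spark.driver.extraClassPath normally has to be supplied when the driver JVM is launched, so passing these settings (or --jars) to spark-submit is usually the more reliable route.

# Sketch only: the examples jar location is an assumption; adjust to your environment.
from pyspark import SparkConf, SparkContext

examples_jar = "/usr/hdp/current/spark-client/lib/spark-examples.jar"  # assumed path

conf = (SparkConf()
        .setAppName("pyspark-hbase-load")
        .set("spark.executor.extraClassPath", examples_jar)   # executors load the converter classes
        .set("spark.driver.extraClassPath", examples_jar))    # driver also needs them on its classpath

sc = SparkContext(conf=conf)

Equivalently, the jar can be passed on the spark-submit command line, for example via --jars or --driver-class-path.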

Thanks RanP. The problem was resolved after executing the script with the jar file included.