Problem loading HBase from PySpark: Failed to load converter: org.apache.spark.examples.pythonconverters
When loading data into HBase from PySpark, I get the following error message:

"Failed to load converter: org.apache.spark.examples.pythonconverters.StringToImmutableBytesWritableConverter"

I am using Spark 2.0 and HBase 1.1.2.2.5.0.0-1245, and I load HBase with the following steps:
# Map each (key, value) pair to (rowkey, [rowkey, column family, qualifier, value]).
# Note: tuple-unpacking lambdas like "lambda (x, y):" are Python 2 only,
# so the pair is unpacked by index here.
datamap = temp_rdd.map(lambda kv: (str(kv[0]), [str(kv[0]), "cf1", "a", kv[1]]))

host = 'xyz'
table = 'test'
conf = {"hbase.zookeeper.quorum": host,
        "hbase.mapred.outputtable": table,
        "mapreduce.outputformat.class": "org.apache.hadoop.hbase.mapreduce.TableOutputFormat",
        "mapreduce.job.output.key.class": "org.apache.hadoop.hbase.io.ImmutableBytesWritable",
        "mapreduce.job.output.value.class": "org.apache.hadoop.io.Writable"}
keyConv = "org.apache.spark.examples.pythonconverters.StringToImmutableBytesWritableConverter"
valueConv = "org.apache.spark.examples.pythonconverters.StringListToPutConverter"
datamap.saveAsNewAPIHadoopDataset(conf=conf, keyConverter=keyConv, valueConverter=valueConv)
Can anyone help me?

That class exists only in Spark's examples jar. To use it, you need to add spark-examples.jar to both spark.driver.extraClassPath and spark.executor.extraClassPath.

Thanks, RanP. The problem was resolved after running the script with that jar file on the classpath.
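The accepted fix can be sketched as a spark-submit invocation. The jar location below is an assumption (it varies by distribution); substitute wherever spark-examples.jar lives in your installation, and replace load_hbase.py with your own script name:

```shell
# Sketch: put the examples jar on both the driver and executor classpaths
# so the pythonconverters classes can be loaded at runtime.
# /path/to/spark-examples.jar is a placeholder -- locate the jar in your
# own Spark installation (e.g. under its lib/ or examples/jars/ directory).
spark-submit \
  --conf spark.driver.extraClassPath=/path/to/spark-examples.jar \
  --conf spark.executor.extraClassPath=/path/to/spark-examples.jar \
  load_hbase.py
```

Passing the jar via `--jars /path/to/spark-examples.jar` ships it to the executors as well, but extraClassPath is what puts it on the JVM classpath early enough for the converter lookup.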