Importing a class on the Apache Spark shell where the package name begins with the word "spark"
I have opened the Spark shell. In the shell we already have a variable -
spark: org.apache.spark.sql.SparkSession
I have a third-party Jar whose package name begins with "spark", like -
When I try to import the above package on the Spark shell, I get an exception -
scala> import spark.myreads.one.KafkaProducerWrapper
<console>:38: error: value myreads is not a member of org.apache.spark.sql.SparkSession
import spark.myreads.one.KafkaProducerWrapper
How can I import such a package on the Spark shell to resolve the above conflict?
I am using Spark-2.0.0, JDK-1.8, and Scala-2.11.
Use _root_ as the leading part, as below:
import _root_.spark.myreads.one.KafkaProducerWrapper
$SPARK_HOME/bin/spark-shell --packages spark.myreads.one.KafkaProducerWrapper
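The shadowing can be reproduced outside the REPL as well. The sketch below (a hypothetical package and object, not the asker's actual Jar) shows a local value named `spark` hiding a top-level package of the same name, and `_root_` forcing resolution from the root package:

```scala
// Hypothetical package that begins with "spark", standing in for the third-party Jar.
package spark.myreads.one {
  object KafkaProducerWrapper {
    val name = "wrapper"
  }
}

object Demo extends App {
  // In the Spark shell this is the predefined SparkSession value named `spark`;
  // here a plain String plays the same role.
  val spark = "a local value shadowing the package name"

  // `import spark.myreads.one.KafkaProducerWrapper` would not compile here,
  // because `spark` now resolves to the local String value.
  // Prefixing with _root_ makes the compiler start from the root package instead.
  import _root_.spark.myreads.one.KafkaProducerWrapper

  println(KafkaProducerWrapper.name)
}
```

The `_root_` prefix is part of the Scala language itself, so it works the same way in the REPL and in compiled code.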