
Apache Spark error: value cassandraTable is not a member of org.apache.spark.SparkContext


I want to access a Cassandra table from Spark. These are the versions I am using:

  • Spark: spark-1.4.1-bin-hadoop2.6
  • Cassandra: apache-cassandra-2.2.3
  • Spark Cassandra Connector: spark-cassandra-connector-java_2.10-1.5.0-M2.jar
Here is the script:

sc.stop
import com.datastax.spark.connector._, org.apache.spark.SparkContext, org.apache.spark.SparkContext._, org.apache.spark.SparkConf
val conf = new SparkConf(true).set("spark.cassandra.connection.host", "localhost")
val sc = new SparkContext(conf)
val test_spark_rdd = sc.cassandraTable("test1", "words")
When I run the last statement, I get the following error:

:32: error: value cassandraTable is not a member of org.apache.spark.SparkContext
       val test_spark_rdd = sc.cassandraTable("test1", "words")

Any hints on resolving this error would be very helpful.


Thanks

Actually, on the shell you only need to import the corresponding package; nothing extra is required.


e.g. scala> import com.datastax.spark.connector._
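The reason the import matters: the connector adds cassandraTable to SparkContext through an implicit enrichment, so the method only exists once the connector's package contents are in scope. The snippet below is a self-contained toy analogue of that pattern, not the connector's real classes; FakeSparkContext and RichContext are made-up names used only to illustrate the mechanism.

```scala
// Toy analogue of how spark-cassandra-connector adds cassandraTable:
// an implicit class enriches the context type with a new method.
// All names here are illustrative, not the connector's actual API.
object ConnectorStyleEnrichment {
  class FakeSparkContext // stand-in for org.apache.spark.SparkContext

  // Without this implicit in scope, fake.cassandraTable(...) fails to
  // compile with "value cassandraTable is not a member of FakeSparkContext",
  // the same shape of error as in the question.
  implicit class RichContext(val sc: FakeSparkContext) {
    def cassandraTable(keyspace: String, table: String): String =
      s"rows of $keyspace.$table"
  }

  def main(args: Array[String]): Unit = {
    val sc = new FakeSparkContext
    // Compiles only because RichContext is in implicit scope here.
    println(sc.cassandraTable("test1", "words"))
  }
}
```

In the real connector the enrichment lives in the com.datastax.spark.connector package object, which is why `import com.datastax.spark.connector._` in the shell makes `sc.cassandraTable(...)` compile.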

First of all, the connector version should match the Spark version, so for Spark 1.4 you should use connector 1.4.
I tried the spark-cassandra-connector-java_2.10-1.4.0.jar version. Still the same error.
Are you sure the import statement did not throw an exception? If not, how do you launch the shell? With --packages?
I launch the shell using the command below: ./bin/spark-shell --jars /opt/spark-1.4.0-bin-hadoop2.6/lib/spark-cassandra-connector_2.10-1.4.0.jar
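The comment thread boils down to two steps: make a matching connector jar visible to the shell, then import the package inside it. A minimal sketch of both launch styles (the jar path follows the one mentioned in the thread; adjust it and the versions to your installation):

```shell
# Option 1: pass the connector jar explicitly (path taken from the thread above)
./bin/spark-shell --jars /opt/spark-1.4.0-bin-hadoop2.6/lib/spark-cassandra-connector_2.10-1.4.0.jar

# Option 2: let Spark resolve the connector from Maven Central,
# using a connector version that matches the Spark 1.4 line
./bin/spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.10:1.4.0

# Then, inside the shell:
#   scala> import com.datastax.spark.connector._
#   scala> val rdd = sc.cassandraTable("test1", "words")
```

With --packages, Spark downloads the artifact and its transitive dependencies, which avoids the missing-dependency problems that can occur when pointing --jars at a single non-assembly jar.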