Import error for the Datastax Spark Cassandra connector Python module


I am trying to run the Python Spark shell with the following command:

 bin/pyspark --packages datastax:spark-cassandra-connector:1.5.0-RC1-s_2.11,org.apache.spark:spark-streaming-kafka_2.10:1.6.0
The output of this command shows that it is able to find the spark-cassandra-connector package:

resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
   confs: [default]
   found datastax#spark-cassandra-connector;1.5.0-RC1-s_2.11 in spark-packages
   found org.apache.cassandra#cassandra-clientutil;2.2.2 in central
   found com.datastax.cassandra#cassandra-driver-core;3.0.0-rc1 in central
   found io.netty#netty-handler;4.0.33.Final in central
   found io.netty#netty-buffer;4.0.33.Final in central
   found io.netty#netty-common;4.0.33.Final in central
However, when I try to import the package with either of the following statements, I get an import error:

from com.datastax import *
from com.datastax.spark.connector import *
Output:

ImportError: No module named com.datastax
ImportError: No module named com.datastax.spark.connector

Can anyone tell me what is going wrong here?

As far as I know, the Cassandra Connector does not contain a single line of Python code, let alone Python modules with names like these. Python interoperability is implemented through the Data Source API, which can be used without any additional imports:

sqlContext.read.format("org.apache.spark.sql.cassandra").options(...).load(...)
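
As a concrete illustration, here is a minimal sketch of such a read in the pyspark shell; the keyspace and table names (my_keyspace, my_table) are hypothetical placeholders, and sqlContext is the SQLContext the shell creates for you:

# Minimal sketch (pyspark shell): sqlContext is provided by the shell;
# "my_keyspace" and "my_table" are hypothetical placeholder names.
df = (sqlContext.read
      .format("org.apache.spark.sql.cassandra")
      .options(keyspace="my_keyspace", table="my_table")
      .load())
df.show()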

Even --packages is only used to distribute JVM dependencies. External (Python, R) dependencies have to be distributed or installed independently, for example via pyFiles (the --py-files option).
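
For example, a sketch of passing both kinds of dependencies when launching the shell (my_python_deps.zip is a hypothetical archive of your own Python code):

bin/pyspark --packages datastax:spark-cassandra-connector:1.5.0-RC1-s_2.11 \
            --py-files my_python_deps.zip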

Hi, how did you solve this issue? Please let me know.