Apache Spark: Spark query appends keyspace to Spark temp table

I have a cassandraSqlContext on which I can do this:

cassandraSqlContext.setKeyspace("test");
because if I don't, it complains about the default keyspace not being set.
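
For context, this is roughly how the context is set up (a minimal sketch, assuming spark-cassandra-connector 1.x and an existing SparkContext named sc):

import org.apache.spark.sql.cassandra.CassandraSQLContext

val cassandraSqlContext = new CassandraSQLContext(sc)
cassandraSqlContext.setKeyspace("test")  // default keyspace for unqualified table names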

Now, when I run this code:

def insertIntoCassandra(siteMetaData: MetaData, dataFrame: DataFrame): Unit = {
  dataFrame.show()  // show() already prints the DataFrame and returns Unit

  // Register the DataFrame as a temp table prefixed with "spark_"
  val tableName = siteMetaData.getTableName.toLowerCase()
  dataFrame.registerTempTable("spark_" + tableName)
  System.out.println("Registered the spark table to spark_" + tableName)

  // Build and run the INSERT ... SELECT against Cassandra
  val columns = columnMap.get(siteMetaData.getTableName)
  val query = cassandraQueryBuilder.buildInsertQuery("test", tableName, columns)
  System.out.println("Query: " + query)
  cassandraSqlContext.sql(query)
  System.out.println("Query executed")
}
it gives me the following error log:

Registered the spark table to spark_test
Query: INSERT INTO TABLE test.tablename SELECT **the columns here** FROM spark_tablename
17/02/28 04:15:53 ERROR JobScheduler: Error running job streaming job 1488255351000 ms.0
java.util.concurrent.ExecutionException: java.io.IOException: Couldn't find test.tablename or any similarly named keyspace and table pairs
What I don't understand is why cassandraSqlContext doesn't execute the query exactly as it is printed, and why it appends the keyspace to the Spark temp table. Here is the query builder:

public String buildInsertQuery(String activeReplicaKeySpace, String tableName, String columns) {
    String sql = "INSERT INTO TABLE " + activeReplicaKeySpace + "." + tableName +
        " SELECT " + columns + " FROM spark_" + tableName;
    return sql;
}
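
For reference, called with illustrative arguments (the table name and column list below are placeholders taken from the log above), the builder produces exactly the statement that was printed:

val query = cassandraQueryBuilder.buildInsertQuery("test", "tablename", "col1, col2")
// => INSERT INTO TABLE test.tablename SELECT col1, col2 FROM spark_tablename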

The problem was that I was using two different cassandraSqlContext instances. In one of the methods I instantiated a new cassandraSqlContext, which conflicted with the one passed into the insertIntoCassandra method.
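
A minimal sketch of the fix, assuming spark-cassandra-connector 1.x and an existing SparkContext named sc: create the CassandraSQLContext once and pass that same instance to every method that registers or queries temp tables, because a table registered via registerTempTable is only visible to the SQLContext instance that registered it.

import org.apache.spark.sql.cassandra.CassandraSQLContext

// Create exactly one context and reuse the same instance everywhere.
val cassandraSqlContext = new CassandraSQLContext(sc)

// Broken pattern: instantiating a second context elsewhere, e.g.
//   val other = new CassandraSQLContext(sc)
// other.sql(query) cannot see the "spark_" temp table registered on
// cassandraSqlContext, which produces the "Couldn't find ..." error above.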