Is a specific sbt version required to compile the Cassandra Spark connector?

I am assembling the Cassandra Spark connector. I simply followed these steps:

  • Git clone the connector code
  • Run "sbt assembly"
  • During the assembly stage, I run into the following error:

    [info] Done updating.
    [warn] There may be incompatibilities among your library dependencies.
    [warn] Here are some of the libraries that were evicted:
    [warn]  * com.eed3si9n:sbt-assembly:0.11.2 -> 0.13.0
    [warn] Run 'evicted' to see detailed eviction warnings
    [info] Compiling 5 Scala sources to /home/xxxxxx/Development/iAdLearning/spark-cassandra-connector/project/target/scala-2.10/sbt-0.13/classes...
    [error] /home/xxxxxx/Development/iAdLearning/spark-cassandra-connector/project/Settings.scala:23: object Plugin is not a member of package sbtassembly
    [error] import sbtassembly.Plugin._
    [error]                    ^
    [error] /home/xxxxxx/Development/iAdLearning/spark-cassandra-connector/project/Settings.scala:24: not found: object AssemblyKeys
    [error] import AssemblyKeys._
    [error]        ^
    [error] /home/xxxxxx/Development/iAdLearning/spark-cassandra-connector/project/Settings.scala:217: not found: value assemblySettings
    [error]   lazy val sbtAssemblySettings = assemblySettings ++ Seq(
    [error]                                  ^
    [error] three errors found
    [error] (compile:compileIncremental) Compilation failed
    

I am running sbt 0.13.6.

Building the Spark Cassandra connector requires sbt-assembly version 0.11.2, which is declared in the project's plugins.sbt. You most likely have a newer sbt-assembly (version 0.13.0) installed in your global plugins folder (~/.sbt/0.13/plugins), and that is what is causing this failure.
Rename the plugins folder under ~/.sbt/0.13 and try building again.
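
For reference, a minimal sketch of the two competing declarations (the exact contents of the repository's plugins.sbt may differ, so treat this as an illustration rather than the connector's actual build files):

    // project/plugins.sbt in the connector (sketch): pins the 0.11.x plugin whose API
    // (sbtassembly.Plugin, AssemblyKeys, assemblySettings) project/Settings.scala imports.
    addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

    // ~/.sbt/0.13/plugins/plugins.sbt (hypothetical global file): a newer declaration like
    // this one evicts 0.11.2 in favour of 0.13.0, which no longer provides those symbols.
    addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")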

You can use the packaged sbt by running:

    ./sbt/sbt assembly
    
This will automatically download and use a valid version of sbt.
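
As background on why the eviction produces exactly these compile errors: from sbt-assembly 0.12.0 onward the plugin became an sbt auto-plugin, and the old sbtassembly.Plugin object (with AssemblyKeys and assemblySettings) was removed. A minimal sketch of the two styles, assuming standard sbt-assembly usage; the setting shown is illustrative, not the connector's actual configuration:

    // Old style (sbt-assembly 0.11.x) in an sbt 0.13 build definition, matching the
    // imports that fail in the connector's project/Settings.scala:
    import sbtassembly.Plugin._
    import AssemblyKeys._

    lazy val sbtAssemblySettings = assemblySettings ++ Seq(
      test in assembly := {}  // illustrative: skip tests when building the assembly
    )

    // Newer style (sbt-assembly 0.12.0+): the plugin is auto-enabled, so no imports or
    // assemblySettings are needed, and some keys were renamed (e.g. jarName became
    // assemblyJarName), which is why code written for 0.11.x no longer compiles.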