
Scala multi-project sbt assembly issue


I am trying to create a project with two main classes: SparkConsumer and KafkaProducer. To do this, I introduced a multi-project structure in the sbt build. The consumer and producer modules are separate sub-projects, and the core project contains the utils used by both of them. Root is the main project. Common settings and library dependencies are also defined. However, for some reason the project does not compile, and all sbt-assembly related settings are marked in red, even though plugins.sbt, where the sbt-assembly plugin is defined, is located in the root project.

What is the solution to this problem?

The project structure looks like this:
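A plausible layout, reconstructed from the build.sbt below and standard sbt multi-project conventions (the file names under src/ are assumptions):

demo/
  build.sbt
  project/
    plugins.sbt                  // sbt-assembly plugin declared here
  core/
    src/main/scala/              // utils shared by consumer and producer
  consumer/
    src/main/scala/consumer/SparkConsumer.scala
  producer/
    src/main/scala/producer/KafkaCheckinsProducer.scala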

Here is the build.sbt file:

lazy val overrides = Seq("com.fasterxml.jackson.core" % "jackson-core" % "2.9.5",
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.9.5",
  "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.9.5")

lazy val commonSettings = Seq(
  name := "Demo",
  version := "0.1",
  scalaVersion := "2.11.8",
  resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven",
  dependencyOverrides += overrides
)

lazy val assemblySettings = Seq(
  assemblyMergeStrategy in assembly := {
    case PathList("org","aopalliance", xs @ _*) => MergeStrategy.last
    case PathList("javax", "inject", xs @ _*) => MergeStrategy.last
    case PathList("javax", "servlet", xs @ _*) => MergeStrategy.last
    case PathList("javax", "activation", xs @ _*) => MergeStrategy.last
    case PathList("org", "apache", xs @ _*) => MergeStrategy.last
    case PathList("com", "google", xs @ _*) => MergeStrategy.last
    case PathList("com", "esotericsoftware", xs @ _*) => MergeStrategy.last
    case PathList("com", "codahale", xs @ _*) => MergeStrategy.last
    case PathList("com", "yammer", xs @ _*) => MergeStrategy.last
    case PathList("org", "slf4j", xs @ _*) => MergeStrategy.last
    case PathList("org", "neo4j", xs @ _*) => MergeStrategy.last
    case PathList("com", "typesafe", xs @ _*) => MergeStrategy.last
    case PathList("net", "jpountz", xs @ _*) => MergeStrategy.last
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard
    case "about.html" => MergeStrategy.rename
    case "META-INF/ECLIPSEF.RSA" => MergeStrategy.last
    case "META-INF/mailcap" => MergeStrategy.last
    case "META-INF/mimetypes.default" => MergeStrategy.last
    case "plugin.properties" => MergeStrategy.last
    case "log4j.properties" => MergeStrategy.last
    case x =>
      val oldStrategy = (assemblyMergeStrategy in assembly).value
      oldStrategy(x)
  }
)

val sparkVersion = "2.2.0"

lazy val commonDependencies = Seq(
  "org.apache.kafka" %% "kafka" % "1.1.0",
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion,
  "neo4j-contrib" % "neo4j-spark-connector" % "2.1.0-M4",
  "com.typesafe" % "config" % "1.3.0",
  "org.neo4j.driver" % "neo4j-java-driver" % "1.5.1",
  "com.opencsv" % "opencsv" % "4.1",
  "com.databricks" %% "spark-csv" % "1.5.0",
  "com.github.tototoshi" %% "scala-csv" % "1.3.5",
  "org.elasticsearch" %% "elasticsearch-spark-20" % "6.2.4"
)

lazy val root = (project in file("."))
  .settings(
    commonSettings,
    assemblySettings,
    libraryDependencies ++= commonDependencies,
    assemblyJarName in assembly := "demo_root.jar"
  )
  .aggregate(core, consumer, producer)


lazy val core = project
  .settings(
    commonSettings,
    assemblySettings,
    libraryDependencies ++= commonDependencies
  )

lazy val consumer = project
  .settings(
    commonSettings,
    assemblySettings,
    libraryDependencies ++= commonDependencies,
    mainClass in assembly := Some("consumer.SparkConsumer"),
    assemblyJarName in assembly := "demo_consumer.jar"
  )
  .dependsOn(core)

lazy val producer = project
  .settings(
    commonSettings,
    assemblySettings,
    libraryDependencies ++= commonDependencies,
    mainClass in assembly := Some("producer.KafkaCheckinsProducer"),
    assemblyJarName in assembly := "demo_producer.jar"
  )
  .dependsOn(core)
Update: stack trace

[error] (producer / update) java.lang.IllegalArgumentException: a module is not authorized to depend on itself: demo#demo_2.11;0.1
[error] (consumer / update) java.lang.IllegalArgumentException: a module is not authorized to depend on itself: demo#demo_2.11;0.1
[error] (core / Compile / compileIncremental) Compilation failed
[error] (update) sbt.librarymanagement.ResolveException: unresolved dependency: org.apache.spark#spark-sql_2.12;2.2.0: not found
[error] unresolved dependency: org.apache.spark#spark-streaming_2.12;2.2.0: not found
[error] unresolved dependency: org.apache.spark#spark-streaming-kafka-0-10_2.12;2.2.0: not found
[error] unresolved dependency: com.databricks#spark-csv_2.12;1.5.0: not found
[error] unresolved dependency: org.elasticsearch#elasticsearch-spark-20_2.12;6.2.4: not found
[error] unresolved dependency: org.apache.spark#spark-core_2.12;2.2.0: not found
unresolved dependency: org.apache.spark#spark-sql_2.12;2.2.0

Spark 2.2.0 requires Scala 2.11, see the Spark 2.2.0 documentation. For some reason the scalaVersion from your commonSettings is not being applied, which is why sbt is looking for _2.12 artifacts. You may need to set scalaVersion globally to work around this.

Spark runs on Java 8+, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.2.0 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x).
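A minimal sketch of that workaround, assuming sbt 1.x and the build.sbt from the question: pin the Scala version at the build level so that root, core, consumer and producer all resolve _2.11 artifacts.

// at the top of build.sbt; applies to every sub-project in this build
scalaVersion in ThisBuild := "2.11.8"
// equivalent sbt 1.x slash syntax: ThisBuild / scalaVersion := "2.11.8"

With this in place, the unresolved org.apache.spark#spark-*_2.12 dependencies from the stack trace should resolve against the published _2.11 artifacts instead.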


Also, spark-sql and spark-streaming should be marked as "provided".
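A sketch of how commonDependencies might look with that change (same versions as in the question; "provided" keeps the Spark runtime jars out of the assembly, since spark-submit supplies them on the cluster):

// in build.sbt; only the Spark modules that ship with the cluster are "provided"
lazy val commonDependencies = Seq(
  "org.apache.spark" %% "spark-core"      % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"       % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"
  // spark-streaming-kafka-0-10, kafka, neo4j-spark-connector and the other
  // libraries are not part of the Spark distribution, so they stay as before
)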

Could you update the question with the compilation errors?