Cannot run jar file created from a Scala file
This is the code I have written in Scala:
object Main extends App {
  println("Hello World from Scala!")
}
This is my build.sbt.
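(The build definition itself did not survive in the post. Judging from the jar name hello-world_2.11-1.0.jar mentioned below, a minimal build.sbt would look something like the following sketch; the project name and version are inferred from the jar file name, and the exact Scala patch version is an assumption.)

```scala
// Hypothetical reconstruction, not the asker's actual file:
// name and version are inferred from hello-world_2.11-1.0.jar,
// and the Scala patch release is a guess.
name := "hello-world"

version := "1.0"

scalaVersion := "2.11.12"
```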
This is the command I used to create the jar file:
sbt package
My problem is that a jar file named hello-world_2.11-1.0.jar is created under target/scala-2.11, but I cannot run that file. It fails with a NoClassDefFoundError.
What am I doing wrong?

The error also tells you which class could not be found. Most likely you did not include scala-library.jar. If you have Scala 2.11 available on the command line, you can run it with scala target/scala-2.11/hello-world_2.11-1.0.jar, or with java -cp "scala-library.jar:target/scala-2.11/hello-world_2.11-1.0.jar" Main
(on Windows, use ; instead of :).

The process you describe is valid, except for the way the jar file is executed. From target/scala-2.11, try
scala hello-world_2.11-1.0.jar
Using sbt run, check that it also runs from the project root folder.

To run a jar file (containing Scala code) with multiple main classes, use the following:
scala -cp "<jar-file>.jar;<other-dependencies>.jar" com.xyz.abc.TestApp
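As a hypothetical illustration, such a TestApp could be as simple as the following sketch (TestApp and com.xyz.abc are placeholder names taken from the command above, not code from the question; the package clause is omitted here to keep the snippet self-contained):

```scala
// In a real project this object would be declared in package com.xyz.abc.
object TestApp {
  // Kept as a separate value so the message is easy to reuse and test.
  def greeting: String = "Hello from TestApp"

  // An explicit main method (rather than extending App) is what the
  // launcher looks for when identifying a main class.
  def main(args: Array[String]): Unit =
    println(greeting)
}
```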
This command takes care of including scala-library.jar among the dependencies, and if TestApp has a def main(args: Array[String]) method, it is also identified as a main class. Note that multiple jar files should be separated by a semicolon (;).

We can use sbt assembly to package and run the application.
First, add the plugin to project/plugins.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.9")
A sample build.sbt looks like this:
name := "coursera"
version := "0.1"
scalaVersion := "2.12.10"
mainClass := Some("Main")
val sparkVersion = "3.0.0-preview2"
val playVersion = "2.8.1"
val jacksonVersion = "2.10.1"
libraryDependencies ++= Seq(
  "org.scala-lang" % "scala-library" % scalaVersion.value,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "com.typesafe.play" %% "play-json" % playVersion,
  // https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-10
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion,
  // https://mvnrepository.com/artifact/org.mongodb/casbah
  "org.mongodb" %% "casbah" % "3.1.1" pomOnly(),
  // https://mvnrepository.com/artifact/com.typesafe/config
  "com.typesafe" % "config" % "1.2.1"
)
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
From the console, we can run sbt assembly, and the jar file is created under the target/scala-2.12/ path.

sbt assembly will create a fat jar. Here is an excerpt from the sbt-assembly documentation:

sbt-assembly is an sbt plugin originally ported from codahale's assembly-sbt, which I'm guessing was inspired by Maven's assembly plugin. The goal is simple: Create a fat JAR of your project with all of its dependencies.
I have copied scala-library.jar into the root folder, so I tried running the command java -cp scala-library.jar -jar target/scala-2.11/hello-world_2.11-1.0.jar. I got the same error.

Sorry, it should be java -cp "scala-library.jar:target/scala-2.11/hello-world_2.11-1.0.jar" Main (again, on Windows use ; instead of :).

Is it possible to include the scala-library.jar file into the hello-world jar file? Or should I ask a new question for that?