How to spark-submit with the main class in a jar?


There are a lot of questions about ClassNotFoundException, but I have yet to see any (so far) that fit this specific case. I am trying to run the following command:

spark-submit --master local[*] --class com.stronghold.HelloWorld scala-ts.jar

It throws the following exception:

\u@\h:\w$ spark_submit --class com.stronghold.HelloWorld scala-ts.jar
2018-05-06 19:52:33 WARN  Utils:66 - Your hostname, asusTax resolves to a loopback address: 127.0.1.1; using 192.168.1.184 instead (on interface p1p1)                               
2018-05-06 19:52:33 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address                                                                                       
2018-05-06 19:52:33 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable                                
java.lang.ClassNotFoundException: com.stronghold.HelloWorld                               
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)                     
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)                          
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)                          
        at java.lang.Class.forName0(Native Method)                                        
        at java.lang.Class.forName(Class.java:348)                                        
        at org.apache.spark.util.Utils$.classForName(Utils.scala:235)                     
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:836)                                                                  
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)        
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)             
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)               
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)                    
2018-05-06 19:52:34 INFO  ShutdownHookManager:54 - Shutdown hook called                   
2018-05-06 19:52:34 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-e8a77988-d30c-4e96-81fe-bcaf5d565c75
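For context, the class it cannot find is a plain entry point; judging by the delayedInit class in the jar listing below, it extends App. A minimal sketch (the real body is assumed):

package com.stronghold

// hypothetical stand-in for the real entry point; the actual body does more
object HelloWorld extends App {
  println("Hello, world")
}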
Yet the jar clearly contains this class:

1     " zip.vim version v28                                                                                                                                                                                                                                                                                                                                               
    1 " Browsing zipfile /home/[USER]/projects/scala_ts/out/artifacts/TimeSeriesFilter_jar/scala-ts.jar
    2 " Select a file with cursor and press ENTER
    3  
    4 META-INF/MANIFEST.MF
    5 com/
    6 com/stronghold/
    7 com/stronghold/HelloWorld$.class
    8 com/stronghold/TimeSeriesFilter$.class
    9 com/stronghold/DataSource.class
   10 com/stronghold/TimeSeriesFilter.class
   11 com/stronghold/HelloWorld.class
   12 com/stronghold/scratch.sc
   13 com/stronghold/HelloWorld$delayedInit$body.class
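The same listing can be double-checked outside vim with the standard jar tool (run from the directory containing the jar):

# list the jar's table of contents and filter for the missing class
jar tf scala-ts.jar | grep HelloWorld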
Normally, the sticking point here would be the file structure, but I am fairly sure that is correct here:

../
scala_ts/
| .git/
| .idea/
| out/
| | artifacts/
| | | TimeSeriesFilter_jar/
| | | | scala-ts.jar
| src/
| | main/
| | | scala/
| | | | com/
| | | | | stronghold/
| | | | | | DataSource.scala
| | | | | | HelloWorld.scala
| | | | | | TimeSeriesFilter.scala
| | | | | | scratch.sc
| | test/
| | | scala/
| | | | com/
| | | | | stronghold/
| | | | | | AppTest.scala
| | | | | | MySpec.scala                                                                                                                                                                                                                                                                                                                                                  
| target/
| README.md
| pom.xml
I have run other jobs with this same structure at work (hence, in a different environment). I am now trying to gain more facility with a home project, but this seems to be an early stumbling block.

In short, am I missing something obvious?

Appendix

For those interested, here is my pom:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.stronghold</groupId>
  <artifactId>scala-ts</artifactId>
  <version>1.0-SNAPSHOT</version>
  <inceptionYear>2008</inceptionYear>
  <properties>
    <scala.version>2.11.8</scala.version>
  </properties>

  <repositories>
    <repository>
      <id>scala-tools.org</id>
      <name>Scala-Tools Maven2 Repository</name>
      <url>http://scala-tools.org/repo-releases</url>
    </repository>
  </repositories>

  <pluginRepositories>
    <pluginRepository>
      <id>scala-tools.org</id>
      <name>Scala-Tools Maven2 Repository</name>
      <url>http://scala-tools.org/repo-releases</url>
    </pluginRepository>
  </pluginRepositories>

  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.11.8</version>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.9</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.scala-tools.testing</groupId>
      <artifactId>specs_2.10</artifactId>
      <version>1.6.9</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.2.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.11</artifactId>
      <version>2.2.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-catalyst_2.11</artifactId>
      <version>2.2.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>2.7.3</version>
    </dependency>
  </dependencies>

  <build>
    <sourceDirectory>src/main/scala</sourceDirectory>
    <testSourceDirectory>src/test/scala</testSourceDirectory>
    <plugins>
      <plugin>
        <groupId>org.scala-tools</groupId>
        <artifactId>maven-scala-plugin</artifactId>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
              <goal>testCompile</goal>
            </goals>
          </execution>
        </executions>
        <configuration>
          <scalaVersion>${scala.version}</scalaVersion>
          <args>
            <arg>-target:jvm-1.5</arg>
          </args>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-eclipse-plugin</artifactId>
        <configuration>
          <downloadSources>true</downloadSources>
          <buildcommands>
            <buildcommand>ch.epfl.lamp.sdt.core.scalabuilder</buildcommand>
          </buildcommands>
          <additionalProjectnatures>
            <projectnature>ch.epfl.lamp.sdt.core.scalanature</projectnature>
          </additionalProjectnatures>
          <classpathContainers>
            <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
            <classpathContainer>ch.epfl.lamp.sdt.launching.SCALA_CONTAINER</classpathContainer>
          </classpathContainers>
        </configuration>
      </plugin>
    </plugins>
  </build>
  <reporting>
    <plugins>
      <plugin>
        <groupId>org.scala-tools</groupId>
        <artifactId>maven-scala-plugin</artifactId>
        <configuration>
          <scalaVersion>${scala.version}</scalaVersion>
        </configuration>
      </plugin>
    </plugins>
  </reporting>
</project>

Update

Apologies for not being clearer. I ran the command from the same directory as the .jar (/home/[USER]/projects/scala_ts/out/artifacts/TimeSeriesFilter_jar/). So, to be clear, specifying the full path does not change the outcome.

It should also be noted that I can run HelloWorld from within IntelliJ, which uses the same class reference (com.stronghold.HelloWorld).

Looking at the pom file, the jar file you are referring to does not match the one produced by the pom file.

Try adding the maven-shade-plugin, then build > run.

Here is a reference that may help you.

Why not use the path to the jar file so that spark-submit (like any other command-line tool) can find and use it?

Given the path out/artifacts/TimeSeriesFilter_jar/scala-ts.jar, I would use the following:

spark-submit --class com.stronghold.HelloWorld out/artifacts/TimeSeriesFilter_jar/scala-ts.jar

Note that you should be in the project's home directory, which appears to be /home/[USER]/projects/scala_ts.


Also note that I removed --master local[*], since that is the default master URL spark-submit uses.

I am afraid none of these was the problem. I had previously tried deleting everything in the project and starting over, and that did not work either. As soon as I thought to start a completely different project, it worked fine. Apparently IntelliJ (of which I am a fan) had created a hidden problem somewhere.

Would you mind elaborating on why that would be useful? I have not needed an uber jar in other environments. I cleaned and packaged the jar again, but I am afraid it made no difference. As for referencing an old jar, I have only ever created one for this project. To be safe, I deleted that jar and built a new one from scratch. Unfortunately, no dice. Did you check the jar file name in the target folder of the project? Sorry, I have been swamped this week; I am just now getting back to this. The answer is yes, the jar name is correct. What do you mean by correct? Can you share the jar file name with the full path? I mean that there is no other jar. The only one I have built is ./scala_ts/out/artifacts/TimeSeriesFilter_jar/scala-ts.jar. Also, there are no jars in the target folder; when built, they live in the out folder.
1. Clean the project and package it again.
2. Verify the jar file name by going to the target folder of the project.
3. Give the exact path to that folder to point to the jar when you run the spark-submit command, as in the line below.
spark-submit --class com.stronghold.HelloWorld out/artifacts/TimeSeriesFilter_jar/scala-ts.jar
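Equivalently, assuming the layout described in the update, the absolute path works just as well:

spark-submit --class com.stronghold.HelloWorld /home/[USER]/projects/scala_ts/out/artifacts/TimeSeriesFilter_jar/scala-ts.jar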