
Java Spark runtime error: Spark.metrics.sink.MetricsServlet cannot be instantiated

I get an InvocationTargetException when running my project with the Spark 1.3 libraries from Maven inside IntelliJ.

I only hit this error inside the IntelliJ IDE. After I package the jar and run it through spark-submit, the error goes away.

Has anyone run into the same problem before? I would like to fix this so I can debug easily; otherwise I have to package the jar every time I want to run the code.

Details below:

    2015-04-21 09:39:13 ERROR MetricsSystem:75 - Sink class org.apache.spark.metrics.sink.MetricsServlet cannot be instantialized
    2015-04-21 09:39:13 ERROR TrainingSFERunner:144 - java.lang.reflect.InvocationTargetException
    2015-04-20 16:08:44 INFO  BlockManagerMaster:59 - Registered BlockManager
    Exception in thread "main" java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:187)
        at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:181)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
        at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
        at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
        at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
        at org.apache.spark.metrics.MetricsSystem.registerSinks(MetricsSystem.scala:181)
        at org.apache.spark.metrics.MetricsSystem.start(MetricsSystem.scala:98)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:390)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
        at spark.mllibClassifier.JavaRandomForests.run(JavaRandomForests.java:105)
        at spark.mllibClassifier.SparkMLlibMain.runMain(SparkMLlibMain.java:263)
        at spark.mllibClassifier.JavaRandomForests.main(JavaRandomForests.java:221)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
    Caused by: java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.module.SimpleSerializers.<init>(Ljava/util/List;)V
        at com.codahale.metrics.json.MetricsModule.setupModule(MetricsModule.java:223)
        at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:469)
        at org.apache.spark.metrics.sink.MetricsServlet.<init>(MetricsServlet.scala:45)

My pom file is as follows:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>projects</groupId>
    <artifactId>project1</artifactId>
    <version>1.0-SNAPSHOT</version>

    <dependencies>

        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-lang3</artifactId>
            <version>3.0</version>
        </dependency>


        <dependency>
            <groupId>org.apache.lucene</groupId>
            <artifactId>lucene-core</artifactId>
            <version>5.0.0</version>
        </dependency>

        <dependency>
            <groupId>org.apache.lucene</groupId>
            <artifactId>lucene-analyzers-common</artifactId>
            <version>5.0.0</version>
        </dependency>

        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.17</version>
        </dependency>

        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>

        <dependency>
            <groupId>commons-io</groupId>
            <artifactId>commons-io</artifactId>
            <version>2.1</version>
            <scope>test</scope>
        </dependency>



        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.3.0</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.10</artifactId>
            <version>1.3.0</version>
        </dependency>


        <dependency>
            <groupId>colt</groupId>
            <artifactId>colt</artifactId>
            <version>1.2.0</version>
        </dependency>


    </dependencies>

    <build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.2</version>
            <configuration>
                <source>1.7</source>
                <target>1.7</target>
            </configuration>
        </plugin>

        <plugin>
            <artifactId>maven-assembly-plugin</artifactId>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>single</goal>
                    </goals>
                </execution>
            </executions>
            <configuration>
                <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
            </configuration>
        </plugin>

    </plugins>
    </build>
</project>

Strangely, when I moved the Spark-related dependencies to the front, the error no longer appeared:

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.3.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.10</artifactId>
            <version>1.3.0</version>
        </dependency>
        <!-- ...the rest of the dependencies... -->
    </dependencies>


So the order of dependencies matters! Does anyone know why?
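
To see which Jackson versions are competing on the classpath, the dependency tree can be dumped with the standard maven-dependency-plugin goal (the includes filter here just narrows the output to the Jackson group):

    mvn dependency:tree -Dincludes=com.fasterxml.jackson.core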

I think the problem here is with the Jackson dependencies. I had a similar problem, and the cause was multiple jackson-core and jackson-databind versions on the classpath; I believe it is a Scala-related issue. In any case, add these two Jackson dependencies to the pom with a lower version and it should work. You may not find the right version on the first try; this one works for me:

    <properties>
        <jackson-core.version>2.4.4</jackson-core.version>
        <spark.version>2.3.0</spark.version>
    </properties>

    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-mllib -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-core -->
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-core</artifactId>
        <version>${jackson-core.version}</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-databind -->
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <version>${jackson-core.version}</version>
    </dependency>
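
As to why declaration order matters: Maven resolves version conflicts with a "nearest definition wins" rule, and when two candidates sit at the same depth in the dependency tree, the one declared first in the POM wins. Listing the Spark dependencies first therefore lets Spark's own Jackson version win over the one pulled in transitively by the other dependencies. A more robust fix, sketched below using the same jackson-core.version property as above, is to pin the Jackson artifacts in dependencyManagement, which overrides transitive version choices regardless of declaration order:

    <!-- Pins every transitive request for these artifacts to one version,
         independent of where the dependencies appear in the POM. -->
    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>com.fasterxml.jackson.core</groupId>
                <artifactId>jackson-core</artifactId>
                <version>${jackson-core.version}</version>
            </dependency>
            <dependency>
                <groupId>com.fasterxml.jackson.core</groupId>
                <artifactId>jackson-databind</artifactId>
                <version>${jackson-core.version}</version>
            </dependency>
        </dependencies>
    </dependencyManagement>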