Java Spark 1.5.1, Cassandra Connector 1.5.0-M2, Cassandra 2.1, Scala 2.10, NoSuchMethodError guava dependency


I am new to the Spark environment (and fairly new to Maven as well), so I am struggling with how to ship the dependencies I need correctly.

It looks like Spark 1.5.1 has a guava-14.0.1 dependency which it tries to use, while isPrimitive was added in Guava 15+. What is the right way to ensure my uber jar wins? I have tried spark.executor.extraClassPath in spark-defaults.conf, but with no success.
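
For the build-time half of this, Maven's nearest-wins rule already means a Guava version declared directly in the pom beats Spark's transitive 14.0.1; in a multi-module build, a dependencyManagement entry makes the pin explicit. A minimal sketch, assuming a standard pom (note this only controls what gets packaged, not which jar the executor's classloader sees first):

    <dependencyManagement>
        <dependencies>
            <!-- Pin every transitive Guava reference to one version at build time -->
            <dependency>
                <groupId>com.google.guava</groupId>
                <artifactId>guava</artifactId>
                <version>18.0</version>
            </dependency>
        </dependencies>
    </dependencyManagement>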

Essentially a duplicate of this [question]: but for Maven (I don't have the rep to comment yet).

Stripped my dependencies down to the following:

    <!-- Guava pinned to 18.0: the Cassandra driver calls TypeToken.isPrimitive, added in Guava 15 -->
    <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
        <version>18.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.commons</groupId>
        <artifactId>commons-compress</artifactId>
        <version>1.10</version>
    </dependency>
    <dependency>
        <groupId>com.esotericsoftware.kryo</groupId>
        <artifactId>kryo</artifactId>
        <version>2.21</version>
    </dependency>
    <dependency>
        <groupId>org.objenesis</groupId>
        <artifactId>objenesis</artifactId>
        <version>2.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.5.0</version>
        <exclusions>
            <exclusion>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
            </exclusion>
            <exclusion>
                <groupId>log4j</groupId>
                <artifactId>log4j</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>1.5.0</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.10</artifactId>
        <version>1.5.0-M2</version>
    </dependency>

Bundling all the dependencies into my jar with the following:

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.3</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <artifactSet>
                            <excludes>
                                <exclude>org.apache.hadoop:*</exclude>
                                <exclude>org.apache.hbase:*</exclude>
                            </excludes>
                        </artifactSet>
                        <filters>
                            <filter>
                                <artifact>*:*</artifact>
                                <excludes>
                                    <exclude>META-INF/*.SF</exclude>
                                    <exclude>META-INF/*.DSA</exclude>
                                    <exclude>META-INF/*.RSA</exclude>
                                </excludes>
                            </filter>
                            <filter>
                                <!-- spark-network-common bundles copies of some
                                     com.google.common.base classes; keep them out
                                     so Guava 18's versions cannot be shadowed -->
                                <artifact>org.apache.spark:spark-network-common_2.10</artifact>
                                <excludes>
                                    <exclude>com.google.common.base.*</exclude>
                                </excludes>
                            </filter>
                        </filters>
                        <transformers>
                            <transformer
                                    implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                <!-- merge multiple reference.conf files into one -->
                                <resource>reference.conf</resource>
                            </transformer>
                        </transformers>
                    </configuration>
                </execution>
            </executions>
        </plugin>
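
A way to sidestep the runtime ordering problem entirely (a sketch of an alternative, not something tried in the original post) is to relocate Guava inside the shaded jar, so that the Cassandra driver's bytecode references point at a package that Spark's own Guava 14 can never collide with. The extra element below would sit inside the <configuration> above; the shaded.guava18 prefix is an arbitrary name chosen for illustration:

                        <relocations>
                            <relocation>
                                <!-- Rewrite com.google.common.* references in the shaded classes -->
                                <pattern>com.google.common</pattern>
                                <shadedPattern>shaded.guava18.com.google.common</shadedPattern>
                            </relocation>
                        </relocations>

With a relocation in place, the com.google.common.base filter above becomes largely unnecessary, since nothing in the shaded jar resolves Spark's Guava classes any more.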

Here is the spectacular blow-up I get when I run:

./spark-submit --master local --class

    Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.reflect.TypeToken.isPrimitive()Z
        at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:142)
        at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:136)
        at com.datastax.driver.core.TypeCodec$BlobCodec.<init>(TypeCodec.java:609)
        at com.datastax.driver.core.TypeCodec$BlobCodec.<clinit>(TypeCodec.java:606)
        at com.datastax.driver.core.CodecRegistry.<clinit>(CodecRegistry.java:147)
        at com.datastax.driver.core.Configuration$Builder.build(Configuration.java:259)
        at com.datastax.driver.core.Cluster$Builder.getConfiguration(Cluster.java:1135)
        at com.datastax.driver.core.Cluster.<init>(Cluster.java:111)
        at com.datastax.driver.core.Cluster.buildFrom(Cluster.java:178)
        at com.datastax.driver.core.Cluster$Builder.build(Cluster.java:1152)
        at com.datastax.spark.connector.cql.DefaultConnectionFactory$.createCluster(CassandraConnectionFactory.scala:85)
        at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:155)
Fixed my dependency issue by including the guava jar I needed in /conf/spark-defaults.conf:

spark.driver.extraClassPath /home/osboxes/Packages/guava-18.0.jar
spark.executor.extraClassPath /home/osboxes/Packages/guava-18.0.jar
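
For completeness: Spark also has an experimental pair of settings that let the application jar's classes win over Spark's own, which would avoid hand-placing a guava jar. As the comment below shows, though, putting Guava 18 ahead of Spark's classes can surface new conflicts inside Spark itself, so treat this as a sketch rather than a drop-in fix:

spark.driver.userClassPathFirst true
spark.executor.userClassPathFirst true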

+1 thanks, but after this fix I ran into a new error: Exception in thread "main" java.lang.IllegalAccessError: tried to access method com.google.common.collect.MapMaker.softValues()Lcom/google/common/collect/MapMaker; from class org.apache.spark.SparkEnv. Have you faced this issue?

Much easier than shading the .jar.