Scala: Caused by: java.lang.ClassNotFoundException: org.jets3t.service.ServiceException

Tags: scala, maven, amazon-web-services, apache-spark, amazon-s3

My code is supposed to access some files stored on S3. It runs fine on one machine but fails on another; specifically, it fails when executed locally from IntelliJ IDEA, as opposed to on the cluster. On the line var df = sqlContext.read.json("s3n://myPath/*.json") I get the following error:

    Caused by: java.lang.ClassNotFoundException: org.jets3t.service.ServiceException
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
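For context, a minimal, self-contained sketch of the kind of program involved (the SparkConf setup and the credential properties here are illustrative assumptions; only the read.json line is from my actual code):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object S3JsonRead {
  def main(args: Array[String]): Unit = {
    // Local execution, as in the failing IntelliJ IDEA run
    val conf = new SparkConf().setAppName("S3JsonRead").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    // s3n:// URLs are handled by the old NativeS3FileSystem, which needs jets3t at runtime
    sc.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", sys.env("AWS_ACCESS_KEY_ID"))
    sc.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", sys.env("AWS_SECRET_ACCESS_KEY"))

    // This line throws java.lang.ClassNotFoundException: org.jets3t.service.ServiceException
    var df = sqlContext.read.json("s3n://myPath/*.json")
    df.printSchema()
  }
}
```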

I have read similar posts about this issue; some of them mention that with Spark 1.6.2 the solution is to use org.apache.hadoop : hadoop-aws : 2.6.0. In my case, this did not solve the problem.

My pom.xml (excerpt):


    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>

        <java.version>1.8</java.version>
        <scala.version>2.10.6</scala.version>
        <spark.version>1.6.2</spark.version>
        <jackson.version>2.8.3</jackson.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.10</artifactId>
            <!--<scope>provided</scope>-->
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <!--<scope>provided</scope>-->
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.module</groupId>
            <artifactId>jackson-module-scala_2.10</artifactId>
            <version>${jackson.version}</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>${jackson.version}</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-annotations</artifactId>
            <version>${jackson.version}</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-core</artifactId>
            <version>${jackson.version}</version>
        </dependency>
        <dependency>
            <groupId>com.lambdaworks</groupId>
            <artifactId>jacks_2.10</artifactId>
            <version>2.3.3</version>
        </dependency>
        <dependency>
            <groupId>com.typesafe</groupId>
            <artifactId>config</artifactId>
            <version>1.3.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-aws</artifactId>
            <version>2.6.0</version>
        </dependency>
        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-s3</artifactId>
            <version>1.11.53</version>
        </dependency>
        <dependency>
            <groupId>net.debasishg</groupId>
            <artifactId>redisclient_2.10</artifactId>
            <version>3.3</version>
        </dependency>
    </dependencies>
Adding the following to your dependencies should solve the problem:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.6.0</version>
</dependency>

I hope this helps.

Actually, I just solved the problem by adding net.java.dev.jets3t jets3t 0.9.4. However, your approach works as well.
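For completeness, in pom.xml form that is (version 0.9.4 as mentioned above; whether other 0.9.x releases also work is untested):

```xml
<dependency>
    <groupId>net.java.dev.jets3t</groupId>
    <artifactId>jets3t</artifactId>
    <version>0.9.4</version>
</dependency>
```

After adding it, running mvn dependency:tree -Dincludes=net.java.dev.jets3t should show which jets3t version actually ends up on the classpath.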