Java Spark application fails with error: org.apache.spark.SparkException: Failed to get broadcast_1_piece0 of broadcast_1

Tags: java, apache-spark, cassandra, rdd

I want to implement a Spark Java application that performs a select with some filters on one table and joins the result with another table.

I made a demo application using RDDs, and it works perfectly in my local environment (local Spark and Cassandra).
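The question does not include the application code itself, so here is a minimal sketch of what such a demo might look like with the spark-cassandra-connector Java API. The table names (orders, customers), columns, and POJOs below are assumptions for illustration only:

// DemoJoin.java -- a hypothetical sketch, not the asker's actual code.
// Assumes two tables test.orders(customer_id, status) and test.customers(id, name).
import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapRowTo;

import java.io.Serializable;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class DemoJoin {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("App Name")
                .set("spark.cassandra.connection.host", "53.118.16.40");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Select with a filter pushed down to Cassandra, then key by the join column.
        JavaPairRDD<String, Order> orders = javaFunctions(sc)
                .cassandraTable("test", "orders", mapRowTo(Order.class))
                .where("status = ?", "OPEN")
                .keyBy((Order o) -> o.getCustomerId());

        JavaPairRDD<String, Customer> customers = javaFunctions(sc)
                .cassandraTable("test", "customers", mapRowTo(Customer.class))
                .keyBy((Customer c) -> c.getId());

        // Join the filtered table with the second table on the shared key.
        System.out.println(orders.join(customers).count());
        sc.stop();
    }

    // Minimal JavaBeans for mapRowTo(); Cassandra column names map to these properties.
    public static class Order implements Serializable {
        private String customerId;
        private String status;
        public String getCustomerId() { return customerId; }
        public void setCustomerId(String customerId) { this.customerId = customerId; }
        public String getStatus() { return status; }
        public void setStatus(String status) { this.status = status; }
    }

    public static class Customer implements Serializable {
        private String id;
        private String name;
        public String getId() { return id; }
        public void setId(String id) { this.id = id; }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }
}

The keyBy() call here is exactly the kind of transformation that, as described below, triggers the failure on the cluster.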

However, when I run the application on the Spark cluster against the remote Cassandra database, it gives me the following error:

Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 17 in stage 1.0 failed 4 times, most recent failure: Lost task 17.3 in stage 1.0 (TID 64, 53.55.75.243, executor 3): java.io.IOException: org.apache.spark.SparkException: Failed to get broadcast_1_piece0 of broadcast_1
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1333)
        at org.apache.spark.broadcast.TorrentBroadcast.readBroadcastBlock(TorrentBroadcast.scala:207)
        at org.apache.spark.broadcast.TorrentBroadcast._value$lzycompute(TorrentBroadcast.scala:66)
        at org.apache.spark.broadcast.TorrentBroadcast._value(TorrentBroadcast.scala:66)
        at org.apache.spark.broadcast.TorrentBroadcast.getValue(TorrentBroadcast.scala:96)
        at org.apache.spark.broadcast.Broadcast.value(Broadcast.scala:70)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:84)
        at org.apache.spark.scheduler.Task.run(Task.scala:121)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.SparkException: Failed to get broadcast_1_piece0 of broadcast_1
        at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$org$apache$spark$broadcast$TorrentBroadcast$$readBlocks$1.apply$mcVI$sp(TorrentBroadcast.scala:179)
        at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$org$apache$spark$broadcast$TorrentBroadcast$$readBlocks$1.apply(TorrentBroadcast.scala:151)
        at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$org$apache$spark$broadcast$TorrentBroadcast$$readBlocks$1.apply(TorrentBroadcast.scala:151)
        at scala.collection.immutable.List.foreach(List.scala:392)
        at org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$readBlocks(TorrentBroadcast.scala:151)
        at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$readBroadcastBlock$1$$anonfun$apply$2.apply(TorrentBroadcast.scala:231)
        at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$readBroadcastBlock$1.apply(TorrentBroadcast.scala:211)
        ... 13 more

Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1889)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1877)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1876)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2059)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2048)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
        at org.apache.spark.rdd.RDD.count(RDD.scala:1168)
        at org.apache.spark.api.java.AbstractJavaRDDLike.count(JavaRDDLike.scala:45)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
Caused by: java.io.IOException: org.apache.spark.SparkException: Failed to get broadcast_1_piece0 of broadcast_1
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1333)
        at org.apache.spark.broadcast.TorrentBroadcast.readBroadcastBlock(TorrentBroadcast.scala:207)
        at org.apache.spark.broadcast.TorrentBroadcast._value$lzycompute(TorrentBroadcast.scala:66)
        at org.apache.spark.broadcast.TorrentBroadcast._value(TorrentBroadcast.scala:66)
        at org.apache.spark.broadcast.TorrentBroadcast.getValue(TorrentBroadcast.scala:96)
        at org.apache.spark.broadcast.Broadcast.value(Broadcast.scala:70)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:84)
        at org.apache.spark.scheduler.Task.run(Task.scala:121)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
I'll attach the code and the configuration files here:

pom.xml

<project xmlns="http://maven.apache.org/POM/4.0.0"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <!-- groupId/artifactId were not shown in the question; inferred from the jar and main class in the spark-submit command below -->
    <groupId>com.spark.example</groupId>
    <artifactId>spark-example</artifactId>
    <version>0.0.2</version>

    <properties>
        <java.version>1.8</java.version>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
        <spark-core.version>2.4.3</spark-core.version>
        <spark-sql.version>2.4.3</spark-sql.version>
        <spark-cassandra-connector.version>2.4.1</spark-cassandra-connector.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.11</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>${spark-sql.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>${spark-core.version}</version>
        </dependency>
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector_2.11</artifactId>
            <version>${spark-cassandra-connector.version}</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <version>3.5.1</version>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
local-config.properties

# local, dev
application.profile=local
# Cassandra connection properties
cassandra.configuration.contactPoint=53.118.16.40
cassandra.configuration.keyspace=test
cassandra.configuration.port=9042
cassandra.consistency.level=ONE
cassandra.configuration.user=
cassandra.configuration.password=

spark.app.name=App Name
spark.master.url=local[2]
spark.master.port=7077
dev-config.properties

# local, dev
application.profile=dev
# Cassandra connection properties
cassandra.configuration.contactPoint=53.55.79.246
cassandra.configuration.keyspace=test
cassandra.configuration.port=9042
cassandra.consistency.level=QUORUM

spark.app.name=App Name
spark.master.url=spark://53.55.75.246:7077
spark.master.port=7077
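
For context, here is a minimal sketch of how these properties might be wired into the SparkConf. The question does not show the configuration class, so the helper name and resource path below are assumptions:

// SparkConfigLoader.java -- hypothetical helper, not the asker's actual code.
import java.io.InputStream;
import java.util.Properties;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkConfigLoader {
    public static JavaSparkContext fromProperties(String resource) throws Exception {
        Properties props = new Properties();
        try (InputStream in = SparkConfigLoader.class.getResourceAsStream(resource)) {
            props.load(in);
        }
        SparkConf conf = new SparkConf()
                .setAppName(props.getProperty("spark.app.name"))
                .setMaster(props.getProperty("spark.master.url"))
                // standard keys read by the spark-cassandra-connector:
                .set("spark.cassandra.connection.host",
                     props.getProperty("cassandra.configuration.contactPoint"))
                .set("spark.cassandra.connection.port",
                     props.getProperty("cassandra.configuration.port"))
                .set("spark.cassandra.input.consistency.level",
                     props.getProperty("cassandra.consistency.level"));
        return new JavaSparkContext(conf);
    }
}

// Usage: JavaSparkContext sc = SparkConfigLoader.fromProperties("/dev-config.properties");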
I managed to narrow down the problem. When I retrieve the data from Cassandra into a JavaRDD, everything works fine.

The problem appears when I apply a transformation to the JavaRDD that holds the data from Cassandra. In my case, I applied keyBy().

Without keyBy(), calling collect() or count() on the RDD also works fine on the Spark cluster with the remote Cassandra database.
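
To make that concrete, here is a sketch reusing sc, Order, and the static imports from the demo sketch above:

// Works on the cluster: read from Cassandra, then count.
long ok = javaFunctions(sc)
        .cassandraTable("test", "orders", mapRowTo(Order.class))
        .count();

// Fails on the cluster with "Failed to get broadcast_1_piece0 of broadcast_1":
// the same read, but with a keyBy() transformation before count().
long broken = javaFunctions(sc)
        .cassandraTable("test", "orders", mapRowTo(Order.class))
        .keyBy((Order o) -> o.getCustomerId())
        .count();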

I would really appreciate any help. I've tried everything I could find, but none of the solutions worked.

I also installed a local Cassandra database on the Spark master of the cluster, trying to reproduce the local environment from my development PC, but it gave me the same error. My guess is that this is a Spark misconfiguration.

Command used to run the application:

/spark-2.4.3-bin-hadoop2.7/bin/spark-submit \
  --class com.spark.example.Main \
  --master spark://53.55.75.246:7077 \
  --executor-memory 1G \
  --executor-cores 5 \
  --conf spark.executor.userClassPathFirst=true \
  --conf spark.driver.userClassPathFirst=true \
  --verbose \
  spark-example.jar

Comments:

"Can you share the values you set for the configuration properties? Maybe the problem is there."

"@KenrySanchez I added the properties files."

"I'll try to reproduce this behavior. I hope this gives you some ideas: take a look at this."

"@KenrySanchez I managed to get rid of the error by using JavaSparkContext instead of the plain Scala SparkContext. I can fetch the data from Cassandra and run a filter on the JavaRDD, but after I apply .mapToPair() to the filtered JavaRDD it crashes again. I think it's a Scala issue... I don't know."
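
For concreteness, the sequence described in the last comment might look like this (same assumed names as the sketches above, plus imports for org.apache.spark.api.java.JavaRDD and scala.Tuple2):

// filter() on the JavaRDD works on the cluster...
JavaRDD<Order> open = javaFunctions(sc)
        .cassandraTable("test", "orders", mapRowTo(Order.class))
        .filter((Order o) -> "OPEN".equals(o.getStatus()));

// ...but a subsequent mapToPair() crashes with the same broadcast error.
JavaPairRDD<String, Order> byCustomer =
        open.mapToPair((Order o) -> new Tuple2<>(o.getCustomerId(), o));
byCustomer.count();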