
Apache Spark: TransportException "Connection has been closed" when reading and writing data to Cassandra with Structured Streaming

Tags: apache-spark, cassandra, spark-structured-streaming, spark-cassandra-connector

I use Structured Streaming to read data and save it to Cassandra, with Spark Cassandra Connector 2.0.10. The application runs fine at first, but roughly every 20 hours it fails with the exception below:

java.io.IOException: Exception during execution of SELECT "is_delete", "region_code", "gateway_id", "purpose_code", "address_code" FROM "hk_ods"."frm_firealarm_gateway" WHERE token("gateway_id") > ? AND token("gateway_id") <= ?   ALLOW FILTERING: [node1-fzy/192.168.1.191:9042] Connection has been closed
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD.com$datastax$spark$connector$rdd$CassandraTableScanRDD$$fetchTokenRange(CassandraTableScanRDD.scala:350)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$17.apply(CassandraTableScanRDD.scala:367)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD$$anonfun$17.apply(CassandraTableScanRDD.scala:367)
    at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
    at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
    at com.datastax.spark.connector.util.CountingIterator.hasNext(CountingIterator.scala:12)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
    at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
    at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:395)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:234)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:228)
    at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:827)
    at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:827)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
    at org.apache.spark.scheduler.Task.run(Task.scala:108)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
    at java.lang.Thread.run(Unknown Source)
Caused by: com.datastax.driver.core.exceptions.TransportException: [node1-fzy/192.168.1.191:9042] Connection has been closed
    at com.datastax.driver.core.exceptions.TransportException.copy(TransportException.java:38)
    at com.datastax.driver.core.exceptions.TransportException.copy(TransportException.java:24)
    at com.datastax.driver.core.DriverThrowables.propagateCause(DriverThrowables.java:37)
    at com.datastax.driver.core.DefaultResultSetFuture.getUninterruptibly(DefaultResultSetFuture.java:245)
    at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:68)
    at sun.reflect.GeneratedMethodAccessor46.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:40)
    at com.sun.proxy.$Proxy13.execute(Unknown Source)
    at com.datastax.spark.connector.cql.DefaultScanner.scan(Scanner.scala:34)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD.com$datastax$spark$connector$rdd$CassandraTableScanRDD$$fetchTokenRange(CassandraTableScanRDD.scala:342)
    ... 22 more
Caused by: com.datastax.driver.core.exceptions.TransportException: [node1-fzy/192.168.1.191:9042] Connection has been closed
    at com.datastax.driver.core.Connection$ConnectionCloseFuture.force(Connection.java:1210)
    at com.datastax.driver.core.Connection$ConnectionCloseFuture.force(Connection.java:1195)
    at com.datastax.driver.core.Connection.defunct(Connection.java:445)
    at com.datastax.driver.core.Connection$Dispatcher.exceptionCaught(Connection.java:1128)
    at io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:281)
    at io.netty.channel.AbstractChannelHandlerContext.notifyHandlerException(AbstractChannelHandlerContext.java:844)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:359)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
    at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    ... 1 more
It says one node has closed the connection, but every node is up and has not been shut down. After a restart the job runs for another 18 to 20 hours and then hits the same exception again.
Can anyone help? Many thanks.
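
The job code isn't shown, so here is a minimal sketch of the setup described above, assuming a Spark 2.2 / Spark Cassandra Connector 2.0.x stream-static join between a Kafka feed and the hk_ods.frm_firealarm_gateway table from the stack trace; the broker address, topic name, and the assumption that the Kafka message value carries the gateway id are all hypothetical:

    import org.apache.spark.sql.SparkSession

    object StreamToCassandraSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("firealarm-stream")
          .config("spark.cassandra.connection.host", "192.168.1.191")
          .getOrCreate()

        // Static read of the dimension table named in the stack trace. In a
        // stream-static join, every micro-batch re-scans it with the
        // token-range queries seen above (WHERE token("gateway_id") > ? AND
        // token("gateway_id") <= ?), so one dropped connection on any node
        // fails the whole batch.
        val gateways = spark.read
          .format("org.apache.spark.sql.cassandra")
          .options(Map("keyspace" -> "hk_ods", "table" -> "frm_firealarm_gateway"))
          .load()

        // Streaming source (hypothetical broker/topic; requires the
        // spark-sql-kafka package on the classpath).
        val events = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "kafka1:9092")
          .option("subscribe", "alarm-events")
          .load()
          .selectExpr("CAST(value AS STRING) AS gateway_id")

        // Stream-static inner join, as discussed in the comments below.
        val enriched = events.join(gateways, Seq("gateway_id"))

        // The real job writes back to Cassandra (e.g. via a ForeachWriter on
        // Spark 2.2); a console sink keeps this sketch self-contained.
        enriched.writeStream
          .format("console")
          .start()
          .awaitTermination()
      }
    }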

Comments:

What is the Spark version? Try reading the data with consistency level LOCAL_QUORUM, or at least TWO; that could help reduce the load on the servers. Also, queries like this do not perform well with Structured Streaming.

Spark 2.2. With the default read consistency level of LOCAL_QUORUM we sometimes got read exceptions, so I changed the level to ONE. Our application has many joins to perform, and the raw data from Kafka contains only a few key fields. If we don't use Structured Streaming to join the static tables with the stream, I suppose we would have to cache the dimension data? If so, we would have to cache all the dimension data in advance, but the requirements keep changing over time, so I can't think of a better way to do this.

Your best option is to upgrade to Spark 2.4 so that you can use Spark Cassandra Connector 2.5, which includes the so-called Direct Join: when performing a join, it pulls only the specific rows from Cassandra.

Bye & thanks! I will check the new version and the documentation to see how to improve it.
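
To make the two suggestions above concrete, here is a rough sketch, assuming Spark 2.4 with Spark Cassandra Connector 2.5 on the classpath; the rate-source stream is a hypothetical stand-in for the real Kafka input:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.col

    object DirectJoinSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("direct-join-sketch")
          // Registers the connector's Catalyst rules, which enable the
          // Direct Join optimization mentioned above (SCC 2.5 on Spark 2.4).
          .config("spark.sql.extensions",
            "com.datastax.spark.connector.CassandraSparkExtensions")
          .config("spark.cassandra.connection.host", "192.168.1.191")
          // The read consistency level the asker settled on.
          .config("spark.cassandra.input.consistency.level", "ONE")
          .getOrCreate()

        val gateways = spark.read
          .format("org.apache.spark.sql.cassandra")
          .options(Map("keyspace" -> "hk_ods", "table" -> "frm_firealarm_gateway"))
          .load()

        // Stand-in key stream; the real job would read these ids from Kafka.
        val keys = spark.readStream
          .format("rate")
          .load()
          .select(col("value").cast("string").as("gateway_id"))

        // Because "gateway_id" is the table's partition key (see the token()
        // predicates in the trace), the connector can plan this as a Direct
        // Join that looks up only the matching rows instead of scanning
        // every token range.
        val enriched = keys.join(gateways, Seq("gateway_id"))

        enriched.writeStream.format("console").start().awaitTermination()
      }
    }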