
Scala - Cassandra: SERIAL is not supported as conditional update commit consistency

Tags: scala, apache-spark, cassandra, spark-cassandra-connector

I am writing from Spark (2.0.2) to Cassandra (3.7):
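A minimal sketch of what the write looks like, assuming a placeholder keyspace name my_ks and the someRdd / SomeObj names referenced in the comments below:

import com.datastax.spark.connector._

// someRdd: RDD[SomeObj], as described in the comments below.
// The column selection mirrors the table definition given there.
someRdd.saveToCassandra("my_ks", "personal_id",
  SomeColumns("name", "value", "id", "ts", "tag"))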

Spark Cassandra Connector (version 2.0.0-M3) consistency configuration:

spark.cassandra.output.consistency.level ONE
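This property is normally set on the SparkConf (or passed with --conf to spark-submit); a minimal sketch, where the application name and contact host are assumptions:

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("cassandra-writer")                          // assumed app name
  .set("spark.cassandra.connection.host", "127.0.0.1")     // assumed contact point
  .set("spark.cassandra.output.consistency.level", "ONE")  // the setting quoted above
val sc = new SparkContext(conf)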
It ran fine for a while, but it has now started failing with just the following exception:

ERROR writer.QueryExecutor: Failed to execute: com.datastax.spark.connector.writer.RichBoundStatement@6ad8868c
com.datastax.driver.core.exceptions.InvalidQueryException: SERIAL is not supported as conditional update commit consistency. Use ANY if you mean "make sure it is accepted but I don't care how many replicas commit it for non-SERIAL reads"
    at com.datastax.driver.core.Responses$Error.asException(Responses.java:136)
    at com.datastax.driver.core.DefaultResultSetFuture.onSet(DefaultResultSetFuture.java:179)
    at com.datastax.driver.core.RequestHandler.setFinalResult(RequestHandler.java:173)
    at com.datastax.driver.core.RequestHandler.access$2500(RequestHandler.java:43)
    at com.datastax.driver.core.RequestHandler$SpeculativeExecution.setFinalResult(RequestHandler.java:788)
    at com.datastax.driver.core.RequestHandler$SpeculativeExecution.onSet(RequestHandler.java:607)
    at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:1012)
    at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:935)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
    at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:831)
    at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:346)
    at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:254)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at java.lang.Thread.run(Thread.java:745)

Any suggestions for a workaround would be greatly appreciated. Thanks in advance.

What is in someRdd? I could imagine this not being possible for certain collections. Here is an example of it:
case class SomeObj(value: String, name: String, id: String, ts: Timestamp, tag: String, valid: Boolean) extends Serializable
Table description:
CREATE TABLE personal_id (name text, value text, id text, ts timestamp, tag text, PRIMARY KEY ((name, value), id));
OK, not sure about that; I would file a Jira with the Spark Cassandra Connector. What version of the DataStax driver are you using?
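One workaround worth sketching, as an assumption rather than a confirmed fix: pass the consistency level explicitly per write through a WriteConf instead of relying only on the Spark property (keyspace my_ks is a placeholder, as above):

import com.datastax.driver.core.ConsistencyLevel
import com.datastax.spark.connector._
import com.datastax.spark.connector.writer.WriteConf

// Explicitly set the regular write consistency for this save call,
// rather than depending on spark.cassandra.output.consistency.level.
someRdd.saveToCassandra("my_ks", "personal_id",
  SomeColumns("name", "value", "id", "ts", "tag"),
  writeConf = WriteConf(consistencyLevel = ConsistencyLevel.ONE))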