
Database 'default' not found

Tags: r, apache-spark, google-cloud-platform, sparklyr

I am getting this intermittent error: "org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'default' not found;". I run the script in RStudio and hit the error at random; if I check the Spark UI, it does not show any failures. If I simply run the code again, it executes successfully. Basically, the same code sometimes throws this error and sometimes runs fine, and I would like to know why the 'default' database is sometimes found and sometimes not. Most of the time the error occurs during a compute(), as in the following code:

library(sparklyr)
library(dplyr)

# `sc` is an existing Spark connection; "baseData" is a table already
# registered in the Spark session.
baseData <- tbl(sc, "baseData")

# Keep only the first spin of each session and materialize the derived
# variables in Spark as "initial_variables".
baseData %>%
  filter(spinNo == 1) %>%
  transmute(
    sessionIdNew,
    initial = initial,
    initial2 = MM,
    initiabet = tbet,
    initial_bet_to_bank_ratio = totalbet / bank,
    initial_balance = bank + moneyMeter,
    initial_cumCoinIn = cumCoinInNew,
    sessionFirstCoinIn = firstCoinIn
  ) %>%
  compute("initial_variables")
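
For reference, `sc` above is an existing Spark connection. It is created along the lines of the sketch below (a simplified sketch, not the exact configuration; the master and config values are assumptions based on the cluster described at the end of the question):

library(sparklyr)

# Simplified, assumed connection setup: on a Google Cloud Dataproc-style
# cluster, Spark normally runs on YARN.
config <- spark_config()
sc <- spark_connect(master = "yarn", config = config)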
Here is the full error:

Error: org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'default' not found;
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.org$apache$spark$sql$catalyst$catalog$SessionCatalog$$requireDbExists(SessionCatalog.scala:178)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.listTables(SessionCatalog.scala:764)
at org.apache.spark.sql.execution.command.ShowTablesCommand$$anonfun$15.apply(tables.scala:791)
at org.apache.spark.sql.execution.command.ShowTablesCommand$$anonfun$15.apply(tables.scala:791)
at scala.Option.map(Option.scala:146)
at org.apache.spark.sql.execution.command.ShowTablesCommand.run(tables.scala:791)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:194)
at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:194)
at org.apache.spark.sql.Dataset$$anonfun$52.apply(Dataset.scala:3370)
at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:80)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:127)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:75)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3369)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:194)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:79)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sparklyr.Invoke.invoke(invoke.scala:147)
at sparklyr.StreamHandler.handleMethodCall(stream.scala:136)
at sparklyr.StreamHandler.read(stream.scala:61)
at sparklyr.BackendHandler$$anonfun$channelRead0$1.apply$mcV$sp(handler.scala:58)
at scala.util.control.Breaks.breakable(Breaks.scala:38)
at sparklyr.BackendHandler.channelRead0(handler.scala:38)
at sparklyr.BackendHandler.channelRead0(handler.scala:14)
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:374)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:360)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:352)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:374)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:360)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:352)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:328)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:302)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:374)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:360)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:352)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1422)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:374)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:360)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:931)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:700)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:635)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:552)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:514)
at io.netty.util.concurrent.SingleThreadEventExecutor$6.run(SingleThreadEventExecutor.java:1044)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:748)
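
Reading the trace, the exception is raised while a SHOW TABLES statement is executed (ShowTablesCommand -> SessionCatalog.requireDbExists), i.e. listing tables in the 'default' database fails. When the error occurs, the session's view of the catalog can be inspected directly; a sketch using sparklyr's DBI support:

# Diagnostic sketch: check what the session's catalog actually sees at
# the moment of failure. sparklyr connections implement the DBI interface.
DBI::dbGetQuery(sc, "SHOW DATABASES")
DBI::dbGetQuery(sc, "SELECT current_database()")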
Note: I am using Google Cloud Platform, with Debian 9, Hadoop 2.9, Spark 2.3, RSTUDIO_SERVER_VERSION=1.2.5019, sparklyr 1.
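
Since an immediate re-run of the identical code succeeds, a retry wrapper is what I am considering as a stopgap while looking for the root cause. A minimal sketch (the `with_retry` helper below is hypothetical, not part of sparklyr):

# Hypothetical stopgap: retry a Spark action a few times. This works
# around the intermittent NoSuchDatabaseException; it does not fix it.
with_retry <- function(f, tries = 3, wait_sec = 5) {
  for (i in seq_len(tries)) {
    result <- tryCatch(f(), error = function(e) e)
    if (!inherits(result, "error")) return(result)
    message("Attempt ", i, " failed: ", conditionMessage(result))
    Sys.sleep(wait_sec)
  }
  stop(result)
}

initial_variables <- with_retry(function() {
  baseData %>%
    filter(spinNo == 1) %>%
    compute("initial_variables")
})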