Error connecting to Spark from Jupyter with the Apache Toree SparkR kernel


I am trying to connect to Spark 2.1.0 from Jupyter using the Apache Toree SparkR kernel. The kernel loads correctly, but as soon as I execute a cell, an error appears and repeats indefinitely.

Connecting to Spark with the Scala and Python kernels works fine, and connecting to Spark with R through RStudio works perfectly.

Error log:

Loading required package: methods

Attaching package: ‘SparkR’

The following objects are masked from ‘package:stats’:

    cov, filter, lag, na.omit, predict, sd, var, window

The following objects are masked from ‘package:base’:

    as.data.frame, colnames, colnames<-, drop, endsWith, intersect,
    rank, rbind, sample, startsWith, subset, summary, transform, union
In rm(".sparkRcon", envir = .sparkREnv) : objeto '.sparkRcon' no encontrado
[1] "ExistingPort:" "43101"        
Error in value[[3L]](cond) : 
  Failed to connect JVM: Error in socketConnection(host = hostname, port = port, server = FALSE, : el argumento "timeout" está ausente, sin valor por omisión
Calls: sparkR.connect ... tryCatch -> tryCatchList -> tryCatchOne -> <Anonymous>
Ejecución interrumpida
17/05/04 11:04:12 [ERROR] o.a.t.k.i.s.SparkRProcessHandler - null process exited: 1
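For reference, the Spanish messages in the log translate to "object '.sparkRcon' not found", "the argument \"timeout\" is missing, with no default", and "execution interrupted". That last R error is the standard failure mode when a function declares a parameter without a default value and the caller never supplies it. A minimal sketch (the function `f` below is purely illustrative, not part of SparkR or Toree):

```r
# A parameter declared without a default must be supplied by the caller;
# otherwise evaluating it raises the same kind of error seen in the log.
f <- function(x, timeout) x + timeout

# Calling f() without `timeout` triggers:
#   Error: argument "timeout" is missing, with no default
msg <- tryCatch(f(1), error = function(e) conditionMessage(e))
print(msg)
```

Since the log shows the error arising inside `socketConnection(host = hostname, port = port, server = FALSE, ...)`, it suggests that whatever invokes `sparkR.connect` in the Toree SparkR launcher is not supplying the timeout value the connection helper expects.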