Cannot connect to Spark using sparklyr


I am trying to connect to Spark using the sparklyr package in R, but I get the following error:

    library(sparklyr)
    library(dplyr)
    config <- spark_config()
    config[["sparklyr.shell.conf"]] <- "spark.driver.extraJavaOptions=-XX:MaxHeapSize=4g"
    sc <- spark_connect(master = "local", version = "1.6.2")

    Error in force(code) : 
      Failed while connecting to sparklyr to port (8880) for sessionid (344): Gateway in port (8880) did not respond.
        Path: C:\Users\krispra\AppData\Local\rstudio\spark\Cache\spark-1.6.2-bin-hadoop2.6\bin\spark-submit2.cmd
        Parameters: --class, sparklyr.Backend, --jars, "C:/Users/krispra/Documents/R/R-3.3.2/library/sparklyr/java/spark-csv_2.11-1.3.0.jar","C:/Users/krispra/Documents/R/R-3.3.2/library/sparklyr/java/commons-csv-1.1.jar","C:/Users/krispra/Documents/R/R-3.3.2/library/sparklyr/java/univocity-parsers-1.5.1.jar", "C:\Users\krispra\Documents\R\R-3.3.2\library\sparklyr\java\sparklyr-1.6-2.10.jar", 8880, 344

    ---- Output Log ----
    Error occurred during initialization of VM
    Could not reserve enough space for 1048576KB object heap

    ---- Error Log ----
Any suggestions on how to fix this?

Thanks!
Rami

I ran into problems installing sparklyr before. My solution was to remove the sparklyr library, reinstall it from CRAN, and then restart RStudio.
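The reinstall described above is a few lines of base R. This is a sketch; run it in a fresh R session (with no Spark connection open), then restart RStudio so the new installation is picked up cleanly:

```r
# Remove the currently installed sparklyr package
remove.packages("sparklyr")

# Reinstall the latest release from CRAN
install.packages("sparklyr")

# After this, restart RStudio before calling spark_connect() again
```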

Are you trying to use SparkR from RStudio? One thing to check is that your standalone Spark instance or Spark cluster is actually up and running in the background. Spark's default Web UI port is 8080, and from the error it looks like Spark is not up and running.
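Also, the "Could not reserve enough space for 1048576KB object heap" line in the output log suggests the JVM could not allocate the requested driver heap (a common symptom of a 32-bit Java install or a memory-constrained Windows machine). A sketch of a workaround is to ask sparklyr for a smaller driver heap via `sparklyr.shell.driver-memory` (which maps to spark-submit's `--driver-memory`) instead of forcing 4g through `extraJavaOptions`; the `1g` value here is an assumption to adjust for your machine:

```r
library(sparklyr)

config <- spark_config()
# Request a smaller driver heap so the JVM can actually reserve it
# (1g is an illustrative value, not a recommendation from the original post)
config[["sparklyr.shell.driver-memory"]] <- "1g"

sc <- spark_connect(master = "local", version = "1.6.2", config = config)
```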