
Apache Spark: how do I fix a stage failure error when copying a data frame into Spark?


I have been struggling with this for a while; each time I run it I hit a different error.

I used the CLI to copy a file larger than 4 GB to the DBFS FileStore. I want to load that CSV file from the FileStore into Spark, but I am not sure how to do it. So I read the file with R and then tried to copy it to Spark, but I get the errors shown below.
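For reference, a minimal sketch of having the cluster read the CSV straight from DBFS with sparklyr's spark_read_csv(), so the file never has to pass through the R session; the path below is a hypothetical placeholder, not my actual file:

library(sparklyr)

sc <- spark_connect(method = "databricks")

# Spark reads the file itself; nothing is loaded into local R memory first.
# "/FileStore/tables/my_file.csv" is a made-up example path.
df_tbl <- spark_read_csv(
  sc,
  name = "df_tbl",
  path = "/FileStore/tables/my_file.csv",
  header = TRUE,
  infer_schema = TRUE,
  null_value = "NA"
)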

Spark session

library(sparklyr)

# Connect to the Databricks-managed cluster
sc <- spark_connect(method = "databricks",
                    spark_home = Sys.getenv("SPARK_HOME"),
                    version = "2.4")

R version 3.6.3 (2020-02-29)
Platform: x86_64-pc-linux-gnu (64-bit)
Running under: Ubuntu 18.04.5 LTS

Matrix products: default
BLAS:   /usr/lib/x86_64-linux-gnu/openblas/libblas.so.3
LAPACK: /usr/lib/x86_64-linux-gnu/libopenblasp-r0.2.20.so

locale:
 [1] LC_CTYPE=en_US.UTF-8       LC_NUMERIC=C              
 [3] LC_TIME=en_US.UTF-8        LC_COLLATE=en_US.UTF-8    
 [5] LC_MONETARY=en_US.UTF-8    LC_MESSAGES=en_US.UTF-8   
 [7] LC_PAPER=en_US.UTF-8       LC_NAME=C                 
 [9] LC_ADDRESS=C               LC_TELEPHONE=C            
[11] LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C       

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

other attached packages:
 [1] rlang_0.4.7     sparklyr_1.3.1  forcats_0.5.0   stringr_1.4.0  
 [5] dplyr_1.0.2     purrr_0.3.4     readr_1.3.1     tidyr_1.1.2    
 [9] tibble_3.0.3    ggplot2_3.3.0   tidyverse_1.3.0

loaded via a namespace (and not attached):
 [1] httr_1.4.2         pkgload_1.0.2      jsonlite_1.7.1     modelr_0.1.6      
 [5] assertthat_0.2.1   blob_1.2.1         cellranger_1.1.0   yaml_2.2.1        
 [9] remotes_2.2.0      r2d3_0.2.3         sessioninfo_1.1.1  pillar_1.4.6      
[13] backports_1.1.9    lattice_0.20-41    glue_1.4.2         digest_0.6.25     
[17] rvest_0.3.5        colorspace_1.4-1   htmltools_0.5.0    pkgconfig_2.0.3   
[21] devtools_2.3.1     broom_0.5.6        haven_2.3.1        config_0.3        
[25] scales_1.1.0       processx_3.4.2     TeachingDemos_2.10 generics_0.0.2    
[29] usethis_1.6.0      ellipsis_0.3.1     withr_2.2.0        cli_2.0.2         
[33] magrittr_1.5       crayon_1.3.4       Rserve_1.8-7       readxl_1.3.1      
[37] memoise_1.1.0      ps_1.3.2           fs_1.4.1           fansi_0.4.1       
[41] nlme_3.1-147       xml2_1.3.2         hwriter_1.3.2      pkgbuild_1.0.6    
[45] tools_3.6.3        prettyunits_1.1.1  hms_0.5.3          lifecycle_0.2.0   
[49] munsell_0.5.0      reprex_0.3.0       callr_3.4.3        compiler_3.6.3    
[53] forge_0.2.0        grid_3.6.3         rstudioapi_0.11    htmlwidgets_1.5.1 
[57] base64enc_0.1-3    testthat_2.3.2     gtable_0.3.0       DBI_1.1.0         
[61] curl_4.3           R6_2.4.1           hwriterPlus_1.0-3  lubridate_1.7.8   
[65] rprojroot_1.3-2    desc_1.2.0         stringi_1.5.3      parallel_3.6.3    
[69] Rcpp_1.0.4.6       vctrs_0.3.4        SparkR_3.0.0       dbplyr_1.4.4      
[73] tidyselect_1.1.0  
sc

# Read the CSV into local R memory with base R
df <- read.csv(file_location, header = TRUE, na.strings = c(" ", "", "NA"))

## copy files to spark cluster
df_tbl <- copy_to(sc, df = df, name = "df_tbl")
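For completeness, copy_to() also takes optional arguments that may matter for a data frame this size; a sketch only, with the repartition value picked purely as an illustration:

# Same copy, spelled out with copy_to()'s optional arguments.
df_tbl <- copy_to(
  sc,
  df = df,
  name = "df_tbl",
  overwrite = TRUE,   # replace an existing temp view with the same name
  memory = TRUE,      # cache the resulting Spark table
  repartition = 16    # illustrative value; spreads the data over more partitions
)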