R: Repeatedly read an HTML page and save it as CSV (r, loops)


I need to repeatedly read (and process) an HTML page and save the result as CSV, more than 500 times. Can you help me write a loop? For example:

url <- "https://www.sec.gov/Archives/edgar/data/944508/000104746914007395/a2221329z424b7.htm"
write.csv(sent, "1.csv")  # `sent` is the processed result for this page

First, try storing all the URLs in a vector. After that, try this:

csv_names <- paste0(1:500, ".csv")  # "1.csv", "2.csv", ..., "500.csv"

for (i in 1:500) {
  # read and process URL[i] here, then save the result
  write.csv(URL[i], file = csv_names[i])
}
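Putting the two steps together, here is a minimal sketch of the whole pipeline. It assumes the "processing" step is simply extracting the paragraph text with the rvest package; the `urls` vector and the `p` selector are illustrative placeholders, since the question never says what `sent` actually contains:

```r
# Sketch only: assumes `urls` holds all 500 links and that processing
# means pulling paragraph text out of each page with rvest.
library(rvest)

urls <- c(
  "https://www.sec.gov/Archives/edgar/data/944508/000104746914007395/a2221329z424b7.htm"
  # ... the remaining URLs ...
)

for (i in seq_along(urls)) {
  page <- read_html(urls[i])                       # download and parse the HTML
  sent <- html_text2(html_elements(page, "p"))     # example processing: paragraph text
  write.csv(data.frame(text = sent),
            file  = paste0(i, ".csv"),
            row.names = FALSE)
}
```

Each iteration writes one file (`1.csv`, `2.csv`, ...), matching the numbering scheme in the answer above.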

Alternatively: step 1, create the vector of URLs:

url_vec<-c("url1","url2")

(What is `sent` in your code?) Step 2, create the matching CSV file names:

csv_vec <- paste0(seq_along(url_vec), ".csv")

for (n in seq_along(url_vec)) {

  [...start...]
  (process url_vec[n] to create `sent`)
  [...end...]

  write.csv(sent, csv_vec[n])
}
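With 500 downloads, a single bad URL would otherwise stop the whole loop, so it can help to wrap each iteration in `tryCatch`. This is a sketch under the same assumptions as above (`url_vec` and `csv_vec` as defined earlier, rvest text extraction standing in for the unspecified processing):

```r
# Sketch: skip and report failures instead of aborting the loop.
library(rvest)

for (n in seq_along(url_vec)) {
  tryCatch({
    page <- read_html(url_vec[n])
    sent <- html_text2(page)                 # placeholder for the real processing
    write.csv(data.frame(text = sent), csv_vec[n], row.names = FALSE)
  }, error = function(e) {
    message("Failed on URL ", n, ": ", conditionMessage(e))
  })
}
```

Failed pages are reported by index, so they can be retried afterwards without rerunning the successful ones.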