
R: merging a list of lists of matrices


Sorry for asking what may seem like a very basic question about something I am trying to do in R.

My data is a list of lists:

dataset 
$Series1
date      Value
2015-11-01 1.301
2015-11-02 6.016
2015-11-03 4.871
2015-11-04 10.925
2015-11-05 7.638

$Series2
date      Value
2015-11-01 1.532
2015-11-02 3.730
2015-11-03 6.910
2015-11-04 3.554
2015-11-05 2.631
Any ideas on how to transform this into the following?

datamatrix
date      Series1 Series2
2015-11-01 1.301  1.532
2015-11-02 6.016  3.730
2015-11-03 4.871  6.910
2015-11-04 10.925 3.554
2015-11-05 7.638  2.631

You can use Reduce to merge the time series:

dataset = list(Series1=read.table(text="
date      Value
2015-11-01 1.301
2015-11-02 6.016
2015-11-03 4.871
2015-11-04 10.925
2015-11-05 7.638",header=TRUE,stringsAsFactors=FALSE),Series2=read.table(text="
date      Value
2015-11-01 1.532
2015-11-02 3.730
2015-11-03 6.910
2015-11-04 3.554
2015-11-05 2.631",header=TRUE,stringsAsFactors=FALSE))

# merge two data frames on their shared "date" column
mergeFun = function(x, y) merge(x, y, by = "date")

# fold the pairwise merge over the whole list, then relabel the columns
datamatrix = Reduce(mergeFun, dataset)
colnames(datamatrix) = c("date", names(dataset))

datamatrix
#       date Series1 Series2
# 2015-11-01   1.301   1.532
# 2015-11-02   6.016   3.730
# 2015-11-03   4.871   6.910
# 2015-11-04  10.925   3.554
# 2015-11-05   7.638   2.631
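When the list holds more than two series, merge() warns that the column names "Value.x" and "Value.y" are duplicated in the result (see the comment thread at the end). A minimal sketch, not part of the original answer, that avoids this by giving each Value column its series name before the Reduce step:

# rename each Value column to its series name, then merge on "date";
# the merged result then has no duplicated column names to warn about
renamed = Map(function(d, nm) setNames(d, c("date", nm)), dataset, names(dataset))
datamatrix = Reduce(function(x, y) merge(x, y, by = "date"), renamed)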
Using xts for multiple series

I added additional series to the dataset; using xts you can do the following:

require(xts)

dataset = list(Series1=read.table(text="
date      Value
2015-11-01 1.301
2015-11-02 6.016
2015-11-03 4.871
2015-11-04 10.925
2015-11-05 7.638",header=TRUE,stringsAsFactors=FALSE),Series2=read.table(text="
date      Value
2015-11-01 1.532
2015-11-02 3.730
2015-11-03 6.910
2015-11-04 3.554
2015-11-05 2.631",header=TRUE,stringsAsFactors=FALSE),Series3=read.table(text="
date      Value
2015-11-01 1
2015-11-02 3
2015-11-03 6
2015-11-04 3
2015-11-05 2",header=TRUE,stringsAsFactors=FALSE),Series4=read.table(text="
date      Value
2015-11-01 1.1
2015-11-02 3.2
2015-11-03 6.3
2015-11-04 3.4
2015-11-05 2.5",header=TRUE,stringsAsFactors=FALSE))


# convert each data frame to an xts object indexed by its date column,
# keeping the name of the value column
datasetXTS = lapply(dataset, function(x) {
  z = xts(x[,-1], order.by = as.Date(x[,1], format = "%Y-%m-%d"))
  colnames(z) = tail(colnames(x), 1)
  z
})

datamatrix = Reduce(merge,datasetXTS)

datamatrix
#            Value Value.1 Value.2 Value.3
#2015-11-01  1.301   1.532       1     1.1
#2015-11-02  6.016   3.730       3     3.2
#2015-11-03  4.871   6.910       6     6.3
#2015-11-04 10.925   3.554       3     3.4
#2015-11-05  7.638   2.631       2     2.5
The series are merged correctly, but because every series uses the same column name, the names end up duplicated (Value, Value.1, ...). To fix this:

colnames(datamatrix) = names(dataset)

datamatrix
#           Series1 Series2 Series3 Series4
#2015-11-01   1.301   1.532       1     1.1
#2015-11-02   6.016   3.730       3     3.2
#2015-11-03   4.871   6.910       6     6.3
#2015-11-04  10.925   3.554       3     3.4
#2015-11-05   7.638   2.631       2     2.5
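If you also want the merged xts object back as a plain data frame with an explicit date column, matching the datamatrix layout asked for in the question, one possible conversion (a sketch, not part of the original answer) is:

# index() returns the Date index and coredata() the numeric values;
# binding them gives a data frame with columns date, Series1, ..., Series4
datamatrix_df = data.frame(date = index(datamatrix), coredata(datamatrix))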

Are you sure this is a list?
Yes, but does that matter?
Thanks for your reply. However, if there are more than 3 series in the list, this approach doesn't work?
It works for any number of series in the list, as long as they all share the same common column.
Warning message: In merge.data.frame(x, y, by = "date") : column names 'Value.x', 'Value.y' are duplicated in the result. That is the error I get when there are more than 3 series.
Please let me know whether the new solution works for you, thanks a lot.