Computing the cumulative product over a 1-year window in R
I have a problem that requires me to calculate a rolling product of a series of 1-period returns. The length of the rolling window is variable. The aim is to obtain the rolling product of the 1-period returns over a window that covers as close to a 12-month span as possible. I have been able to produce a working solution by brute force with a for loop and if statements, but I am wondering whether there is an elegant solution. I have spent a lot of time trying rollapply and other similar functions, but have not found a solution.

The data below illustrates the problem:
date rt_1_period rt_12_mth_window
1 04-04-13 NA NA
2 10-04-13 0.729096362 NA
3 24-05-13 1.002535647 NA
4 30-05-13 0.993675716 NA
5 21-07-13 1.002662843 NA
6 03-08-13 1.009516582 NA
7 01-09-13 0.963099395 NA
8 20-10-13 1.012470278 NA
9 25-10-13 1.01308502 NA
10 03-11-13 1.005440704 NA
11 01-01-14 1.024208021 NA
12 11-01-14 0.996613924 NA
13 17-02-14 1.009811368 NA
14 24-02-14 1.008139557 NA
15 30-03-14 1.002794709 NA
16 30-04-14 0.998745849 1.042345473
17 02-05-14 1.002324076 1.044767963
18 27-06-14 0.997741026 1.046389027
19 24-08-14 1.015767546 1.050072129
20 05-09-14 1.014405005 1.106010894
21 02-11-14 1.013830296 1.09319212
22 09-11-14 1.013127219 1.101549487
23 16-11-14 1.012614177 1.115444628
24 18-01-15 0.986893629 1.078458006
25 24-01-15 1.028120919 1.108785236
26 10-04-15 0.912452762 0.991025615
27 09-08-15 1.004676152 0.981376513
28 07-01-16 1.004236123 0.934086003
29 01-04-16 1.02341302 0.94215696
In this example, the 12-month return on row 29 is calculated as the product of the 1-period returns on rows 26 through 29, because 02-04-15 (365 days before 01-04-16) falls between rows 25 and 26. On the other hand, the 12-month return on row 15 is NA, because 30-03-13 (365 days before 30-03-14) lies outside the time window of 1-period returns that I can observe.
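The 365-day arithmetic behind these two cases can be checked directly with base R date subtraction:

```r
# 365 days before 01-04-16 is 02-04-15, which falls between rows 25 and 26
as.Date("2016-04-01") - 365   # "2015-04-02"
# 365 days before 30-03-14 is 30-03-13, before the first observable return (04-04-13)
as.Date("2014-03-30") - 365   # "2013-03-30"
```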
I would be very glad if anyone could suggest a solution to this problem.
For clarity: if the supplied data does not make much sense, that is because it is a stripped-down version of a larger database that I created for demonstration purposes.

You can use xts and lubridate to simplify the date manipulation.
Data:
require(xts)
require(lubridate)
DF = read.csv(text="
date,rt_1_period,rt_12_mth_window
04-04-13, ,
10-04-13,0.729096362,
24-05-13,1.002535647,
30-05-13,0.993675716,
21-07-13,1.002662843,
03-08-13,1.009516582,
01-09-13,0.963099395,
20-10-13,1.012470278,
25-10-13,1.01308502 ,
03-11-13,1.005440704,
01-01-14,1.024208021,
11-01-14,0.996613924,
17-02-14,1.009811368,
24-02-14,1.008139557,
30-03-14,1.002794709,
30-04-14,0.998745849,1.042345473
02-05-14,1.002324076,1.044767963
27-06-14,0.997741026,1.046389027
24-08-14,1.015767546,1.050072129
05-09-14,1.014405005,1.106010894
02-11-14,1.013830296,1.09319212
09-11-14,1.013127219,1.101549487
16-11-14,1.012614177,1.115444628
18-01-15,0.986893629,1.078458006
24-01-15,1.028120919,1.108785236
10-04-15,0.912452762,0.991025615
09-08-15,1.004676152,0.981376513
07-01-16,1.004236123,0.934086003
01-04-16,1.02341302 ,0.94215696",header=TRUE,stringsAsFactors=FALSE,na.strings="")
#Convert to xts time series for ease in date manipulation
DF_xts = xts(DF[,-1],order.by = as.Date(DF[,1],format="%d-%m-%y"))
head(DF_xts)
#
# rt_1_period rt_12_mth_window
#2013-04-04 NA NA
#2013-04-10 0.729096362 NA
#2013-05-24 1.002535647 NA
#2013-05-30 0.993675716 NA
#2013-07-21 1.002662843 NA
#2013-08-03 1.009516582 NA
#set lag period as 1 year
lagPeriod = 1
Cumulative 12-month product:
For each date, construct a window [prevYearDate, date], subset the returns falling in that window, compute the cumulative product, and take the last value:
rt_12_mth_window_Calc = do.call(rbind, lapply(as.Date(index(DF_xts)), function(x) {
  prevYearDate = x - years(lagPeriod)
  rt_12_mth_window_Calc = last(cumprod(DF_xts[paste0(prevYearDate, "/", x), "rt_1_period"]))
  colnames(rt_12_mth_window_Calc) = "rt_12_mth_window_Calc"
  return(rt_12_mth_window_Calc)
}))
Final dataset:
#Merge with original time series for final dataset
new_DF = merge.xts(DF_xts,rt_12_mth_window_Calc)
#Calculate difference in original and calculated 12 month returns
new_DF$delta = new_DF$rt_12_mth_window_Calc - new_DF$rt_12_mth_window
new_DF
# rt_1_period rt_12_mth_window rt_12_mth_window_Calc delta
#2013-04-04 NA NA NA NA
#2013-04-10 0.729096362 NA NA NA
#2013-05-24 1.002535647 NA NA NA
#2013-05-30 0.993675716 NA NA NA
#2013-07-21 1.002662843 NA NA NA
#2013-08-03 1.009516582 NA NA NA
#2013-09-01 0.963099395 NA NA NA
#2013-10-20 1.012470278 NA NA NA
#2013-10-25 1.013085020 NA NA NA
#2013-11-03 1.005440704 NA NA NA
#2014-01-01 1.024208021 NA NA NA
#2014-01-11 0.996613924 NA NA NA
#2014-02-17 1.009811368 NA NA NA
#2014-02-24 1.008139557 NA NA NA
#2014-03-30 1.002794709 NA NA NA
#2014-04-30 0.998745849 1.042345473 1.042345470 -2.64001643e-09
#2014-05-02 1.002324076 1.044767963 1.044767960 -2.54864396e-09
#2014-06-27 0.997741026 1.046389027 1.046389025 -1.97754613e-09
#2014-08-24 1.015767546 1.050072129 1.050072127 -1.66086833e-09
#2014-09-05 1.014405005 1.106010894 1.106010893 -1.34046041e-09
#2014-11-02 1.013830296 1.093192120 1.093192120 -6.47777387e-11
#2014-11-09 1.013127219 1.101549487 1.101549488 5.99306826e-10
#2014-11-16 1.012614177 1.115444628 1.115444628 -1.89856353e-10
#2015-01-18 0.986893629 1.078458006 1.078458005 -1.15637744e-09
#2015-01-24 1.028120919 1.108785236 1.108785235 -9.57268265e-10
#2015-04-10 0.912452762 0.991025615 0.991025613 -1.54581248e-09
#2015-08-09 1.004676152 0.981376513 0.996850412 1.54738992e-02
#2016-01-07 1.004236123 0.934086003 0.934086002 -9.15302278e-10
#2016-04-01 1.023413020 0.942156960 0.942156960 -1.82048598e-10
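As a quick sanity check (not part of the original answer), the rows where the recomputed window disagrees with the reported one beyond floating-point noise can be surfaced by filtering on delta; the 1e-6 threshold is an arbitrary choice sitting between the ~1e-9 rounding noise above and the one genuine discrepancy:

```r
# assumes new_DF from the merge above; in the output shown, every delta except
# the 2015-08-09 row (~1.55e-2) is on the order of 1e-9
new_DF[!is.na(new_DF$delta) & abs(new_DF$delta) > 1e-6, ]
```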
The calculated and original values are very close for all observations except 2015-08-09, where the value deviates by 1.55%. Could you verify the calculation for that period?

Here is a solution that relies only on xts, which may be somewhat more direct:
library(xts)
x <- as.xts(read.zoo(text="date,rt_1_period,rt_12_mth_window
04-04-13, ,
10-04-13,0.729096362,
24-05-13,1.002535647,
30-05-13,0.993675716,
21-07-13,1.002662843,
03-08-13,1.009516582,
01-09-13,0.963099395,
20-10-13,1.012470278,
25-10-13,1.013085020,
03-11-13,1.005440704,
01-01-14,1.024208021,
11-01-14,0.996613924,
17-02-14,1.009811368,
24-02-14,1.008139557,
30-03-14,1.002794709,
30-04-14,0.998745849,1.042345473
02-05-14,1.002324076,1.044767963
27-06-14,0.997741026,1.046389027
24-08-14,1.015767546,1.050072129
05-09-14,1.014405005,1.106010894
02-11-14,1.013830296,1.09319212
09-11-14,1.013127219,1.101549487
16-11-14,1.012614177,1.115444628
18-01-15,0.986893629,1.078458006
24-01-15,1.028120919,1.108785236
10-04-15,0.912452762,0.991025615
09-08-15,1.004676152,0.981376513
07-01-16,1.004236123,0.934086003
01-04-16,1.023413020,0.94215696", header=TRUE, sep=",", format="%d-%m-%y"))
ix <- index(x) # index values
ixlag <- ix-365 # 1-year lag index values
x$rt_12 <- NA_real_ # initialize result column
for(i in which(ixlag > ix[1])) {
  # 1-year subset
  xyear <- window(x, start=ixlag[i], end=ix[i])
  # calculate product and update result column
  x[i,"rt_12"] <- prod(xyear[,"rt_1_period"])
}
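As an aside (a sketch, not from the answers above): if the series grows long, the same per-date products can be computed without an explicit loop by taking cumulative sums of log-returns. The column name `rt_12_fast` and the helpers below are illustrative; the sketch assumes the `x`, `ix`, and `ixlag` objects defined above:

```r
r  <- as.numeric(x$rt_1_period)
cs <- cumsum(ifelse(is.na(r), 0, log(r)))   # running sum of log-returns (leading NA treated as 0)
s  <- findInterval(ixlag - 1, ix)           # rows dated strictly before each lag date
# product over rows (s+1)..i equals exp(cs[i] - cs[s]); pmax guards against a
# zero subscript on rows that are masked to NA anyway because the full 1-year
# window does not lie inside the observed series
x$rt_12_fast <- ifelse(ixlag > ix[1], exp(cs - cs[pmax(s, 1)]), NA_real_)
```

The leading NA is safe to zero out because every valid window (ixlag > ix[1]) excludes the first row by construction.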
Many thanks for your answer! You were right about 2015-08-09: the data I used was copy-pasted from a mock-up Excel spreadsheet, and your version is correct. I have started using your code and it seems to work well. Thanks!

I just tried running the code on a real dataset, an extended version of what I posted, and got the following error in apply(coredata(x), 2, function(y) cumprod(y)): Error: dim(X) must have a positive length. Any idea what the source might be? Thanks in advance.

Could you update your question to include the function call you are making, along with dput(head(data)), where data is your input data? Only then can we reproduce the error. I tried the snippet you posted with a single change and it ran fine: replace rt_1_mth with rt_1_period. See the link on atomic vectors and apply.

You are right! Apologies for the silly mistake; that was the intention all along. I will remove the edit to the question since it is unrelated. Thanks again, and thanks for the alternative solution!
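For reference, "dim(X) must have a positive length" is the error apply raises when handed an atomic vector instead of a matrix, which is easy to trigger because single-column subsetting drops dimensions by default. A minimal reproduction:

```r
m <- matrix(1:6, ncol = 2)
apply(m, 2, cumprod)                      # fine: m has dimensions
v <- m[, 1]                               # single-column subset drops to a vector
# apply(v, 2, cumprod)                    # Error: dim(X) must have a positive length
apply(m[, 1, drop = FALSE], 2, cumprod)   # keep the matrix structure instead
```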