Macro and variable problems in Stata when reading multiple CSV files one by one


Here is some reproducible data from one of the CSV files:

* Example generated by -dataex-. To install: ssc install dataex
clear
input str27 eventname str10(eventdate scrapedate) byte part float(thpercentile median v7 mean) str5 timestamp int seatcount str19 scrapedatetime
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-15" 1        .       .        .         . "07:59"    0 "2015-12-15 07:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-15" 2        .       .        .         . "16:00"    0 "2015-12-15 16:00:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-15" 3    99.97   132.5   183.85 170.42963 "23:59" 1534 "2015-12-15 23:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-16" 1      100   132.5   185.25 170.95053 "07:59" 1528 "2015-12-16 07:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-16" 2  99.8725   132.5 185.6125  170.8983 "16:00" 1523 "2015-12-16 16:00:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-16" 3    99.61 132.925   183.85 170.56766 "23:59" 1493 "2015-12-16 23:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-17" 1    98.44   132.5   183.85   170.193 "07:59" 1490 "2015-12-17 07:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-17" 2      100  133.54 185.1425 171.12013 "16:00" 1465 "2015-12-17 16:00:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-17" 3    99.61   132.5   183.85  170.4387 "23:59" 1463 "2015-12-17 23:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-18" 1      100   132.5   183.85   170.051 "07:59" 1438 "2015-12-18 07:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-18" 2    98.44 132.925   183.85 170.05144 "16:00" 1427 "2015-12-18 16:00:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-18" 3   101.95  134.27   188.86 170.95193 "23:59" 1376 "2015-12-18 23:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-19" 1   101.95  133.95   188.75 171.24626 "07:59" 1366 "2015-12-19 07:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-19" 2   101.95  133.95   188.39 171.50464 "16:00" 1360 "2015-12-19 16:00:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-19" 3  105.355  139.39    189.7  173.4393 "23:59" 1320 "2015-12-19 23:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-20" 1   105.46  139.39   190.55  173.8773 "07:59" 1308 "2015-12-20 07:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-20" 2   105.46  139.39   190.79  174.0365 "16:00" 1290 "2015-12-20 16:00:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-20" 3   104.88  139.39   191.53  175.8205 "23:59" 1244 "2015-12-20 23:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-21" 1   105.17  138.22 191.7025 175.54225 "07:59" 1227 "2015-12-21 07:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-21" 2   105.68  139.39    189.7 175.63374 "16:00" 1213 "2015-12-21 16:00:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-21" 3   103.27 133.445    189.7 175.23582 "23:59" 1174 "2015-12-21 23:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-22" 1   106.09  135.77  197.695 177.64076 "07:59" 1161 "2015-12-22 07:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-22" 2   106.66 136.465 198.0175  178.2966 "16:00" 1155 "2015-12-22 16:00:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-22" 3   107.67  138.92  190.615   172.865 "23:59" 1214 "2015-12-22 23:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-23" 1    107.8  138.92 195.8425 174.13286 "07:59" 1190 "2015-12-23 07:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-23" 2    107.8  137.05   193.54  174.4463 "16:00" 1161 "2015-12-23 16:00:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-23" 3   112.48 139.025   195.55  175.9974 "23:59" 1118 "2015-12-23 23:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-24" 1   113.32   142.9  197.235  178.3136 "07:59" 1076 "2015-12-24 07:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-24" 2   113.65   142.9 202.8625  180.5185 "16:00" 1041 "2015-12-24 16:00:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-24" 3   113.65   142.9   204.25 181.71426 "23:59"  984 "2015-12-24 23:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-25" 1   117.13  146.46   207.25  184.9154 "07:59"  951 "2015-12-25 07:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-25" 2   118.33  147.58   207.25  187.8157 "16:00"  925 "2015-12-25 16:00:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-25" 3    119.5  148.75 220.0125 191.25423 "23:59"  854 "2015-12-25 23:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-26" 1    119.5  148.75   220.19  192.5282 "07:59"  826 "2015-12-26 07:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-26" 2    119.5 149.045 223.9225  194.0729 "16:00"  808 "2015-12-26 16:00:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-26" 3   125.24  150.89  231.555 196.03903 "23:59"  763 "2015-12-26 23:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-27" 1   125.24  149.85   222.74 189.37384 "07:59"  745 "2015-12-27 07:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-27" 2   125.24 149.045   222.74  188.5702 "16:00"  727 "2015-12-27 16:00:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-27" 3   125.24  150.21   234.16 191.70107 "23:59"  683 "2015-12-27 23:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-28" 1 123.5675   150.3 231.6875 190.37703 "07:59"  656 "2015-12-28 07:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-28" 2   124.55  152.06   230.65  189.7578 "16:00"  668 "2015-12-28 16:00:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-28" 3   125.24  153.43   230.65 188.21233 "23:59"  644 "2015-12-28 23:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-29" 1   125.35   154.6   230.65 188.78273 "07:59"  607 "2015-12-29 07:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-29" 2   128.34  158.59   236.03 194.44263 "16:00"  611 "2015-12-29 16:00:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-29" 3    123.5 157.985   226.35  192.8171 "23:59"  608 "2015-12-29 23:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-30" 1   129.55   159.8    227.5 195.97015 "07:59"  590 "2015-12-30 07:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-30" 2  135.485  164.64    227.5 198.30286 "16:00"  585 "2015-12-30 16:00:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-30" 3   129.55  158.59    220.3 191.47372 "23:59"  604 "2015-12-30 23:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-31" 1    123.5  157.38    220.3 190.71004 "07:59"  607 "2015-12-31 07:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-31" 2  126.015  158.59    220.3 190.33115 "16:00"  616 "2015-12-31 16:00:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2015-12-31" 3    123.5  154.97    208.2  178.5105 "23:59"  727 "2015-12-31 23:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2016-01-01" 1   122.29  153.75   206.99  174.5168 "07:59"  732 "2016-01-01 07:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2016-01-01" 2   122.29  152.54    205.3  172.2481 "16:00"  738 "2016-01-01 16:00:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2016-01-01" 3 113.8175 144.065 206.8725  165.0204 "23:59"  480 "2016-01-01 23:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2016-01-02" 1  112.605  138.02    208.2  164.2923 "07:59"  504 "2016-01-02 07:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2016-01-02" 2  114.575  138.02   209.09 166.25206 "16:00"  472 "2016-01-02 16:00:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2016-01-02" 3 109.7975  144.67   202.15  183.0381 "23:59"  409 "2016-01-02 23:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2016-01-03" 1   117.45  153.75   200.94   190.452 "07:59"  285 "2016-01-03 07:59:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2016-01-03" 2    111.4  153.75    196.1  188.8237 "16:00"  264 "2016-01-03 16:00:00"
"Home1 vs. Away1 on January 3rd" "2016-01-03" "2016-01-03" 3        .       .        .         . "23:59"    0 "2016-01-03 23:59:00"
end
I have multiple CSV files like this one.

I decided to write a separate block of code for each of them: import the CSV, run the code, export the graph, and then use `clear all` and `macro drop _all` to drop the variables and macros (they get re-initialized when the code is repeated for the next CSV file), re-running the same code only after a different CSV file has been imported.

The following code is for a single CSV file:

global directory "I:\Data\Useful CSVs"
global datadir "$directory\Games\GamesIndividual"
global outdir "I:\Data\figures"

/*********************************/
/*********************************/
/* Home1 vs. Away1 on January 3rd */
/*********************************/
/*********************************/

import delimited "$datadir\Home1 vs. Away1 on January 3rd", clear

/* Create a variable `eventtime` holding the date-time value parsed 
from the `scrapedatetime` column */

gen double eventtime = clock(scrapedatetime, "YMDhms")

/* Set time-series format */
tsset eventtime, format(%tcNN/DD/CCYY_HH:MM:SS)


/* The following code snippet gets the minimum and maximum raw date/time values, 
works out the interval between observations implied by the desired number of 
steps (here an interval of 12 observations), then loops over the observations 
to pick up the date/time value at every step and collects everything into a list: */

sort eventtime
summarize eventtime
local min = r(min)
local max = r(max)

local plus = _N / 5
local total = _N / `plus'

local dtlist `dtlist' `min'
local counter = 0

forvalues i = 1 / `total' {
    local counter = `counter' + `plus'
    local dtlist `dtlist' `=eventtime[`counter']'
}

local dtlist `dtlist' `max'

/* You then draw a twoway graph and overlay several connected lines.
The column variables are named differently once read into Stata:

eventtime - ScrapeDate
median - Median Price in USD
thpercentile - 25th Percentile in USD
v7 - 75th Percentile in USD
mean - Mean Price in USD */
#delimit ;
twoway 
    (connected mean eventtime, msymbol(point) mfcolor(none)) 
    (connected median eventtime, msymbol(point) mfcolor(none))  
    (connected thpercentile eventtime, msymbol(point) mfcolor(none)) 
    (connected v7 eventtime, msymbol(point) mfcolor(none)), 
    title("Home1 vs. Away1 on January 3rd") 
    ytitle(Price in USD) 
    xtitle(Scrape Date) 
    leg(off)
    xlabel(`dtlist', format(%tCDDMon))
    xline(1765785540000 1766332800000 1766908740000, lwidth(thin))
    /* 

    Setting text placeholder for odds in date 
    representing BEFORE WEEK 15: 12/14/2015 

    */ 
    text(150 1765785540000 "P(Home1)" "= 0", size(medium) place(e)) 
    /* 

    Setting text placeholder for odds in date 
    representing BEFORE WEEK 16: 12/21/2015 

    */ 
    text(150 1766332800000 "P(Home1)" "= 0", size(medium) place(e)) 
    /* 

    Setting text placeholder for odds in date
    representing BEFORE WEEK 15: 12/28/2015 

    */ 
    text(150 1766908740000 "P(Home1)" "= 0", size(medium) place(e)) 
    /* 

    Setting text placeholder for odds in date 
    representing BEFORE WEEK 15: 12/28/2015 

    */ 
    text(150 1766908740000 "P(Home1)" "= 0", size(medium) place(e)) 
    /* 

    Setting text placeholder to represent the line
    that denotes the Mean price

    */ 
    text(175 1767398340000 "Mean", size(small) color("7 46 95") place(e)) ;
    graph export "$outdir\Home1-Away1-Jan03.png", replace;
clear all;
macro drop _all;
The code is based on an existing answer and works fine.
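As an aside, the hard-coded numbers in the xline() and text() options above are Stata clock values, i.e. milliseconds since 01jan1960 00:00:00. Rather than being typed by hand, they can be generated with the tc() pseudofunction; a quick check (my own addition, not part of the original code):

display %20.0f tc(15dec2015 07:59:00)
// prints 1765785540000, the first xline() value above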

But when I append the exact same code to the same do-file for another CSV file:

/*********************************/
/*********************************/
/* Home2 vs. Away2 on January 3rd */
/*********************************/
/*********************************/

import delimited "$datadir\Home2 vs. Away2 on January 3rd", clear
The rest of the code, up to `clear all` and `macro drop _all`, mirrors the Home1 vs. Away1 on January 3rd block, so it should produce a similar graph. Instead, Stata reports:

eventtime not found
invalid syntax

I believe this has something to do with the variables being dropped, or not being read in from each CSV file; every file has the same variable names.

Eventually, I want to append 18 such identical code snippets, one per CSV file, one after another in a single do-file, doing the same thing each time and exporting the graphs to a specific outdir. This works fine for the first CSV file, but as soon as the exact same code for another CSV file is appended below the code that creates and exports the graph for the first file, the error above appears.

You need to restore the carriage-return delimiter at the end of the do-file:
clear all;
macro drop _all;
#delimit cr

Otherwise, Stata will execute the rest of the code using the semicolon delimiter.
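To make the failure mode concrete, here is a minimal sketch (mine, not part of the original answer) of what happens when the delimiter is left as a semicolon; the import line reuses the file name from the question:

/* #delimit ; stays in effect until it is explicitly reset, so the 
carriage-return-delimited lines that follow are accumulated until the 
next semicolon instead of being run one at a time. */
#delimit ;
clear all;
macro drop _all;
/* still in semicolon mode here: Stata joins the lines below into one 
garbled command, which surfaces as errors such as "eventtime not found" 
and "invalid syntax" */
import delimited "$datadir\Home2 vs. Away2 on January 3rd", clear
gen double eventtime = clock(scrapedatetime, "YMDhms")
tsset eventtime, format(%tcNN/DD/CCYY_HH:MM:SS)

Adding #delimit cr immediately after macro drop _all; restores normal line-by-line parsing before the next snippet begins.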

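Once the delimiter issue is fixed, the 18 snippets can also be collapsed into a loop instead of being pasted one after another. A minimal sketch, assuming all 18 CSVs live in $datadir and share the layout above (the `figname' naming scheme is made up for illustration):

local files : dir "$datadir" files "*.csv"

foreach f of local files {
    // forward slashes avoid the clash between \ and the `f' macro
    import delimited "$datadir/`f'", clear
    gen double eventtime = clock(scrapedatetime, "YMDhms")
    tsset eventtime, format(%tcNN/DD/CCYY_HH:MM:SS)

    // ... build `dtlist' and draw the twoway graph as above; inside a
    // loop, /// line continuations are easier than #delimit ; because
    // they leave no delimiter state to restore ...

    local figname : subinstr local f ".csv" "", all
    graph export "$outdir/`figname'.png", replace
}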