
Hadoop Pig throwing incompatible type error


I am using the code below to generate a sessionId in Pig with the Sessionize UDF from DataFu:

SET mapred.min.split.size 1073741824;
SET mapred.job.queue.name 'marathon';
SET mapred.output.compress true;
--SET avro.output.codec snappy;
--SET pig.maxCombinedSplitSize 536870912;



page_view_pre = LOAD '/data/tracking/PageViewEvent/' USING LiAvroStorage('date.range','start.date=20150226;end.date=20150226;error.on.missing=true');  -- logic is currently hard-coded to 2015-02-26; will later be replaced with date parameters
p_key = LOAD '/projects/dwh/dwh_dim/dim_page_key/#LATEST' USING LiAvroStorage();


page_view_pre = FILTER page_view_pre BY (requestHeader.userAgent != 'CRAWLER' AND requestHeader.browserId != 'CRAWLER') AND NOT IsTestMemberId(header.memberId);


page_view_pre = FOREACH page_view_pre GENERATE
    (int)(header.memberId < 0 ? -9 : header.memberId)                         as member_sk,
    (chararray) requestHeader.browserId                                       as browserId,
    --(chararray) requestHeader.sessionId                                     as sessionId,
    (chararray) UnixToISO(header.time)                                        as pageViewTime,
    header.time                                                               as pv_time,
    (chararray) requestHeader.path                                            as path,
    (chararray) requestHeader.referer                                         as referer,
    (chararray) epochToFormat(header.time, 'yyyyMMdd', 'America/Los_Angeles') as tracking_date,
    (chararray) requestHeader.pageKey                                         as pageKey,
    (chararray) SUBSTRING(requestHeader.trackingCode, 0, 500)                 as trackingCode,
    FLATTEN(botLookup(requestHeader.userAgent, requestHeader.browserId))      as (is_crawler, crawler_type),
    (int) totalTime                                                           as totalTime,
    ((int) totalTime < 20 ? 1 : 0)                                            as bounce_flag;

page_view_pre = FILTER page_view_pre BY is_crawler == 'N' ;

p_key = FILTER p_key BY is_aggregate == 1;

page_view_agg = JOIN page_view_pre BY pageKey, p_key BY page_key;

page_view_agg = FOREACH page_view_agg GENERATE
                 (chararray)page_view_pre::member_sk as member_sk,
                 (chararray)page_view_pre::browserId as browserId,
                --page_view_pre::sessionId as sessionId,
                (chararray)page_view_pre::pageViewTime as pageViewTime,
                (long)page_view_pre::pv_time as pv_time,
                (chararray)page_view_pre::tracking_date as tracking_date,
                (chararray)page_view_pre::path as path,
                (chararray)page_view_pre::referer as referer,
                (chararray)page_view_pre::pageKey as pageKey,
                (int)p_key::page_key_sk as page_key_sk,
                (chararray)page_view_pre::trackingCode as trackingCode,
                (int)page_view_pre::totalTime as totalTime,
                (int)page_view_pre::bounce_flag as bounce_flag;

page_view_agg = FILTER page_view_agg BY (member_sk IS NOT NULL) OR (browserId IS NOT NULL);

pvs_by_member_browser_pair = GROUP page_view_agg BY (member_sk,browserId);



session_groups = FOREACH pvs_by_member_browser_pair {
    visits = ORDER page_view_agg BY pv_time;
    GENERATE FLATTEN(Sessionize(visits)) AS (
        pageViewTime, member_sk, pv_time, tracking_date, pageKey, page_key_sk,
        browserId, referer, path, trackingCode, totalTime, sessionId
    );
}
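For context on where the type error tends to come from: DataFu's Sessionize expects a bag of tuples whose first field is an ISO-8601 timestamp string, and it appends a session id as the last output field, so the AS clause after FLATTEN must name every input field, in input order, plus a trailing sessionId. In the script above, page_view_agg's first field is member_sk rather than pageViewTime, and the AS list names 12 fields where 13 (12 inputs + sessionId) would be expected, which is exactly the kind of mismatch worth checking. A minimal sketch of the expected pattern (the '30m' timeout and the shortened field list here are illustrative assumptions, not taken from the question):

```
-- Illustrative sketch only; 'events' stands in for a relation like page_view_agg.
DEFINE Sessionize datafu.pig.sessions.Sessionize('30m');

grouped = GROUP events BY (member_sk, browserId);
sessions = FOREACH grouped {
    ordered = ORDER events BY pv_time;
    -- The first field of each tuple in the bag must be the ISO-8601 timestamp,
    -- and the AS list must be: all input fields, in order, then sessionId.
    GENERATE FLATTEN(Sessionize(ordered))
        AS (pageViewTime:chararray, member_sk:chararray, pv_time:long,
            sessionId:chararray);
}
```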
This is a classic case of a schema mismatch:

page_view_pre = LOAD '/data/tracking/PageViewEvent/' USING LiAvroStorage('date.range','start.date=20150226;end.date=20150226;error.on.missing=true');  -- logic is currently hard-coded to 2015-02-26; will later be replaced with date parameters
Just add `DESCRIBE page_view_pre;` right after this line and you will see the schema.
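Concretely, the suggestion is to print the schema Pig has inferred right after the LOAD, so that each later cast and the AS clause after FLATTEN(Sessionize(...)) can be checked against the actual field names and types; a minimal sketch:

```
page_view_pre = LOAD '/data/tracking/PageViewEvent/'
    USING LiAvroStorage('date.range','start.date=20150226;end.date=20150226;error.on.missing=true');

-- Prints the Avro-derived schema (field names and types) to the console.
DESCRIBE page_view_pre;
```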