Apache Spark: WSO2 DAS Spark script fails to execute


Below is the Spark script I am trying to execute. It runs successfully on the DAS (3.0.1) batch analytics console, but it fails when saved as a script in batch analytics.

insert overwrite table CLASS_COUNT select  ((timestamp / 120000) * 120000) as time , vin , username , classType,        
sum(acceleCount) as acceleCount , sum(decceleCount) as decceleCount
from ACCELE_COUNTS
group by ((timestamp / 120000) * 120000) ,classType, vin, username;
Error:

ERROR: [1.199] failure: ``limit'' expected but identifier ACCELE_COUNTSgroup found insert overwrite table X1234_CLASS_COUNT select ((timestamp / 120000) * 120000) as time , vin , username , classType, sum(acceleCount) as acceleCount , sum(decceleCount) as decceleCountfrom ACCELE_COUNTSgroup by ((timestamp / 120000) * 120000) ,classType, vin, username ^
Before this, I executed the following without any problems:

CREATE TEMPORARY TABLE ACCELE_COUNTS 
USING CarbonAnalytics 
OPTIONS (tableName "KAMPANA_RECKLESS_COUNT_STREAM", 
     schema "timestamp LONG , vin STRING, username STRING, classType STRING, acceleCount INT,decceleCount INT");

CREATE TEMPORARY TABLE CLASS_COUNT 
USING org.wso2.carbon.analytics.spark.event.EventStreamProvider 
OPTIONS (receiverURL "tcp://localhost:7611",
     username "admin",
     password "admin",
     streamName "DAS_RECKELSS_COUNT_STREAM", 
     version "1.0.0",
     description "Events are published  when product quantity goes beyond a certain level",
     nickName "product alerts",
     payload "time LONG,vin STRING,username STRING, classType STRING, acceleCount INT, decceleCount INT"
);

This happens because there is no space between the following word pairs:

1)
decceleCount
from

2)
ACCELE_COUNTS
group by

So make sure there is a space between words, even when the second word starts on a new line.
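The failure mode can be illustrated outside DAS. The sketch below (hypothetical Python, for illustration only; DAS's actual script handling may differ) shows how joining lines without preserving a separator fuses adjacent tokens into identifiers like `decceleCountfrom` and `ACCELE_COUNTSgroup`, which is exactly what the parser error reports:

```python
# Lines of the saved script, none of which ends with a trailing space.
lines = [
    "sum(decceleCount) as decceleCount",
    "from ACCELE_COUNTS",
    "group by classType",
]

# If line breaks are dropped when the script is reassembled,
# adjacent words merge into one token.
fused = "".join(lines)
print("decceleCountfrom" in fused)    # tokens merged
print("ACCELE_COUNTSgroup" in fused)  # tokens merged

# Ending each line with a space keeps the tokens separated.
fixed = "".join(line + " " for line in lines)
print("decceleCountfrom" in fixed)    # tokens stay separate
```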
