How does Hive convert a query into an abstract syntax tree?


Can anyone tell me how a Hive query is converted into an abstract syntax tree? For example: select * from orders where cust_num = 100;
How is this query converted into an AST, and how can that AST then be converted into a QB tree? Please help. Thanks in advance.

You can use the EXPLAIN command with the EXTENDED keyword. Suppose I have a table named demo with columns n1 and n2; running EXPLAIN EXTENDED on a query against it prints the abstract syntax tree followed by the stage plans:

hive> EXPLAIN EXTENDED select * from demo where n1='aaa';
OK
ABSTRACT SYNTAX TREE:
  (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME demo))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR TOK_ALLCOLREF)) (TOK_WHERE (= (TOK_TABLE_OR_COL n1) 'aaa'))))

STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Alias -> Map Operator Tree:
        demo 
          TableScan
            alias: demo
            GatherStats: false
            Filter Operator
              isSamplingPred: false
              predicate:
                  expr: (n1 = 'aaa')
                  type: boolean
              Select Operator
                expressions:
                      expr: n1
                      type: string
                      expr: n2
                      type: string
                outputColumnNames: _col0, _col1
                File Output Operator
                  compressed: false
                  GlobalTableId: 0
                  directory: hdfs://localhost:9000/tmp/hive-apache/hive_2013-06-13_19-55-21_578_6086176948010779575/-ext-10001
                  NumFilesPerFileSink: 1
                  Stats Publishing Key Prefix: hdfs://localhost:9000/tmp/hive-apache/hive_2013-06-13_19-55-21_578_6086176948010779575/-ext-10001/
                  table:
                      input format: org.apache.hadoop.mapred.TextInputFormat
                      output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                      properties:
                        columns _col0,_col1
                        columns.types string:string
                        escape.delim \
                        serialization.format 1
                  TotalFiles: 1
                  GatherStats: false
                  MultiFileSpray: false
      Needs Tagging: false
      Path -> Alias:
        hdfs://localhost:9000/user/hive/warehouse/demo [demo]
      Path -> Partition:
        hdfs://localhost:9000/user/hive/warehouse/demo 
          Partition
            base file name: demo
            input format: org.apache.hadoop.mapred.TextInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
            properties:
              bucket_count -1
              columns n1,n2
              columns.types string:string
              field.delim ,
              file.inputformat org.apache.hadoop.mapred.TextInputFormat
              file.outputformat org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              location hdfs://localhost:9000/user/hive/warehouse/demo
              name default.demo
              serialization.ddl struct demo { string n1, string n2}
              serialization.format ,
              serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              transient_lastDdlTime 1370932655
            serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              properties:
                bucket_count -1
                columns n1,n2
                columns.types string:string
                field.delim ,
                file.inputformat org.apache.hadoop.mapred.TextInputFormat
                file.outputformat org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                location hdfs://localhost:9000/user/hive/warehouse/demo
                name default.demo
                serialization.ddl struct demo { string n1, string n2}
                serialization.format ,
                serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                transient_lastDdlTime 1370932655
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: default.demo
            name: default.demo

  Stage: Stage-0
    Fetch Operator
      limit: -1


Time taken: 5.316 seconds
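
If you want to get the AST inside your own code rather than reading it off the EXPLAIN output, Hive's parser is exposed as a Java API. Below is a minimal sketch, assuming Hive's ql jar is on the classpath; the classes come from org.apache.hadoop.hive.ql.parse, and the exact method signatures can vary slightly between Hive releases, so treat it as illustrative rather than the definitive pipeline Hive runs internally.

import org.apache.hadoop.hive.ql.parse.ASTNode;
import org.apache.hadoop.hive.ql.parse.ParseDriver;
import org.apache.hadoop.hive.ql.parse.ParseException;

public class AstDemo {
    public static void main(String[] args) throws ParseException {
        ParseDriver pd = new ParseDriver();
        // Parse the query text into the ANTLR-based AST (TOK_QUERY, TOK_FROM, ...).
        ASTNode ast = pd.parse("select * from orders where cust_num = 100");
        // toStringTree() gives the one-line form shown under
        // "ABSTRACT SYNTAX TREE:" in the EXPLAIN EXTENDED output above;
        // dump() prints an indented version of the same tree.
        System.out.println(ast.toStringTree());
        System.out.println(ast.dump());
    }
}

The AST-to-QB step is done by Hive's semantic analyzer (SemanticAnalyzer in the same package), which walks the TOK_* nodes and populates a QB object before building the operator tree you see in the stage plans. Running that step standalone needs a full HiveConf and session context, so for simply inspecting the result, EXPLAIN EXTENDED is usually the quickest route.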