Apache Spark / Apache Calcite JDBC integration


I need to query multiple data sources in a single SQL query. I saw that Calcite provides a JDBC adapter. When running this, I faced the following challenges:

  • Execution happens in memory, which leads to OOM on large data

  • I see there is a Spark option. If this option is enabled, will the query be executed using Spark?

  • If the answer to the second question is yes: when I try to execute the query with the Spark option enabled, I get the following exception:

  • Please let me know how to overcome the OOM when executing on large data
  • Java class

    package com.sixdee.calcite;
    
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.Properties;
    import org.apache.calcite.util.Sources;
    public class MultiJDBCSchemaJoinTest {
        public static void main(String[] args) {
            Properties info = new Properties();
            info.put("model", jsonPath("model"));
            // Ask Calcite to execute the query with its Spark engine.
            info.put("spark", "true");

            String sql = "SELECT SUB_DETAILS.MSISDN FROM DB1.SUB_DETAILS";
            // try-with-resources closes the result set, statement, and
            // connection in reverse order, replacing nested finally blocks.
            try (Connection connection = DriverManager.getConnection("jdbc:calcite:", info);
                 Statement statement = connection.createStatement();
                 ResultSet resultSet = statement.executeQuery(sql)) {
                while (resultSet.next()) {
                    System.out.println(resultSet.getString(1));
                }
            } catch (Exception exception) {
                exception.printStackTrace();
            }
        }
    
        public static String jsonPath(String model) {
            return resourcePath(model + ".json");
        }
    
        public static String resourcePath(String path) {
            return Sources.of(MultiJDBCSchemaJoinTest.class.getResource("/" + path)).file().getAbsolutePath();
        }
    
    }
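Independent of the Spark option, one common mitigation for the in-memory OOM is to hint a JDBC fetch size so the driver pulls rows in batches rather than materializing the whole result set at once. Whether the hint is honored depends on the backing driver (MySQL, for instance, needs special handling for true streaming). A minimal sketch, with the batch size an arbitrary illustrative choice:

```java
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class FetchSizeSketch {

    // Arbitrary batch size, for illustration only.
    static int fetchSizeHint() {
        return 1000;
    }

    // Iterates a large result set with a fetch-size hint applied, so
    // rows are fetched in batches where the driver supports it.
    static void streamRows(Statement statement, String sql) throws SQLException {
        statement.setFetchSize(fetchSizeHint());
        try (ResultSet rs = statement.executeQuery(sql)) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}
```

This does not change how much work the query itself does; it only reduces how many rows sit in the client's memory at a time.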
    
    Model

    {
      "version": "1.0",
      "defaultSchema": "DB",
      "schemas": [ {
        "type": "jdbc",
        "name": "DB1",
        "jdbcUser": "root",
        "jdbcPassword": "admin",
        "jdbcUrl": "jdbc:mysql://localhost:3306/ignite",
        "jdbcSchema": "ignite"
      }, {
        "type": "jdbc",
        "name": "DB2",
        "jdbcUser": "SYSTEM",
        "jdbcPassword": "SYSTEM",
        "jdbcUrl": "jdbc:oracle:thin:@localhost:1521:xe",
        "jdbcSchema": "SYSTEM"
      }]
    }
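Given the two JDBC schemas in the model above, a single Calcite query can join across both sources by qualifying each table with its schema name. The table under DB2 (`SUB_PLANS`) and the join column on the Oracle side are hypothetical placeholders; this sketches only the shape of a cross-schema query:

```java
public class CrossSchemaQuery {

    // Builds a join across the two schemas defined in the model.
    // DB1.SUB_DETAILS comes from the question; DB2.SUB_PLANS and the
    // MSISDN join key on the Oracle side are hypothetical placeholders.
    static String buildJoinSql() {
        return "SELECT d1.MSISDN, d2.PLAN_NAME "
             + "FROM DB1.SUB_DETAILS d1 "
             + "JOIN DB2.SUB_PLANS d2 ON d1.MSISDN = d2.MSISDN";
    }

    public static void main(String[] args) {
        // The string would be passed to statement.executeQuery(...) on a
        // "jdbc:calcite:" connection opened with the model above.
        System.out.println(buildJoinSql());
    }
}
```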
    
    

    If you want help debugging the code, please provide it. — Please find the code details in the question section above.