Apache Spark deployed on GKE cannot connect to MySQL

Tags: apache-spark, kubernetes, apache-spark-sql, google-kubernetes-engine

I am deploying a batch Spark job on Kubernetes on GKE. The job tries to fetch some data from MySQL (Google Cloud SQL), but the connection fails. I tried connecting to MySQL manually by installing a mysql client in the pod, and that connection works fine. Is there anything else that needs to be configured?
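For context, a minimal sketch of the kind of JDBC read such a job performs is below. The host, database, and option values are hypothetical placeholders, not taken from the question; the helper just builds the JDBC URL that would be passed to Spark's JDBC reader. Generous `connectTimeout`/`socketTimeout` driver parameters can make a "Communications link failure" easier to distinguish from a slow-but-reachable server.

```python
# Sketch of building a MySQL JDBC URL for a Spark JDBC read.
# All names and addresses below are hypothetical placeholders.

def mysql_jdbc_url(host: str, database: str, port: int = 3306, **params: str) -> str:
    """Build a MySQL JDBC URL with optional driver parameters
    (e.g. connectTimeout, socketTimeout), sorted for a stable order."""
    query = "&".join(f"{k}={v}" for k, v in sorted(params.items()))
    url = f"jdbc:mysql://{host}:{port}/{database}"
    return f"{url}?{query}" if query else url

url = mysql_jdbc_url(
    "10.0.0.5",              # hypothetical Cloud SQL private IP
    "mydb",
    connectTimeout="10000",  # ms; fail fast instead of hanging
    socketTimeout="30000",   # ms; bound each read from the server
)
print(url)

# The URL would then be handed to Spark's JDBC data source, roughly:
# df = (spark.read.format("jdbc")
#       .option("url", url)
#       .option("dbtable", "my_table")
#       .option("user", "spark")
#       .option("password", "***")
#       .load())
```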

Exception:

Exception in thread "main" com.mysql.cj.jdbc.exceptions.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.

        at com.mysql.cj.jdbc.exceptions.SQLError.createCommunicationsException(SQLError.java:590)
        at com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:57)
        at com.mysql.cj.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:1606)
        at com.mysql.cj.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:633)
        at com.mysql.cj.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:347)
        at com.mysql.cj.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:219)
        at org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper.connect(DriverWrapper.scala:45)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:63)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:54)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:56)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:210)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:35)
        at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:318)
        at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:223)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:167)

The problem was actually caused by the firewall rules in GCP; once they were fixed, everything works fine now.
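For reference, a firewall-side fix of this kind might look like the following. The instance name, network name, and CIDR ranges are hypothetical placeholders (the answer does not say which rule was changed), and the commands need valid GCP credentials, so this is a sketch rather than a verified recipe.

```shell
# Hypothetical example of allowing GKE -> Cloud SQL traffic in GCP.
# Names and CIDRs are placeholders, not taken from the original question.

# If Cloud SQL is reached over its public IP, authorize the GKE
# cluster's egress range on the instance:
gcloud sql instances patch my-cloudsql-instance \
    --authorized-networks=203.0.113.0/24

# If it is reached over a private IP in the same VPC, allow egress
# from the GKE nodes to MySQL's port instead:
gcloud compute firewall-rules create allow-gke-to-mysql \
    --network=my-vpc \
    --direction=EGRESS \
    --destination-ranges=10.0.0.0/24 \
    --allow=tcp:3306
```

Note that a mysql client in the pod succeeding while the Spark driver fails can also happen when the two run on different node pools or IP ranges, which is why checking the firewall rules against every range the cluster uses is worthwhile.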