
Cannot connect to Cloud SQL using the Java SocketFactory library

Tags: java, scala, maven, apache-spark

I am trying to connect to Cloud SQL (MySQL) from Java code and get the following error:

com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: Could not create socket factory 'com.google.cloud.sql.mysql.SocketFactory' due to underlying exception: 
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.google.cloud.sql.mysql.SocketFactory
Here is my code:

package utils
import java.sql.DriverManager
import java.sql.Connection
import scala.collection.mutable.ListBuffer
import entity.AnalyticFieldEntity
import compute.driver.AnalyticTools
import entity.ErrorHandlingEntity

object ScalaDbConnect {

  def getAnalyticBatchMap(toolId: Int, paramMap: Map[String, String]): Map[String, Int] = {
    val methodName = "getAnalyticBatchMap"
    // paramMap.get returns an Option; use getOrElse so the message doesn't contain "Some(...)"
    val errorMode = paramMap.getOrElse("mode", "") + "(" + paramMap.getOrElse("analyticSource", "") + ")"

    val dbTuple = DbPropertiesReader.getDbProperties()

    val databaseName = dbTuple._3
    val instanceConnectionName = dbTuple._4
    val username = dbTuple._1
    val password = dbTuple._2

    var connection: Connection = null
    val analyticMap = collection.mutable.Map.empty[String, Int]
    try {
      //[START doc-example]
      val jdbcUrl = String.format(
        "jdbc:mysql://google/%s?cloudSqlInstance=%s&"
          + "socketFactory=com.google.cloud.sql.mysql.SocketFactory", databaseName, instanceConnectionName)

      println(jdbcUrl)
      //Class.forName("com.mysql.jdbc.GoogleDriver")
      // Assign to the outer var: a nested `val connection` would shadow it and leave it null for close()
      connection = DriverManager.getConnection(jdbcUrl, username, password)
      println(connection)

    //[END doc-example]

      try {
        val statement = connection.createStatement()
        val resultSet = statement.executeQuery("SELECT omnitureColumnHeader.columnHeaderId, case when analyticFieldMap.isTag = 1 then concat(\"tag_\",analyticFieldMap.entityField) else " +
          "analyticFieldMap.entityField end as entityField FROM omnitureColumnHeader INNER JOIN analyticFieldMap ON " +
          "analyticFieldMap.analyticFieldBatch=omnitureColumnHeader.columnHeaderValue where analyticFieldMap.toolId = " + toolId)

        System.out.println("statement: 2" + statement)
        System.out.println("resultSet: 2" + resultSet)


        while (resultSet.next()) {
          System.out.println("inside the content loop: 2")
          analyticMap += resultSet.getString("entityField") -> resultSet.getInt("columnHeaderId")
        }

        System.out.println("analyticMap: 2" + analyticMap)
      } catch {
        // Log the actual exception instead of silently swallowing it
        case t: Throwable => println("Got some other kind of exception: " + t)
      }


    } catch {
      case e: Exception =>
        val errorHandlingEntity = new ErrorHandlingEntity()
        errorHandlingEntity.Mode = errorMode
        errorHandlingEntity.Tool = paramMap.getOrElse("tool", "")
        errorHandlingEntity.Message = "DB Connection Issue"
        // printStackTrace() returns Unit; capture the stack trace as a string instead
        errorHandlingEntity.Trace = e.getStackTrace.mkString("\n")
        errorHandlingEntity.Source = "Spark"
        errorHandlingEntity.YarnAppId = paramMap.getOrElse("appID", "")
        errorHandlingEntity.MethodName = methodName
        errorHandlingEntity.ReThrow = true
        errorHandlingEntity.CurrentException = e

        ErrorHandlingFramework.HandleException(errorHandlingEntity)
    } finally {
      // connection may still be null if getConnection failed
      if (connection != null) connection.close()
    }

    analyticMap.toMap

  }
}
I added the following to my pom.xml:

<dependency>
    <groupId>com.google.cloud.sql</groupId>
    <artifactId>mysql-socket-factory</artifactId>
    <version>1.0.3</version>
</dependency>


I am trying to connect to Google Cloud SQL using Scala code and the Java API. The error I am seeing suggests that the correct class for the connection cannot be found.

Any help would be greatly appreciated.

Looking forward to a solution.

Thanks,

The problem was with how the Maven build is run on Google Cloud.

The classes from the build could not be read, so I passed those JARs explicitly with the --jars option.

This solved my problem.

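In practice this means listing the extra JARs on the spark-submit command line. A rough sketch of what that can look like (the main class, JAR paths and connector version below are placeholders, not taken from the original post; only the socket-factory version matches the pom above):

# main class, JAR paths and the connector version are placeholders - substitute your own
spark-submit \
  --class utils.YourMainClass \
  --jars mysql-socket-factory-1.0.3.jar,mysql-connector-java-5.1.40.jar \
  your-application.jar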

As discussed in the GitHub issue, please include the part of your pom.xml that shows how the assembly plugin is configured. This looks like a general problem of the jar not being packaged correctly, and is probably unrelated to this library.

Hi, please find my pom.xml here - I also checked the jar by opening it, and I can find the relevant classes.

The pom file looks correct to me. Since the pom looks correct, it is probably a problem with how Spark loads and executes things. I'm not familiar with Spark, so I can't help you there. If you google for the ClassNotFoundException you will find some hits, for example:

Thanks a lot, that solved my problem. Strangely, Spark could not resolve the dependent JARs even though they were available in the Maven build. Providing the missing jars with the --jars option solved the issue. Thanks