Problem with Spark SQL [2.1] writing to a PostgreSQL database

Tags: postgresql, apache-spark

I use the following test case to write data to a PostgreSQL table, and it works fine:


import java.util.Properties
import org.apache.spark.sql.{SaveMode, SparkSession}
import org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions

test("SparkSQLTest") {
  val session = SparkSession.builder().master("local").appName("SparkSQLTest").getOrCreate()
  val url = "jdbc:postgresql://dbhost:12345/db1"
  val table = "schema1.table1"
  val props = new Properties()
  props.put("user", "user123")
  props.put("password", "pass@123")
  props.put(JDBCOptions.JDBC_DRIVER_CLASS, "org.postgresql.Driver")
  session.range(300, 400).write.mode(SaveMode.Append).jdbc(url, table, props)
}
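To rule out the credentials themselves, a plain JDBC connection check can be run outside Spark entirely. This is a minimal sketch (it assumes the same host, database, and credentials as in the question, and needs the live database to run); if it also fails, the problem is the account, not Spark SQL:

```scala
import java.sql.DriverManager
import java.util.Properties

object PgCredentialCheck {
  def main(args: Array[String]): Unit = {
    // Same URL and credentials as the Spark test case above.
    val url = "jdbc:postgresql://dbhost:12345/db1"
    val props = new Properties()
    props.put("user", "user123")
    props.put("password", "pass@123")
    // Opens a raw pgJDBC connection with no Spark in between;
    // success here narrows the failure down to the spark-sql path.
    val conn = DriverManager.getConnection(url, props)
    try {
      println("connected as user123")
    } finally {
      conn.close()
    }
  }
}
```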

Then I run the following script with

spark-sql -f sql_script_file.sql

to write Hive data into the PostgreSQL table:

CREATE OR REPLACE TEMPORARY VIEW tmp_v1
USING org.apache.spark.sql.jdbc
OPTIONS (
  driver 'org.postgresql.Driver',
  url 'jdbc:postgresql://dbhost:12345/db1',
  dbtable 'schema1.table2',
  user 'user123',
  password 'pass@123',
  batchsize '2000'
);

insert into tmp_v1 select
name,
age
from test.person; -- test.person is the Hive db.table
But when I run the script above with

spark-sql -f sql_script.sql

it complains that the PostgreSQL user/password is invalid; the exception is below. As far as I can tell the two approaches are essentially the same, so I'd like to ask where the problem is. Thanks.
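Since the same user/password works from the Scala test case, one thing worth trying (an assumption, not a confirmed fix) is that the '@' in the password is being mangled somewhere on the spark-sql path. pgJDBC also accepts credentials as query parameters on the connection URL, so the view can be declared without separate user/password options:

```sql
-- Hypothetical variant: credentials moved into the JDBC URL itself.
-- If the separate 'password' option is the culprit, this form may behave differently.
CREATE OR REPLACE TEMPORARY VIEW tmp_v1
USING org.apache.spark.sql.jdbc
OPTIONS (
  driver 'org.postgresql.Driver',
  url 'jdbc:postgresql://dbhost:12345/db1?user=user123&password=pass@123',
  dbtable 'schema1.table2',
  batchsize '2000'
);
```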

org.postgresql.util.PSQLException: FATAL: Invalid username/password,login denied.
at org.postgresql.core.v3.ConnectionFactoryImpl.doAuthentication(ConnectionFactoryImpl.java:375)
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:189)
at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:64)
at org.postgresql.jdbc2.AbstractJdbc2Connection.<init>(AbstractJdbc2Connection.java:124)
at org.postgresql.jdbc3.AbstractJdbc3Connection.<init>(AbstractJdbc3Connection.java:28)
at org.postgresql.jdbc3g.AbstractJdbc3gConnection.<init>(AbstractJdbc3gConnection.java:20)
at org.postgresql.jdbc4.AbstractJdbc4Connection.<init>(AbstractJdbc4Connection.java:30)
at org.postgresql.jdbc4.Jdbc4Connection.<init>(Jdbc4Connection.java:22)
at org.postgresql.Driver.makeConnection(Driver.java:392)
at org.postgresql.Driver.connect(Driver.java:266)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:59)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:50)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:58)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:114)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:45)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:330)
at org.apache.spark.sql.execution.datasources.CreateTempViewUsing.run(ddl.scala:76)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:59)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:57)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:75)