JDBC Pentaho: I can't write data from Pentaho to BigQuery


I am using Starschema's JDBC driver to connect Pentaho to BigQuery. I can successfully pull data from BigQuery into Pentaho. However, I cannot write data from Pentaho back to BigQuery: an exception is thrown when inserting rows into BigQuery, and the operation appears to be unsupported. How can I resolve this?

Error message:

2017/10/30 14:27:43 - Table output 2.0 - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : Because of an error, this step can't continue:
2017/10/30 14:27:43 - Table output 2.0 - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : org.pentaho.di.core.exception.KettleException: 
2017/10/30 14:27:43 - Table output 2.0 - Error inserting row into table [TableID] with values: [A], [I], [G], [1], [2016-02-18], [11], [2016-02-18-12.00.00.123456], [GG], [CB], [132], [null], [null], [null]
2017/10/30 14:27:43 - Table output 2.0 - 
2017/10/30 14:27:43 - Table output 2.0 - Error inserting/updating row
2017/10/30 14:27:43 - Table output 2.0 - executeUpdate()
2017/10/30 14:27:43 - Table output 2.0 - 
2017/10/30 14:27:43 - Table output 2.0 - 
2017/10/30 14:27:43 - Table output 2.0 -    at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:385)
2017/10/30 14:27:43 - Table output 2.0 -    at org.pentaho.di.trans.steps.tableoutput.TableOutput.processRow(TableOutput.java:125)
2017/10/30 14:27:43 - Table output 2.0 -    at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2017/10/30 14:27:43 - Table output 2.0 -    at java.lang.Thread.run(Unknown Source)
2017/10/30 14:27:43 - Table output 2.0 - Caused by: org.pentaho.di.core.exception.KettleDatabaseException: 
2017/10/30 14:27:43 - Table output 2.0 - Error inserting/updating row
2017/10/30 14:27:43 - Table output 2.0 - executeUpdate()
2017/10/30 14:27:43 - Table output 2.0 - 
2017/10/30 14:27:43 - Table output 2.0 -    at org.pentaho.di.core.database.Database.insertRow(Database.java:1321)
2017/10/30 14:27:43 - Table output 2.0 -    at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:262)
2017/10/30 14:27:43 - Table output 2.0 -    ... 3 more
2017/10/30 14:27:43 - Table output 2.0 - Caused by: net.starschema.clouddb.jdbc.BQSQLFeatureNotSupportedException: executeUpdate()
2017/10/30 14:27:43 - Table output 2.0 -    at net.starschema.clouddb.jdbc.BQPreparedStatement.executeUpdate(BQPreparedStatement.java:317)
2017/10/30 14:27:43 - Table output 2.0 -    at org.pentaho.di.core.database.Database.insertRow(Database.java:1288)
2017/10/30 14:27:43 - Table output 2.0 -    ... 4 more
2017/10/30 14:27:43 - BigQuery_rwa-tooling - Statement canceled!
2017/10/30 14:27:43 - Simple Read Write from csv to txt - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : Something went wrong while trying to stop the transformation: org.pentaho.di.core.exception.KettleDatabaseException: 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - Error cancelling statement
2017/10/30 14:27:43 - Simple Read Write from csv to txt - cancel()
2017/10/30 14:27:43 - Simple Read Write from csv to txt - ERROR (version 7.1.0.0-12, build 1 from 2017-05-16 17.18.02 by buildguy) : org.pentaho.di.core.exception.KettleDatabaseException: 
2017/10/30 14:27:43 - Simple Read Write from csv to txt - Error cancelling statement
2017/10/30 14:27:43 - Simple Read Write from csv to txt - cancel()
2017/10/30 14:27:43 - Simple Read Write from csv to txt - 
2017/10/30 14:27:43 - Simple Read Write from csv to txt -   at org.pentaho.di.core.database.Database.cancelStatement(Database.java:750)
2017/10/30 14:27:43 - Simple Read Write from csv to txt -   at org.pentaho.di.core.database.Database.cancelQuery(Database.java:732)
2017/10/30 14:27:43 - Simple Read Write from csv to txt -   at org.pentaho.di.trans.steps.tableinput.TableInput.stopRunning(TableInput.java:299)
2017/10/30 14:27:43 - Simple Read Write from csv to txt -   at org.pentaho.di.trans.Trans.stopAll(Trans.java:1889)
2017/10/30 14:27:43 - Simple Read Write from csv to txt -   at org.pentaho.di.trans.step.BaseStep.stopAll(BaseStep.java:2915)
2017/10/30 14:27:43 - Simple Read Write from csv to txt -   at org.pentaho.di.trans.steps.tableoutput.TableOutput.processRow(TableOutput.java:139)
2017/10/30 14:27:43 - Simple Read Write from csv to txt -   at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2017/10/30 14:27:43 - Simple Read Write from csv to txt -   at java.lang.Thread.run(Unknown Source)
2017/10/30 14:27:43 - Simple Read Write from csv to txt - Caused by: net.starschema.clouddb.jdbc.BQSQLFeatureNotSupportedException: cancel()
2017/10/30 14:27:43 - Simple Read Write from csv to txt -   at net.starschema.clouddb.jdbc.BQStatementRoot.cancel(BQStatementRoot.java:113)
2017/10/30 14:27:43 - Simple Read Write from csv to txt -   at org.pentaho.di.core.database.Database.cancelStatement(Database.java:744)
2017/10/30 14:27:43 - Simple Read Write from csv to txt -   ... 7 more
2017/10/30 14:27:43 - Table output 2.0 - Signaling 'output done' to 0 output rowsets.
2017/10/30 14:27:43 - BigQuery_prID - No commit possible on database connection [BigQuery_prID]

It looks like you may be attempting this via legacy SQL, which does not support DML statements (INSERT/UPDATE/DELETE).

Standard SQL does support DML, but it is intended primarily for bulk table operations rather than row-oriented inserts; ingesting data through individual DML INSERT statements is not recommended. See the documented DML quotas for details.

You would be better served by ingesting through BigQuery streaming inserts, or batch ingestion via load jobs. Since those mechanisms are not part of the query language, however, you will likely need to move beyond the JDBC driver.
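As a rough illustration of the streaming-insert route, here is a minimal sketch using the official `google-cloud-bigquery` Python client instead of the JDBC driver. The project, dataset, and table names are placeholders, and the helper that maps a row to the JSON payload is mine, not part of any Pentaho or BigQuery API:

```python
# Sketch: streaming rows into BigQuery with the official Python client
# (google-cloud-bigquery), bypassing the JDBC driver entirely.
from datetime import date


def row_to_json(field_names, values):
    """Convert one row (e.g. values read from a CSV in Pentaho) into the
    JSON-serializable dict that insert_rows_json expects."""
    out = {}
    for name, value in zip(field_names, values):
        if isinstance(value, date):
            # BigQuery accepts ISO-8601 strings for DATE columns
            value = value.isoformat()
        out[name] = value
    return out


fields = ["code", "flag", "created"]          # hypothetical column names
row = row_to_json(fields, ["A", "I", date(2016, 2, 18)])

# The actual streaming call needs credentials, so it is shown commented out:
# from google.cloud import bigquery
# client = bigquery.Client(project="my-project")           # placeholder project
# errors = client.insert_rows_json("my-project.my_dataset.my_table", [row])
# assert errors == [], errors  # insert_rows_json returns per-row error lists
```

For large, non-latency-sensitive transfers, a load job (`Client.load_table_from_file` or `load_table_from_uri`) over a CSV/JSON export would be the batch alternative the answer mentions.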