
Node.js: No error response when streaming data to BigQuery


I have a simple Node.js script that inserts data into a BigQuery table. When the input data is correct it works like a charm. When the input data is incorrect, I don't get an error response.

How can I handle the error response?

'use strict'

// Imports the Google Cloud client library
const BigQuery = require('@google-cloud/bigquery');

//Instantiates a client
const bigquery = BigQuery({
  projectId: 'projectid',
  keyFilename: './credentials/cloudkey.json'
});

const datasetId = 'datasetId'
const tableId = 'test'

const rows = [
  {"data": "data"}, //correct row
  {"dtaa": "test"} // incorrect row
]

insertRowsAsStream(datasetId, tableId, rows)
.then( insertErrors  => {
  console.log(insertErrors)
})

function insertRowsAsStream (datasetId, tableId, rows) {

  // References an existing dataset, e.g. "my_dataset"
  const dataset = bigquery.dataset(datasetId);
  // References an existing table, e.g. "my_table"
  const table = dataset.table(tableId);
  //console.log(table)
  // Inserts data into a table
  return table.insert(rows)
    .then((insertErrors) => {
      console.log('Inserted:');
      rows.forEach((row) => console.log(row));
      return insertErrors;
    });
}

Silly mistake! I should have used a catch after insertRowsAsStream:

insertRowsAsStream(datasetId, tableId, rows)
.catch( insertErrors  => {
  console.log(insertErrors)
})
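A minimal sketch of why `.then` alone misses the failure: with a promise-based insert, row-level errors surface as a rejection, so they only show up in `.catch`. Here `insert` is a hypothetical stand-in (an assumption, not the real `@google-cloud/bigquery` client) that rejects with an errors array, mimicking a partial-failure response:

```javascript
'use strict';

// Hypothetical stand-in for table.insert(): rejects when any row
// lacks the expected "data" field, mimicking how a failed streaming
// insert surfaces as a promise rejection carrying per-row errors.
function insert(rows) {
  const errors = rows
    .filter((row) => !('data' in row))
    .map((row) => ({ row, errors: [{ message: 'no such field' }] }));
  return errors.length
    ? Promise.reject({ name: 'PartialFailureError', errors })
    : Promise.resolve([]);
}

insert([{ data: 'data' }, { dtaa: 'test' }])
  .then(() => console.log('Inserted'))
  .catch((err) => {
    // The rejection object carries the per-row error details.
    console.log(err.name, err.errors.length);
  });
```

The key point is that a resolved promise means every row was accepted; anything else arrives through the rejection path, which the original script never handled.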

How many rows are you inserting per request? The limits listed in [link] may be the cause of the behavior you're seeing.
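If the batch size is the problem, one option is to split the rows into smaller batches before inserting, similar to PHP's array_chunk. A small sketch (the chunk size of 500 is illustrative, not an official quota):

```javascript
'use strict';

// Split rows into fixed-size batches so each insert request stays
// under a per-request row limit (500 here is an assumed example).
function chunkRows(rows, size) {
  const chunks = [];
  for (let i = 0; i < rows.length; i += size) {
    chunks.push(rows.slice(i, i + size));
  }
  return chunks;
}

// Example: 1200 rows become batches of 500, 500, and 200.
const rows = Array.from({ length: 1200 }, (_, i) => ({ data: `row-${i}` }));
const batches = chunkRows(rows, 500);
console.log(batches.map((b) => b.length)); // [ 500, 500, 200 ]
```

Each batch can then be passed to table.insert in turn, so a single oversized request never hits the API.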