
Javascript: Issue with a JSON object nested in an array when inserting into BigQuery


I have a hard time understanding what I am doing wrong with my nesting. I want to insert a row into Google BigQuery using their Node.js client. To do this, I need to push a JSON object nested in an array, like so:

[ { timestamp: '1533564208', device_id: '2nd_test', temperature: '20.0' } ]
When I hard-code it in my code, I can add a row to BigQuery without any issue:

const rows = [
{ timestamp: '1533564208', device_id: '2nd_test', temperature: '20.0' }
];

bigquery
.dataset(datasetId)
.table(tableId)
.insert(rows)
Now what I actually want to do is insert into BigQuery the payload I get from Pub/Sub, and that's where I run into trouble. The Pub/Sub data is handled like this:

var payload = Buffer.from(pubsubMessage.data, 'base64').toString();
console.log(payload);
// [ { timestamp: '1533564208', device_id: '2nd_test', temperature: '20.0' } ] 
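For context, this decoding runs inside a Pub/Sub-triggered Cloud Function (the stack trace below mentions a function folder named timeseriesBigQuery). A minimal sketch of the assumed surroundings; the export name and handler signature are assumptions, not something stated in the original post:

// Hypothetical Pub/Sub-triggered Cloud Function wrapping the snippet above.
exports.timeseriesBigQuery = (pubsubMessage) => {
  // Pub/Sub delivers the message body base64-encoded, so decode it back to text.
  var payload = Buffer.from(pubsubMessage.data, 'base64').toString();
  console.log(payload);
  // Note: at this point payload is still a string, not an array of objects.
};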
According to the logs this should work, it's a JSON object nested in an array, but when it calls the BigQuery API it hits an invalid value error:

error: ERROR: { ApiError: Invalid value at 'rows[0].json' (type.googleapis.com/google.protobuf.Struct), "[ { timestamp: '1533564208', device_id: '2nd_test', temperature: '20.0' } ]"
at Object.parseHttpRespBody (/Users/marion/google_iot_core_test/google_function/timeseriesBigQuery/node_modules/@google-cloud/bigquery/node_modules/@google-cloud/common/src/util.js:193:30)
at Object.handleResp (/Users/marion/google_iot_core_test/google_function/timeseriesBigQuery/node_modules/@google-cloud/bigquery/node_modules/@google-cloud/common/src/util.js:131:18)
at /Users/marion/google_iot_core_test/google_function/timeseriesBigQuery/node_modules/@google-cloud/bigquery/node_modules/@google-cloud/common/src/util.js:496:12
at Request.onResponse [as _callback] (/Users/marion/google_iot_core_test/google_function/timeseriesBigQuery/node_modules/retry-request/index.js:198:7)
at Request.self.callback (/Users/marion/google_iot_core_test/google_function/timeseriesBigQuery/node_modules/request/request.js:185:22)
at emitTwo (events.js:126:13)
at Request.emit (events.js:214:7)
at Request.<anonymous> (/Users/marion/google_iot_core_test/google_function/timeseriesBigQuery/node_modules/request/request.js:1161:10)
at emitOne (events.js:116:13)
at Request.emit (events.js:211:7)
code: 400,
errors: 
 [ { message: 'Invalid value at \'rows[0].json\' (type.googleapis.com/google.protobuf.Struct), "[ { timestamp: \'1533564208\', device_id: \'2nd_test\', temperature: \'20.0\' } ]"',
   domain: 'global',
   reason: 'badRequest' } ],
response: undefined,
message: 'Invalid value at \'rows[0].json\' (type.googleapis.com/google.protobuf.Struct), "[ { timestamp: \'1533564208\', device_id: \'2nd_test\', temperature: \'20.0\' } ]"' }
From the error it looks like BigQuery receives the entire payload as a single string, instead of

// { timestamp: '1533564208', device_id: '2nd_test', temperature: '20.0' }

but when I try to parse the payload, it throws an error.
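As an added illustration (not part of the original question): the logged text is not valid JSON, because the keys and string values are not double-quoted, which is why parsing it fails. A small, self-contained reproduction of the symptom:

// The logged payload looks like an array of objects, but it is a plain string,
// and it is not valid JSON (bare keys, single-quoted values).
const payload = "[ { timestamp: '1533564208', device_id: '2nd_test', temperature: '20.0' } ]";
try {
  JSON.parse(payload);
} catch (e) {
  console.log(e.message); // e.g. "Unexpected token t in JSON at position 4"
}

// A properly serialized payload parses cleanly:
const valid = '[{"timestamp":"1533564208","device_id":"2nd_test","temperature":"20.0"}]';
const rows = JSON.parse(valid);
console.log(rows[0].temperature); // "20.0"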

Any idea what the issue is?

Thanks

As mentioned in the comments, the problem was that I was not sending a valid JSON string. What was hard to figure out is that hard-coding either of the two versions below in my code gives the same result and works fine:

{ timestamp: "1533564208", device_id: "2nd_test", temperature: "20.0" } 

But when going through Pub/Sub on GCloud and the Buffer function, I had to make sure to pass in

'{"timestamp":"1533564208","device_id":"2nd_test","temperature":"20.0"}' 
and not

{ timestamp: "1533564208", device_id: "2nd_test", temperature: "20.0" } 

Otherwise it is not considered valid JSON.

Comments on the question:

The rows[0] object has no property json.

I kind of figured that, so I tried a few things: making the payload just a JSON object and then pushing it into the array, and adding JSON.parse. That didn't work either. What's strange is that it reports an invalid value at 'rows[0].json' even though the value quoted right after looks perfectly valid: "[ { timestamp: '1533564208', device_id: '2nd_test', temperature: '20.0' } ]"

I don't think it counts as valid JSON when it's in a string unless both the keys and the values are in quotes.

You're right! I wonder why... I posted a follow-up question:
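To tie the accepted fix together, here is a minimal end-to-end sketch added for illustration; the dataset/table names and the publishing step are assumptions, not from the original post, and the client-library import style varies by version:

// Publisher side: serialize the reading so the Pub/Sub message body is a valid JSON string,
// i.e. '{"timestamp":"1533564208","device_id":"2nd_test","temperature":"20.0"}'.
const data = Buffer.from(JSON.stringify({
  timestamp: '1533564208',
  device_id: '2nd_test',
  temperature: '20.0',
}));
// ...publish `data` to the topic the Cloud Function is subscribed to.

// Subscriber side (Cloud Function): decode, parse, wrap in an array, insert.
const { BigQuery } = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

const datasetId = 'my_dataset'; // placeholder
const tableId = 'my_table';     // placeholder

exports.timeseriesBigQuery = (pubsubMessage) => {
  const payload = Buffer.from(pubsubMessage.data, 'base64').toString();
  // JSON.parse only succeeds because the publisher sent a valid JSON string.
  const rows = [JSON.parse(payload)];
  return bigquery
    .dataset(datasetId)
    .table(tableId)
    .insert(rows)
    .catch((err) => console.error('BigQuery insert error:', err));
};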