Google Dataflow job and BigQuery failing in different regions
Tags: google-bigquery, google-cloud-platform, google-cloud-dataflow


My Google Dataflow job is failing with:

BigQuery job ... finished with error(s): errorResult: 
Cannot read and write in different locations: source: EU, destination: US, error: Cannot read and write in different locations: source: EU, destination: US
I am launching the job with:
--zone=europe-west1-b

This is the only part of the pipeline that does anything with BigQuery:

Pipeline p = Pipeline.create(options);
p.apply(BigQueryIO.Read.fromQuery(query));
The details of the BigQuery table I am reading from are:
Data Location EU

When I run the job locally, I get:

SEVERE: Error opening BigQuery table  dataflow_temporary_table_339775 of dataset _dataflow_temporary_dataset_744662  : 404 Not Found
I don't understand why it tries to write to a different location when I am only reading data. And even if it needs to create a temporary table, why would it create it in a different region?


Any ideas?

I would suggest verifying:

  • that the Google Dataflow staging location is in the same region
  • that any Google Cloud Storage locations used in the pipeline are also in the same region
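As a concrete sketch of those checks (the bucket name and main class below are placeholders, not from the question): create the staging/temp GCS bucket in the EU and pass it explicitly via the Dataflow SDK 1.x `--stagingLocation` and `--tempLocation` options, so the job's intermediate data stays in the same region as the EU-located BigQuery table:

```shell
# Create a bucket in the EU (bucket name is a placeholder)
gsutil mb -l EU gs://my-eu-bucket

# Launch the pipeline with staging/temp locations in the same region
java -cp target/pipeline-bundled.jar com.example.MyPipeline \
  --runner=DataflowPipelineRunner \
  --zone=europe-west1-b \
  --stagingLocation=gs://my-eu-bucket/staging \
  --tempLocation=gs://my-eu-bucket/temp
```

A plausible explanation for the error: `BigQueryIO.Read.fromQuery` first materializes the query result into a temporary table/dataset, and if that temporary destination is created in the default (US) location while the source table lives in the EU, BigQuery rejects the job with exactly the "Cannot read and write in different locations" error shown above.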