Authentication ML Engine BigQuery: Request had insufficient authentication scopes

Tags: authentication, google-bigquery, google-cloud-ml-engine

I am running a TensorFlow model, submitting the training on ML Engine. I have built a pipeline that reads from BigQuery using tf.contrib.cloud.python.ops.bigquery_reader_ops.BigQueryReader as the reader for a queue.

Using DataLab and locally, setting the GOOGLE_APPLICATION_CREDENTIALS variable to point at the JSON credential key file, everything works fine. However, when I submit the training job in the cloud, I get the following errors (I post only the two main ones):
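For context, the working local setup amounts to pointing Application Default Credentials at the key file before launching the trainer; a minimal sketch (the key path is illustrative, the module name is taken from the traceback below):

```shell
# Point Application Default Credentials at the service-account key
# (path is illustrative); BigQueryReader picks these credentials up.
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/keys/service-account.json"

# Run the trainer locally, e.g. via the gcloud local runner:
gcloud ml-engine local train \
    --module-name trainer.task \
    --package-path trainer/
```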

  • Permission denied: Error executing an HTTP request (HTTP response code 403, error code 0, error message '') when reading schema

  • There was an error creating the model. Check the details: Request had insufficient authentication scopes

  • I have already checked everything else, like correctly defining the table schema in the script and the project/dataset/table IDs/names.

    To be clearer, I paste here the whole error present in the log:

    消息:“回溯(最近一次呼叫最后一次):

    文件“/usr/lib/python2.7/runpy.py”,第174行,在运行模块中作为主模块
    “\uuuuu main\uuuuuuuuuuuuuuuuuuuuuuuuu”,fname,loader,pkg\u name)
    文件“/usr/lib/python2.7/runpy.py”,第72行,在运行代码中
    run_globals中的exec代码
    文件“/root/.local/lib/python2.7/site packages/trainer/task.py”,第131行,在
    hparams=hparam.hparams(**参数)
    文件“/usr/local/lib/python2.7/dist packages/tensorflow/contrib/learn/python/learn/learn\u runner.py”,第210行,运行中
    返回执行时间表(实验,时间表)
    文件“/usr/local/lib/python2.7/dist packages/tensorflow/contrib/learn/python/learn/learn\u runner.py”,执行计划中第47行
    返回任务()
    文件“/usr/local/lib/python2.7/dist-packages/tensorflow/contrib/learn/python/learn/experience.py”,第495行,列车和列车中
    自动列车(延迟秒=0)
    文件“/usr/local/lib/python2.7/dist-packages/tensorflow/contrib/learn/python/learn/experience.py”,第275行,列车中
    挂钩=自身。\列车\监控器+额外\挂钩)
    文件“/usr/local/lib/python2.7/dist-packages/tensorflow/contrib/learn/python/learn/experience.py”,第665行,在调用列车中
    监视器=挂钩)
    文件“/usr/local/lib/python2.7/dist packages/tensorflow/python/util/deprecation.py”,第289行,在new_func中
    返回函数(*args,**kwargs)
    文件“/usr/local/lib/python2.7/dist packages/tensorflow/contrib/learn/python/learn/estimators/estimator.py”,第455行
    损耗=自身。\列车\模型(输入\ fn=输入\ fn,挂钩=挂钩)
    文件“/usr/local/lib/python2.7/dist-packages/tensorflow/contrib/learn/python/learn/estimators/estimator.py”,第1007行,列车模型
    _,损耗=单次运行([model\u fn\u ops.train\u op,model\u fn\u ops.loss])
    文件“/usr/local/lib/python2.7/dist-packages/tensorflow/python/training/monitored\u session.py”,第521行,在退出中__
    自我关闭内部(例外类型)
    文件“/usr/local/lib/python2.7/dist packages/tensorflow/python/training/monitored_session.py”,第556行,在内部关闭
    self.\u sess.close()
    文件“/usr/local/lib/python2.7/dist packages/tensorflow/python/training/monitored_session.py”,第791行,关闭
    self.\u sess.close()
    文件“/usr/local/lib/python2.7/dist packages/tensorflow/python/training/monitored_session.py”,第888行,关闭
    忽略(实时线程=真)
    文件“/usr/local/lib/python2.7/dist-packages/tensorflow/python/training/coordinator.py”,第389行,在join中
    六、重新提升(*自我执行信息提升)
    文件“/usr/local/lib/python2.7/dist packages/tensorflow/python/training/queue\u runner\u impl.py”,第238行,in\u run
    排队_callable()
    文件“/usr/local/lib/python2.7/dist packages/tensorflow/python/client/session.py”,第1063行,在单次运行中
    目标\列表\作为\字符串,状态,无)
    文件“/usr/lib/python2.7/contextlib.py”,第24行,在__
    self.gen.next()
    文件“/usr/local/lib/python2.7/dist packages/tensorflow/python/framework/errors\u impl.py”,第466行,处于raise\u exception\u on\u not\u ok\u状态
    pywrap_tensorflow.TF_GetCode(状态))
    PermissionDeniedError:执行HTTP请求时出错(HTTP响应代码403,错误代码0,错误消息“”)
    读取pasquinelli bigdata的模式时:Transactions.t_11_Hotel_25_w_train@1505224768418
    [[节点:GenerateBigQueryReaderPartions=GenerateBigQueryReaderPartions[列=[“F_Rac_GEST”,“LABEL”,“F_RCA”,“W24”,“ETA”,“W22”,“W23”,“W20”,“W21”,“F_LEASING”,“W2”,“W16”,“WLABEL”,“SEX”,“F_PIVA”,“F_MUTUO”,“Id_client”,“F_Pass_VITA”,“F_DANNI”,“F_DANNI”,“W19”,“W18”,“W17”,“PROV”,“W15”,“W14”,“W13”,“2”,“W11”,“W11”,“W7”,“W16””W5、W4、W3、F_FIN、W1、ImpTot、F_MULTIB、W9、W8、、dataset_id=“Transactions”、num_partitions=1、project_id=“pasquinelli bigdata”、table_id=“t_11_Hotel_25_w_train”、test_end_point=“、timestamp_millis=150522476848;=150; device=“/job:localhost/replica:0/task:0/cpu:0”]()]
    
    Any suggestion would be really helpful, since I am new to GCP.
    Thank you all.

    Support for reading BigQuery data from Cloud ML Engine is still under development, so what you are doing is not currently supported. The problem you are hitting is that the machines ML Engine runs on don't have the right scopes to talk to BigQuery. A potential issue you would also run into locally is poor read performance. These are two examples of the work that needs to be done.
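As a diagnostic sketch, one can ask the standard GCE metadata server which scopes the machine's default service account actually carries (whether an ML Engine worker exposes this endpoint is an assumption here; on plain GCE it does):

```shell
# List the OAuth scopes attached to this VM's default service account.
# If https://www.googleapis.com/auth/bigquery is missing from the
# output, BigQuery calls from this machine will fail with 403.
curl -H "Metadata-Flavor: Google" \
  "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes"
```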


    In the meantime, I recommend exporting the data to GCS for training. That will be much more scalable, so you won't have to worry about poor training performance as your data grows. This can be a nice pattern anyway, since it lets you preprocess the data once, write the result as CSV to GCS, and then do multiple training runs trying different algorithms or hyperparameters.
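A minimal sketch of that export step, using the `bq` CLI with the project, dataset, and table names from the question (the destination bucket name is hypothetical):

```shell
# Export the BigQuery table to sharded CSV files on GCS; the trainer
# can then read gs://... paths instead of querying BigQuery directly.
bq extract --destination_format CSV \
    'pasquinelli-bigdata:Transactions.t_11_Hotel_25_w_train' \
    'gs://my-training-bucket/transactions/train-*.csv'
```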

    Thank you very much for your answer. I will proceed as you suggest.