
Google Cloud Platform: Authenticating API calls to Google ML Engine with an API key


I have a model saved in Google AI Platform, and it works when I test predictions in the AI Platform UI.

However, when I try to access the API via REST, I keep getting a response with status 401. I would like to know how to do this successfully.

My API URL looks like the following:

https://ml.googleapis.com/v1/projects/ml-project-name/models/my-model-names/versions/v2:predict

I want to be able to reach this endpoint from an external application running on any platform, so that I can use it to generate predictions.

Google Cloud recommends service-account authorization; however, all of the directions for it require setting an environment variable so that the application can authenticate you automatically. I would prefer to supply the credentials directly in the request, which makes them more portable and consistent with how things are done elsewhere at work.
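For what it's worth, the client libraries do let you load a service-account key file from an explicit path instead of relying on the GOOGLE_APPLICATION_CREDENTIALS environment variable, which keeps the credentials portable. A minimal sketch, assuming the google-auth package is installed and that the project, model, and key-file names are placeholders:

```python
def get_access_token(key_path):
    """Load a service-account key file from an explicit path and return a
    short-lived OAuth2 access token (requires the google-auth package)."""
    from google.oauth2 import service_account
    import google.auth.transport.requests

    creds = service_account.Credentials.from_service_account_file(
        key_path, scopes=['https://www.googleapis.com/auth/cloud-platform'])
    creds.refresh(google.auth.transport.requests.Request())
    return creds.token


def predict_url(project, model, version=None):
    """Build the AI Platform online-prediction endpoint URL."""
    name = f'projects/{project}/models/{model}'
    if version:
        name += f'/versions/{version}'
    return f'https://ml.googleapis.com/v1/{name}:predict'
```

The token can then be sent as a header on an ordinary REST call, e.g. requests.post(predict_url('my-project', 'my-model', 'v2'), json=payload, headers={'Authorization': f'Bearer {get_access_token("key.json")}'}).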

So I tried to get an API key instead.

According to this page, you can authenticate a request like this:

POST https://language.googleapis.com/v1/documents:analyzeEntities?key=API_KEY

However, when I run the following code, my request returns status 401:

import requests

api_key = 'my_sample_api_key'  # placeholder
url     = f'https://ml.googleapis.com/v1/projects/project-name/models/model-name/versions/v2:predict?key={api_key}'

json    = {"instances": [{"input_1": ["Please predict this text"]}]}

res = requests.post(url, json=json)

Any help is appreciated.

AI Platform does not support API keys on prediction requests. I suggest attaching an auth token to your request, or using one of the available client libraries to send predictions.

Here is a snippet that uses the Python client library to send a prediction request:

import googleapiclient.discovery

# Create the AI Platform service object.
# To authenticate set the environment variable
# GOOGLE_APPLICATION_CREDENTIALS=<path_to_service_account_file>
service = googleapiclient.discovery.build('ml', 'v1')

def predict_json(project, model, instances, version=None):
    """Send json data to a deployed model for prediction.

    Args:
        project (str): project where the AI Platform Model is deployed.
        model (str): model name.
        instances ([Mapping[str: Any]]): Keys should be the names of Tensors
            your deployed model expects as inputs. Values should be datatypes
            convertible to Tensors, or (potentially nested) lists of datatypes
            convertible to tensors.
        version: str, version of the model to target.
    Returns:
        Mapping[str: any]: dictionary of prediction results defined by the
            model.
    """
    name = 'projects/{}/models/{}'.format(project, model)

    if version is not None:
        name += '/versions/{}'.format(version)

    response = service.projects().predict(
        name=name,
        body={'instances': instances}
    ).execute()

    if 'error' in response:
        raise RuntimeError(response['error'])

    return response['predictions']
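The auth-token route mentioned above can also be done over plain REST: obtain a short-lived access token and send it as a Bearer header instead of an API key. A sketch using only the standard library; the project, model, and version names are placeholders:

```python
import json
import urllib.request


def build_predict_request(project, model, version, instances, token):
    """Construct a urllib Request for AI Platform online prediction,
    authenticated with an OAuth2 Bearer token instead of an API key."""
    url = (f'https://ml.googleapis.com/v1/projects/{project}'
           f'/models/{model}/versions/{version}:predict')
    body = json.dumps({'instances': instances}).encode('utf-8')
    return urllib.request.Request(
        url,
        data=body,  # POST is implied when a body is present
        headers={'Authorization': f'Bearer {token}',
                 'Content-Type': 'application/json'})
```

For quick local testing, a token can come from the gcloud CLI (`gcloud auth print-access-token`); in production it should come from service-account credentials. Sending the request is then just `urllib.request.urlopen(build_predict_request(...))`.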
