Google Cloud Platform: 'gcloud beta ai-platform explain' error with an LSTM model taking 3-D input arrays

I have a successfully trained Keras model with 3-D input. Here is the model summary:
Model: "model"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_1 (InputLayer) [(None, 5, 1815)] 0
_________________________________________________________________
bidirectional (Bidirectional (None, 5, 64) 473088
_________________________________________________________________
bidirectional_1 (Bidirection (None, 5, 64) 24832
_________________________________________________________________
output (TimeDistributed) (None, 5, 25) 1625
=================================================================
Total params: 499,545
Trainable params: 499,545
Non-trainable params: 0
_________________________________________________________________
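As a sanity check, the parameter counts in the summary can be reproduced by hand from the layer sizes (a sketch assuming the standard Keras LSTM parameterization: four gates, each with an input kernel, a recurrent kernel, and a bias):

```python
# Reproduce the parameter counts from the model summary above.
def lstm_params(input_dim, units):
    # 4 gates x (input kernel + recurrent kernel + bias)
    return 4 * (units * input_dim + units * units + units)

def bilstm_params(input_dim, units):
    # A Bidirectional wrapper doubles the wrapped LSTM's parameters.
    return 2 * lstm_params(input_dim, units)

def dense_params(input_dim, units):
    return input_dim * units + units  # kernel + bias

# Sizes from the summary: 1815 input features, two Bidirectional(LSTM(32))
# layers (64 = 2 * 32 outputs), and a TimeDistributed Dense(25) head
# (TimeDistributed itself adds no parameters).
p1 = bilstm_params(1815, 32)   # bidirectional
p2 = bilstm_params(64, 32)     # bidirectional_1
p3 = dense_params(64, 25)      # output
total = p1 + p2 + p3
```

Checking these against the summary gives 473,088 + 24,832 + 1,625 = 499,545, matching the reported total.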
The estimator is then defined and a serving input function created, as follows:
# Convert our Keras model to an estimator
keras_estimator = tf.keras.estimator.model_to_estimator(keras_model=model, model_dir='export')

# We need this serving input function to export our model in the next cell
serving_fn = tf.estimator.export.build_raw_serving_input_receiver_fn(
    {'input_1': model.input}
)

# Export the model to the bucket
export_path = keras_estimator.export_saved_model(
    'gs://' + BUCKET_NAME + '/explanations',
    serving_input_receiver_fn=serving_fn
).decode('utf-8')
print(export_path)
The explanation metadata is defined and copied to the export destination, as follows:
explanation_metadata = {
    "inputs": {
        "data": {
            "input_tensor_name": "input_1:0",
            "input_baselines": [np.mean(data_X, axis=0).tolist()],
            "encoding": "bag_of_features",
            "index_feature_mapping": feature_X.tolist()
        }
    },
    "outputs": {
        "duration": {
            "output_tensor_name": "output/Reshape_1:0"
        }
    },
    "framework": "tensorflow"
}

# Write the json to a local file
with open('explanation_metadata.json', 'w') as output_file:
    json.dump(explanation_metadata, output_file)
!gsutil cp explanation_metadata.json $export_path
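Before uploading, the metadata structure can be sanity-checked locally. A minimal sketch with illustrative placeholder baselines and feature names (not the real values computed above):

```python
import json

# Placeholder metadata mirroring the structure built above; the baseline
# and feature names here are illustrative stand-ins.
explanation_metadata = {
    "inputs": {
        "data": {
            "input_tensor_name": "input_1:0",
            "input_baselines": [[0.0] * 4],                      # placeholder
            "encoding": "bag_of_features",
            "index_feature_mapping": ["f0", "f1", "f2", "f3"],   # placeholder
        }
    },
    "outputs": {"duration": {"output_tensor_name": "output/Reshape_1:0"}},
    "framework": "tensorflow",
}

# With bag_of_features encoding, the index_feature_mapping length should
# match the number of features in each baseline.
inp = explanation_metadata["inputs"]["data"]
assert len(inp["index_feature_mapping"]) == len(inp["input_baselines"][0])

# Round-trip through JSON to confirm the file will serialize cleanly.
roundtrip = json.loads(json.dumps(explanation_metadata))
```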
The model is then created on AI Platform and a version is defined:
# Create the model if it doesn't exist yet (you only need to run this once)
!gcloud ai-platform models create $MODEL --enable-logging --regions=us-central1
# Create the version with gcloud
explain_method = 'integrated-gradients'
!gcloud beta ai-platform versions create $VERSION \
--model $MODEL \
--origin $export_path \
--runtime-version 1.15 \
--framework TENSORFLOW \
--python-version 3.7 \
--machine-type n1-standard-4 \
--explanation-method $explain_method \
--num-integral-steps 25
Everything works up to this point, but when I create and send an explanation request:
prediction_json = {'input_1': data_X[:5].tolist()}
with open('diag-data.json', 'w') as outfile:
    json.dump(prediction_json, outfile)

# Send the request to google cloud
!gcloud beta ai-platform explain --model $MODEL --json-instances='diag-data.json'
I get the following error:
{
"error": "Explainability failed with exception: <_InactiveRpcError of RPC that terminated with:\n\tstatus = StatusCode.INVALID_ARGUMENT\n\tdetails = \"transpose expects a vector of size 4. But input(1) is a vector of size 3\n\t [[{{node bidirectional/forward_lstm_1/transpose}}]]\"\n\tdebug_error_string = \"{\"created\":\"@1586068796.692241013\",\"description\":\"Error received from peer ipv4:10.7.252.78:8500\",\"file\":\"src/core/lib/surface/call.cc\",\"file_line\":1056,\"grpc_message\":\"transpose expects a vector of size 4. But input(1) is a vector of size 3\\n\\t [[{{node bidirectional/forward_lstm_1/transpose}}]]\",\"grpc_status\":3}\"\n>"
}
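I cannot be certain this is the cause of the transpose error above, but one common pitfall with --json-instances is worth checking: each line of the file is one instance, and the serving layer adds the batch dimension itself, so wrapping the whole data_X[:5] slice in a single instance introduces an extra rank. A sketch of writing one (5, 1815) example per line, using a pure-Python stand-in for data_X (assumed shape (N, 5, 1815)):

```python
import json

# Illustrative stand-in for data_X: N examples, each a (5, 1815) sequence,
# matching the model's declared input shape (None, 5, 1815).
data_X = [[[0.0] * 1815 for _ in range(5)] for _ in range(10)]

# One JSON line per instance; the serving layer adds the batch dimension,
# so each line should carry a single (5, 1815) example.
with open('diag-data.json', 'w') as outfile:
    for example in data_X[:5]:
        outfile.write(json.dumps({'input_1': example}) + '\n')

with open('diag-data.json') as infile:
    lines = infile.readlines()
```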
I have now hit a dead end with gcloud beta ai-platform explain --model $MODEL --json-instances='diag-data.json', and am turning to the SO community for much-needed help.
Additionally, the notebook is available for easier experimentation.

Given that predict works as you describe, have you tried the same approach for explain? It is not clear from your post whether you tried:
prediction_json = {'input_1': data_X[:5].reshape(-1,1815).tolist()}
with open('diag-data.json', 'w') as outfile:
    json.dump(prediction_json, outfile)

# Send the explain request
!gcloud beta ai-platform explain --model $MODEL --json-instances='diag-data.json'
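To illustrate what the suggested reshape(-1, 1815) does to the payload: it collapses the batch and time axes of data_X[:5] into 25 rows of 1815 features. A pure-Python sketch with stand-in data (assuming data_X has shape (N, 5, 1815)):

```python
import json

# Pure-Python stand-in for data_X with shape (N, 5, 1815).
data_X = [[[0.0] * 1815 for _ in range(5)] for _ in range(10)]

# Equivalent of data_X[:5].reshape(-1, 1815): collapse the first two axes
# into a flat list of 1815-feature rows.
flat = [row for example in data_X[:5] for row in example]

prediction_json = {'input_1': flat}
payload = json.dumps(prediction_json)
```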