Alternative to tf.contrib.seq2seq.prepare_attention() in TensorFlow 1.8 (Python)


AttributeError: module 'tensorflow.contrib.seq2seq' has no attribute 'prepare_attention'

I know that prepare_attention() has been deprecated. What is the alternative? Please also specify the syntax.

The function I am using is:

def decoding_layer_train(encoder_state, dec_cell, dec_embed_input, sequence_length, decoding_scope,
                         output_fn, keep_prob, batch_size):
    '''Decode the training data'''

attention_states = tf.zeros([batch_size, 1, dec_cell.output_size])

att_keys, att_vals, att_score_fn, att_construct_fn = tf.contrib.seq2seq.prepare_attention(attention_states,
                                             attention_option="bahdanau",
                                             num_units=dec_cell.output_size)

train_decoder_fn = tf.contrib.seq2seq.attention_decoder_fn_train(encoder_state[0],
                                                                 att_keys,
                                                                 att_vals,
                                                                 att_score_fn,
                                                                 att_construct_fn,
                                                                 name = "attn_dec_train")
train_pred, _, _ = tf.contrib.seq2seq.dynamic_rnn_decoder(dec_cell,
                                                          train_decoder_fn,
                                                          dec_embed_input,
                                                          sequence_length,
                                                          scope=decoding_scope)
train_pred_drop = tf.nn.dropout(train_pred, keep_prob)
return output_fn(train_pred_drop)
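A possible replacement: prepare_attention()/attention_decoder_fn_train()/dynamic_rnn_decoder() were removed, and in TF 1.x (including 1.8) the equivalent pipeline is BahdanauAttention + AttentionWrapper + TrainingHelper + BasicDecoder + dynamic_decode. The sketch below is a hedged rewrite of the function above under that assumption; it keeps the question's variable names (encoder_state, dec_cell, dec_embed_input, sequence_length, decoding_scope, output_fn, keep_prob, batch_size) and is not the one official migration path.

```python
# Sketch of the question's decoding_layer_train rewritten for the TF 1.x
# attention API (tf.contrib.seq2seq as of TF 1.8). Assumes the same inputs
# as the original function; not tested against the asker's full model.
import tensorflow as tf

def decoding_layer_train(encoder_state, dec_cell, dec_embed_input,
                         sequence_length, decoding_scope,
                         output_fn, keep_prob, batch_size):
    '''Decode the training data'''
    # Plays the role of the encoder outputs ("memory") that the
    # original code passed to prepare_attention().
    attention_states = tf.zeros([batch_size, 1, dec_cell.output_size])

    # BahdanauAttention replaces prepare_attention(attention_option="bahdanau").
    attention_mechanism = tf.contrib.seq2seq.BahdanauAttention(
        num_units=dec_cell.output_size,
        memory=attention_states)

    # AttentionWrapper folds attention into the cell, replacing the old
    # att_keys / att_vals / att_score_fn / att_construct_fn quadruple.
    attn_cell = tf.contrib.seq2seq.AttentionWrapper(
        dec_cell, attention_mechanism,
        attention_layer_size=dec_cell.output_size)

    # Seed the wrapped cell's state with the encoder state, as
    # attention_decoder_fn_train(encoder_state[0], ...) did.
    initial_state = attn_cell.zero_state(batch_size, tf.float32).clone(
        cell_state=encoder_state[0])

    # TrainingHelper + BasicDecoder + dynamic_decode replace
    # attention_decoder_fn_train + dynamic_rnn_decoder.
    helper = tf.contrib.seq2seq.TrainingHelper(dec_embed_input, sequence_length)
    decoder = tf.contrib.seq2seq.BasicDecoder(attn_cell, helper, initial_state)
    outputs, _, _ = tf.contrib.seq2seq.dynamic_decode(
        decoder, scope=decoding_scope)

    train_pred_drop = tf.nn.dropout(outputs.rnn_output, keep_prob)
    return output_fn(train_pred_drop)
```

Note that dynamic_decode returns a BasicDecoderOutput, so the logits live in outputs.rnn_output rather than being the return value itself, which is why the dropout line changes slightly from the original.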