TensorFlow: what happens if control dependencies form a cycle?


Look at this code; the logic is:

'''grads depend on total_loss'''
grads = optimizer.compute_gradients(
    total_loss,
    variables_to_train,
    gate_gradients=gate_gradients,
    aggregation_method=aggregation_method,
    colocate_gradients_with_ops=colocate_gradients_with_ops)

'''grad_updates depends on grads, so it also depends on total_loss'''
grad_updates = optimizer.apply_gradients(grads, global_step=global_step)

'''but total_loss depends on grad_updates'''
train_op = control_flow_ops.with_dependencies([grad_updates], total_loss)
Look at the comments: total_loss depends on grad_updates, and grad_updates in turn depends on total_loss. What happens when the control dependencies form a loop like this?
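To make the apparent loop concrete, here is a toy dependency-graph model (plain Python, not TensorFlow): it encodes the three edges exactly as the code comments state them and runs a standard DFS cycle check. The node names and the `has_cycle` helper are illustrative inventions, not TensorFlow APIs; taken literally, the comments do describe a cycle.

```python
def has_cycle(deps):
    """Detect a cycle in a dependency graph via DFS coloring."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {node: WHITE for node in deps}

    def visit(node):
        color[node] = GRAY
        for dep in deps.get(node, []):
            if color.get(dep, WHITE) == GRAY:
                return True  # back edge found: a cycle exists
            if color.get(dep, WHITE) == WHITE and visit(dep):
                return True
        color[node] = BLACK
        return False

    return any(visit(n) for n in deps if color[n] == WHITE)

# Edges exactly as the code comments describe them:
deps = {
    "grads":        ["total_loss"],    # grads depends on total_loss
    "grad_updates": ["grads"],         # grad_updates depends on grads
    "total_loss":   ["grad_updates"],  # "total_loss depends on grad_updates"?
}
print(has_cycle(deps))  # True -- read literally, this would be circular
```

Note this toy model only captures the comments' wording; in the real graph, `control_flow_ops.with_dependencies` returns a new tensor (`train_op`), so whether an actual cycle exists in the TensorFlow graph is exactly what the question asks.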