Python: How do I use complex variables in TensorFlow eager mode?


In non-eager (graph) mode, I can run the following without any problems:

s = tf.complex(tf.Variable(1.0), tf.Variable(1.0))
train_op = tf.train.AdamOptimizer(0.01).minimize(tf.abs(s))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(5):
        _, s_ = sess.run([train_op, s])
        print(s_)

(1+1j)
(0.99+0.99j)
(0.98+0.98j)
(0.9700001+0.9700001j)
(0.9600001+0.9600001j)
But I can't seem to find the equivalent in eager mode. I tried the following, but TF complains:

tfe = tf.contrib.eager
s = tf.complex(tfe.Variable(1.0), tfe.Variable(1.0))
def obj(s):
    return tf.abs(s)
with tf.GradientTape() as tape:
    loss = obj(s)
    grads = tape.gradient(loss, [s])
    optimizer.apply_gradients(zip(grads, [s]))
The dtype of the source tensor must be floating (e.g. tf.float32) when calling GradientTape.gradient, got tf.complex64

No gradients provided for any variable: ['tf.Tensor((1+1j), shape=(), dtype=complex64)']


How can I train complex variables in eager mode?

With eager mode in TensorFlow 2, you can keep the real and imaginary parts as real-valued variables:

r, i = tf.Variable(1.0), tf.Variable(1.0)

def obj(s):
    return tf.abs(s)

with tf.GradientTape() as tape:
    s = tf.complex(r, i)  # build the complex value inside the tape
    loss = obj(s)
# differentiate with respect to the real-valued variables, not the complex tensor
grads = tape.gradient(loss, [r, i])
optimizer.apply_gradients(zip(grads, [r, i]))
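For completeness, here is a minimal end-to-end sketch of the same idea as a training loop. It assumes TensorFlow 2.x and uses tf.keras.optimizers.Adam with the 0.01 learning rate from the question's graph-mode example; any other optimizer would work the same way:

import tensorflow as tf

# Keep the real and imaginary parts as separate real-valued variables.
r, i = tf.Variable(1.0), tf.Variable(1.0)
optimizer = tf.keras.optimizers.Adam(0.01)

for step in range(5):
    with tf.GradientTape() as tape:
        # Record the construction of the complex value and the loss.
        s = tf.complex(r, i)
        loss = tf.abs(s)
    # Differentiate with respect to the real variables, not the complex tensor.
    grads = tape.gradient(loss, [r, i])
    optimizer.apply_gradients(zip(grads, [r, i]))
    print(tf.complex(r, i).numpy())

The real and imaginary parts should decrease by roughly 0.01 per step, matching the graph-mode output in the question.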
