
Gradient of tf.floor is None

tf.floor returns a None gradient, which means the weights before the floor operation won't be updated, right? But I still need the gradient to update the weights. Isn't that weird? Sometimes we use f…
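For context, here is a minimal snippet reproducing the behavior the question describes; it assumes the TF 1.x-style graph API and is not part of the original question:

import tensorflow as tf

x = tf.Variable(10.)
y = tf.floor(x)
# Floor's registered gradient returns None, so no gradient flows back to x
print(tf.gradients(y, x))  # [None]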

Solution 1:

TensorFlow uses None to represent 0 for implementation reasons. I don't think it would help you if the returned gradient were the true gradient of 0, since you want to train. You could use gradient_override_map to substitute the gradient of the Identity op for that of Floor, as follows:

import tensorflow as tf  # TF 1.x API (use tf.compat.v1 on TF 2.x)

tf.reset_default_graph()
x = tf.Variable(10.)
# Use Identity's gradient whenever the gradient of a Floor op is requested
with tf.get_default_graph().gradient_override_map({"Floor": "Identity"}):
    x2 = tf.floor(x)
loss = tf.square(x2)
opt = tf.train.GradientDescentOptimizer(0.1)
train_op = opt.minimize(loss)
sess = tf.Session()
sess.run(tf.global_variables_initializer())
for i in range(10):
    print(sess.run([loss, train_op]))
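The snippet above relies on the TF 1.x graph API. On TF 2.x, a similar pass-through gradient can be sketched with tf.custom_gradient; this is an assumption about how one might adapt the idea, not part of the original answer:

import tensorflow as tf

@tf.custom_gradient
def floor_with_identity_grad(x):
    # Forward pass: ordinary floor; backward pass: pass the incoming gradient through unchanged
    def grad(dy):
        return dy
    return tf.floor(x), grad

x = tf.Variable(10.)
with tf.GradientTape() as tape:
    loss = tf.square(floor_with_identity_grad(x))
print(tape.gradient(loss, x))  # 20.0 instead of None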
