Chris Gorman - 2 months ago
Python Question

Hard limiting / threshold activation function in TensorFlow

I'm trying to implement a basic, binary Hopfield network in TensorFlow 0.9. Unfortunately I'm having a very hard time getting the activation function to work. I'm looking for the very simple rule

If net[i] < 0, output[i] = 0, else output[i] = 1
but everything I've tried seems to remove the gradient, i.e. I get the "No gradients provided for any variable" exception when trying to implement the training op.
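
In NumPy terms, the behavior I'm after would just be something like this (shown only to make the target concrete; the problem is doing it in TensorFlow without losing the gradient):

import numpy as np

net = np.array([-2.0, -0.3, 0.0, 0.5])
output = np.where(net < 0, 0.0, 1.0)   # -> [ 0.  0.  1.  1.]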

For example, I tried casting tf.less() to float, and I tried doing something along the lines of

tf.maximum(tf.minimum(net, 0) + 1, 0)


but I forgot that small negative values (anything between -1 and 0) leak through as fractions instead of 0. Finally I did

tf.maximum(tf.floor(tf.minimum(net, 0) + 1), 0)


but tf.floor doesn't register gradients. I also tried replacing the floor with a cast to int and then a cast back to float, but same deal.
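
To make that concrete, here is roughly what the two versions do on a small example (assuming the TF 0.9-style Session API):

import tensorflow as tf

net = tf.constant([-2.0, -0.3, 0.0, 0.5])

# first attempt: values of net between -1 and 0 leak through as fractions
v1 = tf.maximum(tf.minimum(net, 0) + 1, 0)

# floor version: output is properly binary, but tf.floor has no registered gradient
v2 = tf.maximum(tf.floor(tf.minimum(net, 0) + 1), 0)

with tf.Session() as sess:
    print(sess.run(v1))  # roughly [ 0.   0.7  1.   1. ]
    print(sess.run(v2))  # -> [ 0.  0.  1.  1.]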

Any suggestions on what I could do?

Answer

A bit late, but if anyone needs it, I used this definition:

import tensorflow as tf

def binary_activation(x):
    # elementwise threshold: 0 where x < 0, 1 otherwise
    cond = tf.less(x, tf.zeros(tf.shape(x)))
    out = tf.select(cond, tf.zeros(tf.shape(x)), tf.ones(tf.shape(x)))
    return out

with x being a tensor.
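
As a quick sanity check, something like this should evaluate it (again assuming the TF 0.9-style Session API and the definition above):

net = tf.constant([[-1.5, 0.2], [0.0, -0.01]])
out = binary_activation(net)

with tf.Session() as sess:
    print(sess.run(out))  # -> [[ 0.  1.]
                          #     [ 1.  0.]]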