
I'm using a model where the TensorFlow relu function is used as the activation for the hidden layers. So basically the model does this:

h = tf.nn.relu(zw)

where zw is the output of the previous layer multiplied by the weights. According to TensorFlow's definition of relu, it returns

max(zw, 0)

i.e., for each element of the tensor, the maximum of that element's value and 0.
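For example (a minimal sketch with made-up values, assuming TensorFlow 2.x eager execution):

import tensorflow as tf

# tf.nn.relu clamps every negative element to zero, elementwise.
zw = tf.constant([-2.0, -0.5, 0.0, 1.0, 3.0])
h = tf.nn.relu(zw)
print(h)  # [0. 0. 0. 1. 3.]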

How can I apply my own relu function that returns each element of zw unchanged if it is above 0, and the element times 0.1 if it is below 0?

1 Answer


You could do something like this:

h = tf.where(zw < 0, 0.1 * zw, zw)
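This is the standard leaky ReLU, which TensorFlow also provides built in as tf.nn.leaky_relu. A minimal sketch comparing the two, assuming TensorFlow 2.x eager execution and made-up tensor values:

import tensorflow as tf

zw = tf.constant([-2.0, -0.5, 0.0, 1.0, 3.0])

# Manual version: scale negative elements by 0.1, pass positives through.
h_manual = tf.where(zw < 0, 0.1 * zw, zw)

# Built-in version with the same slope for negative inputs.
h_builtin = tf.nn.leaky_relu(zw, alpha=0.1)

print(h_manual)   # [-0.2 -0.05 0. 1. 3.]
print(h_builtin)  # same values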
