
I've made a new op and I'd like to use it with AdamOptimizer. I've created a gradient for it following the instructions here and added it to my optimizer's var_list, but TensorFlow says that my variable doesn't have a processor.

Is there support for TensorFlow custom ops in optimizers? Does the Optimizer class let me create a new processor, or would I have to rewrite part of compute_gradients?
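For context, what I'm doing looks roughly like the following minimal TF 1.x-style sketch (the op name, the .so path, and the gradient math are placeholders rather than my real op):

    import tensorflow as tf
    from tensorflow.python.framework import ops

    # Placeholder: load the compiled custom-op library (path is hypothetical).
    my_module = tf.load_op_library('./my_op.so')

    # Register a gradient for the op type "MyOp" (name and math are placeholders).
    # The function receives the op and dL/d(output) and must return dL/d(input)
    # for each of the op's inputs.
    @ops.RegisterGradient("MyOp")
    def _my_op_grad(op, grad):
        x = op.inputs[0]
        return [2.0 * x * grad]  # e.g. if the forward op computed x**2

    w = tf.Variable(tf.ones([10]))
    y = my_module.my_op(w)
    loss = tf.reduce_sum(tf.square(y))

    # This is the step that fails for me: putting the op's output (rather than
    # a tf.Variable) into var_list triggers the "no processor" complaint.
    train_op = tf.train.AdamOptimizer(0.001).minimize(loss, var_list=[y])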

Also, what does automatic differentiation mean, as stated by the TF docs:

To make automatic differentiation work for new ops, you must register a gradient function which computes gradients with respect to the ops' inputs given gradients with respect to the ops' outputs.

Thanks!

  • Do you mean adding a custom op and registering the gradient? The statement just means that we infer the gradient by applying the chain rule. User ops are slightly different from regular ops. A good simple example is the sigmoid (see the sketch just below these comments). Commented Jul 26, 2017 at 18:00
  • No, I was doing something different. Thanks for the clarification on auto differentiation! Commented Aug 14, 2017 at 20:07
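To make the sigmoid example from the first comment concrete: for y = sigmoid(x), dy/dx = y * (1 - y), so given the incoming gradient dL/dy the chain rule gives dL/dx = dL/dy * y * (1 - y). TensorFlow already ships a gradient for its built-in Sigmoid op; the sketch below only shows what such a registration would look like for a hypothetical custom op with the same forward pass:

    from tensorflow.python.framework import ops

    # Hypothetical op type "MySigmoid" with the same forward pass as Sigmoid:
    # y = 1 / (1 + exp(-x)), so dy/dx = y * (1 - y).
    # `grad` is dL/dy; the chain rule gives dL/dx = grad * y * (1 - y).
    @ops.RegisterGradient("MySigmoid")
    def _my_sigmoid_grad(op, grad):
        y = op.outputs[0]  # reuse the forward output instead of recomputing it
        return grad * y * (1.0 - y)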

1 Answer

So I found out that what I was doing is not supported by the TensorFlow optimizers.

I was trying to create an op that would act like a TensorFlow variable, i.e. get updated by the machinery inside Optimizer.minimize(). However, minimize() (through compute_gradients() and apply_gradients()) wraps each var_list entry in a "processor", and as far as I can tell those processors and the underlying Eigen::Tensor update kernels exist only for variable types, so this naturally doesn't work with Op classes.
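As far as I can tell, the supported pattern is to keep the trainable state in a tf.Variable and let the custom op sit between the variable and the loss; the registered gradient then lets compute_gradients() backpropagate through the op to the variable. A minimal sketch, with the same placeholder op as in the question:

    import tensorflow as tf

    # Placeholder custom-op library, as in the question.
    my_module = tf.load_op_library('./my_op.so')

    # The trainable state lives in a tf.Variable, which the optimizer does
    # know how to update.
    w = tf.Variable(tf.zeros([10]))

    # The custom op sits in the graph between the variable and the loss; its
    # registered gradient lets backprop flow through it back to `w`.
    y = my_module.my_op(w)
    loss = tf.reduce_sum(tf.square(y - 1.0))

    # var_list contains the Variable itself, not the op's output.
    train_op = tf.train.AdamOptimizer(0.001).minimize(loss, var_list=[w])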
