I've built the following TensorArray:

ta = tf.TensorArray(
    dtype=tf.float32,   
    size=0,
    dynamic_size=True,
    element_shape=tf.TensorShape([None, None])
)

and called ta = ta.write(idx, my_tensor) inside a while_loop.

When evaluating the output = ta.stack() tensor in a session, I receive this error message:

ValueError: Cannot use '.../TensorArrayWrite/TensorArrayWriteV3' as input to '.../TensorArrayStack_1/TensorArraySizeV3' because '.../TensorArrayWrite/TensorArrayWriteV3' is in a while loop. See info log for more details.

I don't understand this error message. Could you please help me?

Update: A minimal example might be difficult to come up with, but this is what I am doing: I am using a reference to the ta TensorArray inside the cell_input_fn of AttentionWrapper. This callback is invoked in AttentionWrapper's call method, where another TensorArray named alignment_history is also written. The while_loop code is therefore not designed by me; it is part of TF's dynamic RNN computation, tf.nn.dynamic_rnn.
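Schematically, my setup looks like this (an illustrative reconstruction, not my actual code: the encoder inputs, cell, and attention mechanism are made up, and idx and my_tensor are as above):

import tensorflow as tf  # TF 1.x, with tf.contrib available

encoder_inputs = tf.random_normal([4, 7, 8])  # [batch, time, depth], made up
cell = tf.nn.rnn_cell.LSTMCell(16)
attention_mechanism = tf.contrib.seq2seq.LuongAttention(
    num_units=16, memory=encoder_inputs)

ta = tf.TensorArray(tf.float32, size=0, dynamic_size=True,
                    element_shape=tf.TensorShape([None, None]))

def cell_input_fn(inputs, attention):
    global ta
    ta = ta.write(idx, my_tensor)  # runs inside dynamic_rnn's while_loop
    return tf.concat([inputs, attention], -1)  # the default behaviour

attn_cell = tf.contrib.seq2seq.AttentionWrapper(
    cell, attention_mechanism, alignment_history=True,
    cell_input_fn=cell_input_fn)
outputs, state = tf.nn.dynamic_rnn(attn_cell, encoder_inputs,
                                   dtype=tf.float32)
output = ta.stack()  # evaluating this raises the ValueError above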

2 Comments
  • Hard to tell without any more code. Can you please post a full minimal example of the issue? Commented Feb 11, 2019 at 14:08
  • @jdehesa I updated my post a few hours ago; please let me know if this is clear enough. Commented Feb 11, 2019 at 21:06

1 Answer


Not sure if this is what's biting you, but you have to make sure your while_loop body function takes the TensorArray as input and emits the updated one as output, and you have to use the final version of the TensorArray that comes out of the while_loop:

def fn(ta_old):
  return ta_old.write(...)

ta_final = tf.while_loop(..., body=fn, loop_vars=[tf.TensorArray(...)])

values = ta_final.stack()

Specifically, you should never access ta_old outside of fn().
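For concreteness, here is a minimal runnable version of that pattern (TF 1.x graph mode; the loop bound of 5 and the [2, 3] element shape are made up for illustration):

import tensorflow as tf  # TF 1.x

ta = tf.TensorArray(dtype=tf.float32, size=0, dynamic_size=True,
                    element_shape=tf.TensorShape([None, None]))

def cond(i, ta):
    return i < 5

def body(i, ta):
    # Return the written array so it is threaded through as a loop variable.
    return i + 1, ta.write(i, tf.fill([2, 3], tf.cast(i, tf.float32)))

_, ta_final = tf.while_loop(cond, body, loop_vars=[0, ta])
values = ta_final.stack()  # stack the post-loop array, never the original ta

with tf.Session() as sess:
    print(sess.run(values).shape)  # (5, 2, 3)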


2 Comments

The error gets triggered by the size call in _GraphTensorArray, which comes from stack. Since you know the AttentionWrapper code well, would it be possible to make cell_input_fn use the same RNN while_loop and update a TensorArray alongside the alignment_history TensorArray?
Based on your comment, I think I understand: the while_loop inside dynamic_rnn takes the cell state, an AttentionWrapperState, as one of its loop variables, but my TensorArray referenced in cell_input_fn never gets added to the loop vars. I would need a better design; one possibility is sketched below.
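One possible redesign (my suggestion, not something covered elsewhere in this thread): tf.nn.raw_rnn threads a user-supplied loop_state, which may be a TensorArray, through the RNN's internal while_loop as a proper loop variable. A sketch with made-up sizes and a plain LSTM cell standing in for the wrapped attention cell:

import tensorflow as tf  # TF 1.x

batch_size, max_time, input_depth = 4, 7, 8
inputs = tf.random_normal([max_time, batch_size, input_depth])  # time-major
sequence_length = tf.fill([batch_size], max_time)
cell = tf.nn.rnn_cell.LSTMCell(16)

inputs_ta = tf.TensorArray(tf.float32, size=max_time).unstack(inputs)

def loop_fn(time, cell_output, cell_state, loop_state):
    emit_output = cell_output  # None for time == 0
    if cell_output is None:  # time == 0: initialise state and the TensorArray
        next_cell_state = cell.zero_state(batch_size, tf.float32)
        next_loop_state = tf.TensorArray(tf.float32, size=0, dynamic_size=True)
    else:
        next_cell_state = cell_state
        # The written array is returned as loop_state, so it stays a loop var.
        next_loop_state = loop_state.write(time - 1, cell_output)
    elements_finished = time >= sequence_length
    next_input = tf.cond(
        tf.reduce_all(elements_finished),
        lambda: tf.zeros([batch_size, input_depth], tf.float32),
        lambda: inputs_ta.read(time))
    return (elements_finished, next_input, next_cell_state,
            emit_output, next_loop_state)

outputs_ta, final_state, my_ta = tf.nn.raw_rnn(cell, loop_fn)
values = my_ta.stack()  # fine: the array came back out of the loop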
