
I am trying to calculate a 2D array of a function of variables x and y as a tf.function. The function is fairly complicated, and I want to build a 2D array of its values where x and y each range over a list of values (tf.linspace). I've tried passing in the relevant arguments for such a function; here is what it looks like:

@tf.function
def function_matrix(xi, xf, yi, yf, num, some_other_args):

    # part 1
    M = np.zeros((num, num))
    xlist = tf.linspace(xi, xf, num)
    ylist = tf.linspace(yi, yf, num)

    # part 2
    for x in range(num):
        for y in range(num):
            M[x, y] = some_complicated_function(xlist[x], ylist[y], some_other_args)  # this is also a @tf.function

    return M

The problem I'm encountering is that within a tf.function, if I try to access an element of an array like xlist[x], the result is Tensor("strided_slice:0", shape=(), dtype=float64). So when I pass this value into some_complicated_function, I get the error "setting an array element with a sequence". No such error occurs if function_matrix is not a tf.function. Could someone help with where I could be going wrong, or suggest an alternative way to calculate the 2D matrix of a fairly complicated function? Any help would be appreciated, thanks!

What I've tried: Part 1 runs fine. If I return xlist as the output of a function, I get a normal array, tf.Tensor([the_array_here], shape=(num,), dtype=float64). Similarly, if the output is xlist[index], I get tf.Tensor([the_element_here], shape=(), dtype=float64). But if I try to print xlist[index] from within the function, I get Tensor("strided_slice:0", shape=(), dtype=float64). So I am concluding that tf is somehow treating xlist[index] as a placeholder of some kind, but I don't know why.
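For what it's worth, here is a stripped-down sketch of the behaviour I mean (the function and argument names are just placeholders):

import tensorflow as tf

@tf.function
def show_tracing(xi, xf, num):
    xlist = tf.linspace(xi, xf, num)
    # During tracing, Python's print sees a symbolic tensor, e.g.
    # Tensor("strided_slice:0", shape=(), dtype=float64), not a value.
    print(xlist[0])
    return xlist[0]

out = show_tracing(tf.constant(0.0, tf.float64), tf.constant(1.0, tf.float64), 5)
print(out)  # tf.Tensor(0.0, shape=(), dtype=float64) once the graph has run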


1 Answer


Ooh, nice question! TensorFlow really doesn't like these for loops: they are Python code that can't be automatically converted to a TensorFlow graph representation. The way to implement this is to generate the grid you want to operate on as a tensor. Let's say:

xlist = [1, 2]  # this is a tf.Tensor
ylist = [1, 2]  # this is a tf.Tensor

then, using tf.meshgrid, you should construct xylist:

xylist = [[1, 1], [1, 2], [2, 1], [2, 2]]  # this is a tf.Tensor

and then use tf.map_fn to apply your function to each pair.

M = tf.map_fn(some_complicated_function, xylist)
M = tf.reshape(M, (...))
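To make the whole thing concrete, here is a rough sketch of how I would assemble it; some_complicated_function is assumed from the question, I've left out some_other_args for now (see below), and the dtype handling may need adjusting for your case:

import tensorflow as tf

@tf.function
def function_matrix(xi, xf, yi, yf, num):
    xlist = tf.linspace(xi, xf, num)
    ylist = tf.linspace(yi, yf, num)
    # Build the full (num, num) grid, then flatten it into (num*num, 2) pairs.
    X, Y = tf.meshgrid(xlist, ylist, indexing='ij')
    xylist = tf.stack([tf.reshape(X, [-1]), tf.reshape(Y, [-1])], axis=1)
    # Apply the function to every (x, y) pair; if its output dtype differs
    # from the inputs, pass fn_output_signature to tf.map_fn.
    M = tf.map_fn(lambda p: some_complicated_function(p[0], p[1]), xylist)
    return tf.reshape(M, (num, num))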

Note that if some_complicated_function contains any non-TensorFlow code (or code that cannot be automatically converted), such as NumPy, pandas, or Pillow calls, you can wrap it in a tf.py_function - but that kind of defeats the purpose of converting your function into a tf.function. (EDIT: I see now that you say "this is also a @tf.function", which means you don't have to wrap it in a tf.py_function.)
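If some part really is NumPy-only, a wrapper along these lines would work (a sketch; numpy_heavy_function is just a stand-in):

import numpy as np
import tensorflow as tf

def numpy_heavy_function(x, y):
    # Stand-in for code that needs real (eager) values, e.g. NumPy calls.
    return np.float64(x) * np.float64(y)

def wrapped(pair):
    # tf.py_function executes the Python body eagerly with concrete values.
    return tf.py_function(numpy_heavy_function, inp=[pair[0], pair[1]], Tout=tf.float64)

# M = tf.map_fn(wrapped, xylist, fn_output_signature=tf.float64)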

You can also include the extra_args by appending them to each pair in xylist (yes, to each pair, even though they are constant), as sketched below.
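A sketch of that, assuming the extra argument is a single scalar (the names are placeholders taken from the question):

import tensorflow as tf

# Tile the constant extra argument onto every (x, y) row of xylist.
extra = tf.ones_like(xylist[:, :1]) * tf.cast(some_extra_arg, xylist.dtype)
xy_plus_args = tf.concat([xylist, extra], axis=1)   # shape (num*num, 3)

# The mapped function now unpacks each row as x, y, extra_arg.
M = tf.map_fn(lambda row: some_complicated_function(row[0], row[1], row[2]),
              xy_plus_args)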

TL;DR: use tf.map_fn instead of nested for loops.


3 Comments

Good answer. I would suggest using tf.meshgrid instead of stack and concatenate, though.
Thanks! I knew it existed but forgot the name :D Included it.
Also, for the extra_args, if they don't change during the iteration, currying with functools.partial or a lambda seems appropriate.
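For example, a sketch along those lines (names taken from the question):

import functools
import tensorflow as tf

# Bind the constant arguments once, so the mapped function only sees (x, y).
bound = functools.partial(some_complicated_function, some_other_args=some_other_args)
M = tf.map_fn(lambda p: bound(p[0], p[1]), xylist)

# Equivalently, a plain lambda closing over the constants:
# M = tf.map_fn(lambda p: some_complicated_function(p[0], p[1], some_other_args), xylist)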
