
I'm using Keras, and I want to make a layer that takes [a0, a1] and [b0, b1, b2] as inputs and gives [a0*b0, a0*b1, a0*b2, a1*b0, a1*b1, a1*b2] as output. I tried to use a Lambda layer, but I couldn't get it to work. Here's my code:

import numpy as np
from keras.models import Input
from keras.layers import Lambda

def mix(A):
    reshaped = [np.reshape(A[m], (1,np.size(A[m]))) for m in range(len(A))]
    mixed = reshaped[-1]

    for i in range(len(A)-1):
        mixed = np.matmul(np.transpose(reshaped[-i-2]), mixed)
        mixed = np.reshape(mixed, (1,np.size(mixed)))

    return np.reshape(mixed, np.size(mixed))

a = Input(shape=(2,))
b = Input(shape=(3,))
c = Lambda(mix)([a, b])

Here's the error I got:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-32-07bbf930b48b> in <module>()
      1 a = Input(shape=(2,))
      2 b = Input(shape=(3,))
----> 3 c = Lambda(mix)([a, b])

~\Anaconda3\envs\mind\lib\site-packages\keras\engine\base_layer.py in __call__(self, inputs, **kwargs)
    455             # Actually call the layer,
    456             # collecting output(s), mask(s), and shape(s).
--> 457             output = self.call(inputs, **kwargs)
    458             output_mask = self.compute_mask(inputs, previous_mask)
    459 

~\Anaconda3\envs\mind\lib\site-packages\keras\layers\core.py in call(self, inputs, mask)
    685         if has_arg(self.function, 'mask'):
    686             arguments['mask'] = mask
--> 687         return self.function(inputs, **arguments)
    688 
    689     def compute_mask(self, inputs, mask=None):

<ipython-input-31-bbc21320d8af> in mix(A)
      4 
      5     for i in range(len(A)-1):
----> 6         mixed = np.matmul(np.transpose(reshaped[-i-2]), mixed)
      7         mixed = np.reshape(mixed, (1,np.size(mixed)))
      8 

TypeError: Object arrays are not currently supported

But if I put:

a = np.array([1,2])
b = np.array([3,4,5])
print(mix([a,b]))

then I get:

[ 3  4  5  6  8 10]

which is exactly what I intended. But I don't know how to put this in Lambda properly.
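In other words, what I want is the flattened outer product of the two input vectors; in plain NumPy the same result can be written as:

```python
import numpy as np

# The target operation is a flattened outer product
a = np.array([1, 2])
b = np.array([3, 4, 5])
print(np.outer(a, b).ravel())  # [ 3  4  5  6  8 10]
```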

Can anyone tell me how to handle this? I'm new to Keras, so I don't know the internal structure of Lambda, Input, or the other components.


Following Abhijit's comment, I changed the code like this:

import numpy as np
import tensorflow as tf
from keras.models import Model, Input
from keras.layers import Lambda

def mix(A):
    reshaped = [tf.reshape(A[m], (1,tf.size(A[m]))) for m in range(len(A))]
    mixed = reshaped[-1]

    for i in range(len(A)-1):
        mixed = tf.matmul(tf.transpose(reshaped[-i-2]), mixed)
        mixed = tf.reshape(mixed, (1,tf.size(mixed)))

    return tf.reshape(mixed, [tf.size(mixed)])

a = Input(shape=(2,))
b = Input(shape=(3,))
c = Lambda(mix)([a, b])

Now I don't get any errors, but I don't think I got the right neural network, because when I execute:

model = Model(inputs=[a,b], outputs=c)
print(model.summary())

I get:

__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_22 (InputLayer)           (None, 2)            0                                            
__________________________________________________________________________________________________
input_23 (InputLayer)           (None, 3)            0                                            
__________________________________________________________________________________________________
lambda_3 (Lambda)               (None,)              0           input_22[0][0]                   
                                                                 input_23[0][0]                   
==================================================================================================
Total params: 0
Trainable params: 0
Non-trainable params: 0
__________________________________________________________________________________________________

But look at the lambda_3 layer: shouldn't its output shape be (None, 6)?

  • Numpy arrays don't support auto differentiation. Your lambda function must use tensorflow operations, whose derivatives are automatically calculated during backprop. Since your code is straightforward, use tf instead of np and it will work. Commented Nov 5, 2018 at 1:54
  • @AbhijitBalaji I followed your comment and got no error. But I think I have another problem. I edited my question. Can you please take a look? Commented Nov 5, 2018 at 3:12

1 Answer


Apart from the fact that you need to use Keras backend functions (i.e. keras.backend.*) or the backend's own functions directly (i.e. tf.* or th.*), I think you are making the definition of mix unnecessarily complicated. It can be done much more simply like this:

from keras import backend as K
from keras.layers import Input, Lambda
from keras.models import Model

def mix(ts):
    # ts[0]: (batch, 2) -> (batch, 2, 1); ts[1]: (batch, 3) -> (batch, 1, 3)
    t0 = K.expand_dims(ts[0], axis=-1)
    t1 = K.expand_dims(ts[1], axis=1)
    # broadcasting gives (batch, 2, 3); flatten to (batch, 6)
    return K.batch_flatten(t0 * t1)

a = Input(shape=(2,))
b = Input(shape=(3,))
c = Lambda(mix)([a, b])

model = Model(inputs=[a, b], outputs=c)
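To see what the broadcasting does, here is the same computation in plain NumPy for a batch of one sample (NumPy follows the same broadcasting rules as the backend; shapes are shown as comments):

```python
import numpy as np

# Same broadcasting trick as in mix(), for a batch of one sample
a = np.array([[1., 2.]])                 # shape (1, 2)
b = np.array([[3., 4., 5.]])             # shape (1, 3)
t0 = a[:, :, None]                       # (1, 2, 1), like K.expand_dims(..., -1)
t1 = b[:, None, :]                       # (1, 1, 3), like K.expand_dims(..., 1)
out = (t0 * t1).reshape(a.shape[0], -1)  # broadcast to (1, 2, 3), flatten to (1, 6)
print(out)  # [[ 3.  4.  5.  6.  8. 10.]]
```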

Here is the test:

# the reshapes are necessary to make them a batch
a = np.array([1,2]).reshape(1,2)
b = np.array([3,4,5]).reshape(1,3)
print(model.predict([a, b]))

# output
[[ 3.  4.  5.  6.  8. 10.]]

Further, the Lambda layer can often infer the output shape automatically, but if you like you can set it explicitly:

c = Lambda(mix, output_shape=(6,))([a, b])

Model summary:

Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_9 (InputLayer)            (None, 2)            0                                            
__________________________________________________________________________________________________
input_10 (InputLayer)           (None, 3)            0                                            
__________________________________________________________________________________________________
lambda_5 (Lambda)               (None, 6)            0           input_9[0][0]                    
                                                                 input_10[0][0]                   
==================================================================================================
Total params: 0
Trainable params: 0
Non-trainable params: 0
__________________________________________________________________________________________________