If the TensorFlow Keras code is written in the functional style:

import tensorflow as tf
from tensorflow.keras import Input, Model

input_positive = Input(shape=(input_size,), name="input_query")
x_positive = tf.keras.layers.Dense(128, activation="relu")(input_positive)
x_positive = tf.keras.layers.Dense(128, activation=None)(x_positive),  # No activation on final dense layer
output = tf.keras.layers.Lambda(lambda x: tf.math.l2_normalize(x, axis=1))(x_positive)  # L2 normalize embeddings
self.model = Model(inputs=input_positive, outputs=output)

then the model's output shape is (1, None, 128). But if it is written sequentially:

self.model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(input_size,)),
    tf.keras.layers.Dense(128, activation=None),  # No activation on final dense layer
    tf.keras.layers.Lambda(lambda x: tf.math.l2_normalize(x, axis=1))  # L2 normalize embeddings
])

the model's output shape is (None, 128). Is there something I am missing?
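
For reference, the shapes can be read directly off each model (a quick sketch, assuming the two models above are bound to the hypothetical names functional_model and sequential_model):

print(functional_model.output_shape)  # (1, None, 128)
print(sequential_model.output_shape)  # (None, 128)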

It's a typo... remove the comma after (x_positive) and before the output layer in the functional version. – Marco Cerliani, Mar 4, 2021 at 13:59

1 Answer

I was able to replicate your issue using a sample input in the functional model:

import tensorflow as tf

input_positive = tf.keras.Input(shape=(1,), name="input_query")
x_positive = tf.keras.layers.Dense(128, activation="relu")(input_positive)
x_positive = tf.keras.layers.Dense(128, activation=None)(x_positive),  # No activation on final dense layer; note the stray trailing comma
output = tf.keras.layers.Lambda(lambda x: tf.math.l2_normalize(x, axis=1))(x_positive)  # L2 normalize embeddings

model = tf.keras.Model(inputs=input_positive, outputs=output)
model.summary()

Output:

Model: "model"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_query (InputLayer)     [(None, 1)]               0         
_________________________________________________________________
dense (Dense)                (None, 128)               256       
_________________________________________________________________
dense_1 (Dense)              (None, 128)               16512     
_________________________________________________________________
lambda (Lambda)              (1, None, 128)            0         
=================================================================
Total params: 16,768
Trainable params: 16,768
Non-trainable params: 0

Fixed code:

As rightly mentioned by @Marco Cerliani, you should remove the comma at the end of the second dense layer's line (i.e., right after (x_positive), just before the lambda layer that produces output). The trailing comma turns x_positive into a one-element Python tuple rather than a tensor; when that tuple reaches tf.math.l2_normalize inside the Lambda layer, TensorFlow converts it to a tensor by stacking, which adds a leading dimension of size 1 and produces the (1, None, 128) shape.

input_positive = tf.keras.Input(shape=(1,), name="input_query")
x_positive = tf.keras.layers.Dense(128, activation="relu")(input_positive)
x_positive = tf.keras.layers.Dense(128, activation=None)(x_positive)  # No activation on final dense layer
output = tf.keras.layers.Lambda(lambda x: tf.math.l2_normalize(x, axis=1))(x_positive)  # L2 normalize embeddings

model = tf.keras.Model(inputs=input_positive, outputs=output)
model.summary()

Output:

Model: "model_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_query (InputLayer)     [(None, 1)]               0         
_________________________________________________________________
dense_2 (Dense)              (None, 128)               256       
_________________________________________________________________
dense_3 (Dense)              (None, 128)               16512     
_________________________________________________________________
lambda_1 (Lambda)            (None, 128)               0         
=================================================================
Total params: 16,768
Trainable params: 16,768
Non-trainable params: 0
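
For completeness, a minimal eager-mode sketch of the mechanism, using a concrete batch size of 4 for illustration:

import tensorflow as tf

t = tf.zeros((4, 128))   # a plain rank-2 tensor
wrapped = (t,)           # trailing comma: a one-element tuple, not a tensor

print(type(wrapped))                                # <class 'tuple'>
print(tf.math.l2_normalize(t, axis=1).shape)        # (4, 128)
print(tf.math.l2_normalize(wrapped, axis=1).shape)  # (1, 4, 128) -- tuple stacked into an extra leading axis

The same stacking happens symbolically inside the Lambda layer of the functional model above, which is why its summary reports (1, None, 128).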