
Here is my model:

from keras.layers import Input, Embedding, Flatten, Dense, Concatenate
from keras.models import Model


n_teams = 10888

# Shared embedding: one learned strength value per team
team_lookup = Embedding(input_dim=n_teams,
                        output_dim=1,
                        input_length=1,
                        name='Team-Strength')

teamid_in = Input(shape=(1,))

strength_lookup = team_lookup(teamid_in)

strength_lookup_flat = Flatten()(strength_lookup)

# Shared sub-model: both team inputs below reuse these same weights
team_strength_model = Model(teamid_in, strength_lookup_flat, name='Team-Strength-Model')

team_in_1 = Input(shape=(1,), name='Team-1-In')

team_in_2 = Input(shape=(1,), name='Team-2-In')

home_in = Input(shape=(1,), name='Home-In')

team_1_strength = team_strength_model(team_in_1)

team_2_strength = team_strength_model(team_in_2)

out = Concatenate()([team_1_strength, team_2_strength, home_in])

out = Dense(1)(out)

When I fit the model with 10888 inputs and run summary(), I get a total of 10,892 parameters. Please explain:

1) Where do the extra 4 parameters come from? and

2) The Team-Strength model is connected to both team inputs, so why are its 10888 parameters counted only once?

Here is the summary of the model:

__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
Team-1-In (InputLayer)          (None, 1)            0                                            
__________________________________________________________________________________________________
Team-2-In (InputLayer)          (None, 1)            0                                            
__________________________________________________________________________________________________
Team-Strength (Model)           (None, 1)            10888       Team-1-In[0][0]                  
                                                                 Team-2-In[0][0]                  
__________________________________________________________________________________________________
Home-In (InputLayer)            (None, 1)            0                                            
__________________________________________________________________________________________________
concatenate_1 (Concatenate)     (None, 3)            0           Team-Strength[1][0]              
                                                                 Team-Strength[2][0]              
                                                                 Home-In[0][0]                    
__________________________________________________________________________________________________
dense_1 (Dense)                 (None, 1)            4           concatenate_1[0][0]              
==================================================================================================
Total params: 10,892
Trainable params: 10,892
Non-trainable params: 0
__________________________________________________________________________________________________

1 Answer

To answer your questions:

  1. The 4 comes from output_size * (input_size + 1) = number_of_parameters. The Dense layer receives 3 inputs from concatenate_1[0][0] plus 1 bias, hence (3 + 1) * 1 = 4.

  2. 10888 is the number of parameters in the embedding layer (input_dim * output_dim = 10888 * 1), i.e. the total "vocabulary" of teams; it has nothing to do with the output shape (which is the second parameter to the Embedding). Because Team-1 and Team-2 are both fed through the same shared Team-Strength sub-model, those weights exist only once, so they are counted only once.
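The two points above can be checked with plain arithmetic (a minimal sketch; the layer sizes are taken from the summary in the question):

```python
# Reproduce the parameter count from the model summary by hand.
n_teams = 10888

# Embedding: input_dim * output_dim weights, no bias term.
# Counted once, even though two inputs pass through it (shared weights).
embedding_params = n_teams * 1

# Dense(1) on the 3-wide concatenation: (inputs + 1 bias) * units.
dense_params = (3 + 1) * 1

total = embedding_params + dense_params
print(total)  # 10892
```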

I hope it makes sense.
