I am building a 1D convolutional network and am having some trouble with the input shape of my data. I had a look at some posts, and the usual cause of this error seems to be that the data has to be 3D, but my data is already 3D.
# shape
# x_train shape: (1228, 1452, 20)
# y_train shape: (1228, 1452, 8)
# x_val shape: (223, 680, 20)
# y_val shape: (223, 680, 8)
###
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv1D, MaxPooling1D, Dropout, Flatten, Dense

n_outputs = 8
n_timesteps = 1452
n_features = 20
model = Sequential()
model.add(Conv1D(filters=64, kernel_size=3, activation='relu', input_shape=x_train.shape[1:]))  # i.e. (1452, 20)
model.add(Conv1D(filters=64, kernel_size=3, activation='relu'))
model.add(Dropout(0.5))
model.add(MaxPooling1D(pool_size=2))
model.add(Flatten())
model.add(Dense(100, activation='relu'))
model.add(Dense(n_outputs, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(x_train, y_train,
          epochs=9,
          batch_size=64,
          shuffle=True)
But I keep on getting this error message:
ValueError: A target array with shape (1228, 1452, 8) was passed for an output of shape (None, 8) while using as loss `categorical_crossentropy`. This loss expects targets to have the same shape as the output.
What I gather from this is that the target shape, which is 3-dimensional, does not match the 2-dimensional output, so the loss can't be computed; I just need to find a way to reshape things so that they match.
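A quick check of both shapes shows the same mismatch (model.output_shape is available here because the input shape is fixed in the first layer):

print(model.output_shape)   # (None, 8)
print(y_train.shape)        # (1228, 1452, 8)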
EDIT
model.summary() is shown below
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv1d (Conv1D)              (None, 1450, 64)          3904
_________________________________________________________________
conv1d_1 (Conv1D)            (None, 1448, 64)          12352
_________________________________________________________________
dropout (Dropout)            (None, 1448, 64)          0
_________________________________________________________________
max_pooling1d (MaxPooling1D) (None, 724, 64)           0
_________________________________________________________________
flatten (Flatten)            (None, 46336)             0
_________________________________________________________________
dense (Dense)                (None, 100)               4633700
_________________________________________________________________
dense_1 (Dense)              (None, 8)                 808
=================================================================
Total params: 4,650,764
Trainable params: 4,650,764
Non-trainable params: 0
ANSWER
input_shape may need another comma so that the batch size is left unspecified. Perhaps input_shape=(1452, 20,) <-- notice the extra comma.

Your y values have an unexpected dimension. Notice that y_train has shape (1228, 1452, 8), but down at your last Dense layer you've got n_outputs = 8. What's actually hitting your output is (batch, 1452, 8), while it expects (batch, 8). Your y values should be (only) your one-hot encoded vectors of length 8: the network will only accept inputs of shape (batch, 1452, 20) and will only output shape (batch, 8). y_train is (batch, 1452, 8) but needs to be (batch, 8).

Also, your training and validation shapes don't agree: (batch, 1452, 20) for training but (batch, 680, 20) for validation, and 1452 != 680.
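Assuming each sequence really carries a single label that is just repeated along the 1452 timesteps (only you can confirm that from your data), a minimal sketch of that fix could look like the following; y_train_2d, y_val_2d and x_val_padded are just placeholder names:

import numpy as np

# Assumption: the label is constant along the time axis, so keeping one
# timestep's one-hot vector gives targets of shape (batch, 8).
y_train_2d = y_train[:, 0, :]   # (1228, 8)
y_val_2d   = y_val[:, 0, :]     # (223, 8)

# The Flatten layer fixes the input length at 1452 timesteps, so the
# 680-step validation sequences have to be padded (or truncated) to match.
pad = n_timesteps - x_val.shape[1]                        # 1452 - 680
x_val_padded = np.pad(x_val, ((0, 0), (0, pad), (0, 0)))  # zero-pad the time axis

model.fit(x_train, y_train_2d,
          epochs=9,
          batch_size=64,
          shuffle=True,
          validation_data=(x_val_padded, y_val_2d))

If instead every timestep needs its own class, then the 3D targets are correct and the model has to change rather than the labels: drop the Flatten/Dense head and predict per timestep (for example with TimeDistributed(Dense(8)) or a final Conv1D with a softmax activation), which is a different design altogether.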