Keras throws an error message on this trivial example and I need help. Also, is there documentation on the tensor dimensions that Conv2D and Input expect? I've spent too much time trying to find a solution, mutating/rotating the tensor every which way...
My specs: Windows 10 x64, Python 3.6 (from Anaconda 3 x64), Keras 2.0.9, TensorFlow 1.4.0
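For reference, Conv2D with the default channels_last data format expects a 4D tensor of shape (batch, rows, cols, channels), and Input(shape=...) takes that shape without the batch dimension. A quick way to check what a layer actually produces is to build a tiny model and inspect its output shape; a minimal sketch (exact import paths can differ between Keras versions):

```python
import numpy as np
from keras.models import Model
from keras.layers import Conv2D, Input

# channels_last (the default): one sample is (rows, cols, channels)
inp = Conv2D  # placeholder name check only
inp = Input(shape=(99, 81, 1))
out = Conv2D(16, kernel_size=2, padding='same')(inp)
m = Model(inputs=inp, outputs=out)

# The batch dimension is None; padding='same' with stride 1 preserves
# rows and cols, and the last axis becomes the number of filters.
print(m.output_shape)  # (None, 99, 81, 16)
```

Note that the model's output here is still 4D, which is exactly what the target array is compared against during fit.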
import numpy as np
from keras.models import Model
from keras.layers import Conv2D, Input
from keras.utils.np_utils import to_categorical
n_samples, n_row, n_col, n_channels = 1006, 99, 81, 1
tX = np.random.rand(n_samples, n_row, n_col, n_channels)
tY = np.random.randint(0,5,n_samples)
inp = Input(shape=(n_row, n_col, n_channels))
lr = Conv2D(16, kernel_size=2, padding='same')(inp)
M = Model(inputs=inp, outputs=lr)
M.compile(optimizer='Adam', loss='categorical_crossentropy')
M.fit(tX, to_categorical(tY, num_classes=None))
This gives the error message:
Traceback (most recent call last):
File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Anaconda3_64\lib\site-packages\IPython\core\interactiveshell.py", line 2862, in run_code
exec(code_obj, self.user_global_ns, self.user_ns)
File "<ipython-input-11-27bd9e59639d>", line 14, in <module>
M.fit(tX, to_categorical(tY, num_classes=None))
File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Anaconda3_64\lib\site-packages\keras\engine\training.py", line 1581, in fit
batch_size=batch_size)
File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Anaconda3_64\lib\site-packages\keras\engine\training.py", line 1418, in _standardize_user_data
exception_prefix='target')
File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Anaconda3_64\lib\site-packages\keras\engine\training.py", line 141, in _standardize_input_data
str(array.shape))
ValueError: Error when checking target: expected conv2d_3 to have 4 dimensions, but got array with shape (1006, 5)
You're using categorical cross entropy in your loss function while you are generating only one label with tY. That's why it's throwing this error. Either have a number of categories in your labels and have them one-hot encoded, or change your loss function accordingly.

Update from the asker: I've tried to_categorical from Keras, but to no avail. It still throws the same error. I've also updated all key packages to the latest versions (tensorflow, keras, ...).
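The shape in the traceback points at the real mismatch: the one-hot target is (1006, 5), but the model ends at the Conv2D layer, whose output is 4D. The target must match the model's output shape, so the conv features need to be reduced to a 5-way prediction per sample. One way to reconcile them, a sketch assuming 5-class classification is the goal (Flatten and Dense are standard Keras layers; the to_categorical import path varies by Keras version):

```python
import numpy as np
from keras.models import Model
from keras.layers import Conv2D, Dense, Flatten, Input
from keras.utils import to_categorical  # older Keras: keras.utils.np_utils

n_samples, n_row, n_col, n_channels, n_classes = 1006, 99, 81, 1, 5
tX = np.random.rand(n_samples, n_row, n_col, n_channels)
tY = np.random.randint(0, n_classes, n_samples)

inp = Input(shape=(n_row, n_col, n_channels))
x = Conv2D(16, kernel_size=2, padding='same')(inp)
x = Flatten()(x)                                 # (99, 81, 16) -> vector
out = Dense(n_classes, activation='softmax')(x)  # one probability per class

M = Model(inputs=inp, outputs=out)
M.compile(optimizer='adam', loss='categorical_crossentropy')
# Output is now (None, 5), matching the (1006, 5) one-hot targets.
M.fit(tX, to_categorical(tY, num_classes=n_classes), epochs=1, verbose=0)
```

With the Dense head in place the target and output shapes agree, so _standardize_user_data no longer complains about a 2D target against a 4D output.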