
I am trying to use an already trained neural network but always stumble upon this error. My input layer has a size of 70 numeric elements, and the input data xData loaded from "last.csv" has the same size:

[0.36 0.44 0.7  0.82 0.72 0.06 0.08 0.32 0.84 0.62 0.08 0.42 0.12 0.08
 0.6  0.48 0.52 0.08 0.28 0.2  0.18 0.4  0.68 0.98 0.32 0.06 0.2  0.04
 0.76 0.62 0.48 0.8  0.38 0.2  0.14 0.5  0.06 0.64 0.2  0.86 0.06 0.02
 0.98 0.7  0.12 0.78 0.24 0.18 0.08 0.04 0.18 0.72 0.94 0.46 0.18 0.04
 0.48 0.7  0.56 0.96 0.5  0.16 0.08 0.12 0.9  0.94 0.76 0.58 0.04 0.06]

My program itself looks like this:

import numpy as np
from keras.models import load_model

# Load the 70-element input vector
xData = np.loadtxt('../input/last.csv')
print(xData)

model = load_model("model.dat")
prediction = model.predict(np.array(xData))
print(prediction)

So...any idea why it thinks the input size is 1 and not 70?

1 Answer

In Keras, the first dimension of a tensor fed into a model must be the batch size: (batch_size, dim). So here your model interprets the input as 70 samples of size one, rather than one sample of size 70.

Take your input vector and expand its dimensions to add a leading batch axis, like this:

xData = np.expand_dims(xData, 0)

and then feed it into your network.
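A minimal sketch of the shape change with plain NumPy (the values are placeholders for the data loaded from last.csv; the model itself is not needed to see the effect):

```python
import numpy as np

# A 70-element vector, as loaded from last.csv (placeholder values)
xData = np.random.rand(70)
print(xData.shape)  # (70,) -- Keras reads this as 70 samples of size 1

# Add a leading batch dimension so Keras sees 1 sample of size 70
batch = np.expand_dims(xData, 0)
print(batch.shape)  # (1, 70)

# Equivalent alternative using reshape; -1 infers the remaining dimension
batch2 = xData.reshape(1, -1)
print(batch2.shape)  # (1, 70)
```

With the batch axis in place, model.predict(batch) receives an input of shape (1, 70), matching the 70-element input layer.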
