I'm not entirely sure what's going on here, but Python keeps throwing an error at me. For context, this is part of a small neural network I'm building for fun. It uses np.array heavily, so there are a lot of matrices being passed around, and I suspect some sort of data type clash. Maybe somebody can help me figure this out, because I've been staring at this error for too long without being able to fix it.
#cross-entropy error
#y is a vector of size N and output is an Nx3 array
def CalculateError(self, output, y):
    #calculate total error against the vector y for the neurons where output = 1 (the rest are 0)
    totalError = 0
    for i in range(len(y)):
        totalError += -np.log(output[i, int(y[i])])  #error is thrown here
    #now account for the regularizer
    totalError += (self.regLambda / self.inputDim) * (np.sum(np.square(self.W1)) + np.sum(np.square(self.W2)))
    error = totalError / len(y)  #divide by N
    return error
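For anyone who wants to reproduce this, here's a minimal standalone version of that loop with made-up numbers (the 4x3 output array and the 0-based labels are just for illustration, not my real data):

import numpy as np

#hypothetical stand-in data: N = 4 samples, 3 classes
output = np.array([[0.7, 0.2, 0.1],
                   [0.1, 0.8, 0.1],
                   [0.3, 0.3, 0.4],
                   [0.2, 0.5, 0.3]])  #each row sums to 1, like a softmax output
y = np.array([0, 1, 2, 1])            #true class index per sample (0-based here)

totalError = 0
for i in range(len(y)):
    #-log of the probability assigned to the true class
    totalError += -np.log(output[i, int(y[i])])
print(totalError / len(y))  #mean cross-entropy

This runs fine when output is an actual ndarray, which is why I suspect the problem is elsewhere.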
EDIT: Here's the function that returns output so you can see where it comes from. y is a vector of length 150 taken directly from a text file; each entry of y is an index, either 1, 2, or 3:
#forward propagation algorithm, takes a matrix "X" of size 150 x 3
def ForProp(self, X):
    #signal vector for the hidden layer
    #tanh activation function
    S1 = X.dot(self.W1) + self.b1
    Z1 = np.tanh(S1)
    #signal vector for the final output layer
    S2 = Z1.dot(self.W2) + self.b2
    #softmax for output layer activation
    expScores = np.exp(S2)
    output = expScores / np.sum(expScores, axis=1, keepdims=True)
    return output, Z1
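For reference, here is a self-contained sketch of the shapes flowing through ForProp; the hidden layer size of 5 is made up, since the post doesn't give the real one:

import numpy as np

N, inputDim, hiddenDim, outDim = 150, 3, 5, 3   #hiddenDim = 5 is an assumption
X = np.random.randn(N, inputDim)
W1, b1 = np.random.randn(inputDim, hiddenDim), np.zeros(hiddenDim)
W2, b2 = np.random.randn(hiddenDim, outDim), np.zeros(outDim)

S1 = X.dot(W1) + b1                              #150 x hiddenDim
Z1 = np.tanh(S1)
S2 = Z1.dot(W2) + b2                             #150 x 3
expScores = np.exp(S2)
output = expScores / np.sum(expScores, axis=1, keepdims=True)
print(output.shape, Z1.shape)                    #(150, 3) (150, 5) -- two values come back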
ANSWER: output isn't actually an Nx3 array like you think it is. How do you call ForProp? It returns a tuple consisting of output and Z1, so you are probably assigning the whole tuple: I'd guess you have something like output = ForProp(...) when it should be output, Z1 = ForProp(...).
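The call site isn't shown in the question, so this is a guess at what it looks like, but the fix would be to unpack the tuple:

#how the call presumably looks now: the assignment grabs the whole tuple
output = self.ForProp(X)
#output is the tuple (output_array, Z1), so output[i, int(y[i])]
#raises a TypeError, because a tuple can't take a 2-D index

#unpack both return values instead:
output, Z1 = self.ForProp(X)
#output is now the 150 x 3 ndarray and the loop in CalculateError works

Alternatively, output = self.ForProp(X)[0] would also work, but unpacking both values is clearer and keeps Z1 around.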