I am trying to create a custom objective function in Keras (tensorflow backend) with an additional parameter whose value would depend on the batch being trained.
For example:
def myLoss(self, stateValues):
    def sparse_loss(y_true, y_pred):
        foo = tf.nn.softmax_cross_entropy_with_logits(labels=y_true, logits=y_pred)
        return tf.reduce_mean(foo * stateValues)
    return sparse_loss
self.model.compile(loss=self.myLoss(stateValues=self.stateValue),
                   optimizer=Adam(lr=self.alpha))
My training loop is as follows:
for batch in batches:
    self.stateValue = computeStateValueVectorForCurrentBatch(batch)
    model.fit(xVals, yVals, batch_size=<num>)
However, stateValues inside the loss function is never updated: the closure captured the value self.stateValue had when model.compile was called, and reassigning self.stateValue afterwards does not affect the already-built loss tensor.
I guess this could be solved by using a placeholder for stateValue, but I am unable to figure out how to do it. Can someone please help?
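For reference, one direction I have considered (not sure it is the right one): instead of a plain Python value, close over a Keras backend variable and overwrite its value with K.set_value before each batch, so the compiled graph sees the new weights without recompiling. A minimal sketch of that idea, assuming tf.keras; the names make_loss, BATCH_SIZE, and the toy data standing in for computeStateValueVectorForCurrentBatch are all illustrative, and the weight vector length must match the batch size:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import backend as K

BATCH_SIZE = 4

# Backend variable holding per-sample weights; its value can be swapped
# out between batches without recompiling the model.
state_values = K.variable(np.ones(BATCH_SIZE, dtype="float32"))

def make_loss(state_var):
    # The closure captures the variable object, not a frozen value,
    # so later K.set_value calls are visible inside the loss.
    def sparse_loss(y_true, y_pred):
        ce = tf.nn.softmax_cross_entropy_with_logits(labels=y_true, logits=y_pred)
        return tf.reduce_mean(ce * state_var)
    return sparse_loss

model = keras.Sequential([keras.layers.Dense(3, input_shape=(5,))])
model.compile(loss=make_loss(state_values),
              optimizer=keras.optimizers.Adam(learning_rate=1e-3))

# Toy stand-in for computeStateValueVectorForCurrentBatch.
rng = np.random.default_rng(0)
batches = [(rng.normal(size=(BATCH_SIZE, 5)).astype("float32"),
            np.eye(3, dtype="float32")[rng.integers(0, 3, BATCH_SIZE)],
            rng.uniform(0.5, 1.5, BATCH_SIZE).astype("float32"))
           for _ in range(3)]

for x, y, sv in batches:
    K.set_value(state_values, sv)  # refresh the weights for this batch
    model.train_on_batch(x, y)
```

Using train_on_batch (or batch_size equal to BATCH_SIZE in fit) keeps the weight vector aligned with the samples in the current batch.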