I have a logistic regression model (lr) implemented in TensorFlow. I use this model to generate predictions:
print s.run(preds, feed_dict = {x:X[:5]})
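For context, the surrounding setup looks roughly like the following (the dimensions, the data array X, the session s, and the initializer call are my assumptions; the construction of lr, x, and preds is shown in the ADDED part below):

import numpy as np
import tensorflow as tf                       # TF 1.x style API assumed

inp_dim, out_dim = 10, 3                      # assumed dimensions
X = np.random.uniform(size = (100, inp_dim))  # assumed input data
# ... lr, x and preds are built here, as shown in the ADDED part below ...
s = tf.Session()
s.run(tf.global_variables_initializer())      # initialize lr.w and lr.b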
After that I try to change the model parameters in the following way:
lr.w = tf.assign(lr.w, np.random.uniform(size=(inp_dim, out_dim)))
lr.b = tf.assign(lr.b, np.random.uniform(size=(out_dim,)))
s.run([lr.w, lr.b])
After that I generate new predictions in the same way:
print s.run(preds, feed_dict = {x:X[:5]})
Surprisingly, I get the same values as before changing the model parameters, so it looks like the change did not take effect.
Does anyone know what I am doing wrong?
ADDED
I probably need to provide more details about my "architecture". This is my implementation of logistic regression:
class logreg:

    def __init__(self, inp_dim, out_dim, r = 1.0):
        # initialize the model parameters with uniform random values in [-r, r]
        w_val = np.random.uniform(-r, r, size = (inp_dim, out_dim))
        b_val = np.random.uniform(-r, r, size = (out_dim,))
        self.w = tf.Variable(w_val, dtype = tf.float64)
        self.b = tf.Variable(b_val, dtype = tf.float64)

    def get_model_graph(self, inp):
        # softmax of the affine transform of the input
        return tf.nn.softmax(tf.matmul(inp, self.w) + self.b)
I use an instance of this class to define the prediction method:
x = tf.placeholder(tf.float64, [None, inp_dim])
preds = lr.get_model_graph(x)
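(Here lr is assumed to be constructed beforehand along these lines:)

lr = logreg(inp_dim, out_dim)  # assumed instantiation; r is left at its default of 1.0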
I try to "redefine" the predict function by changing the values of lr.w and lr.b, and it does not work (as described above).
However, I found out that the new values of the model parameters become visible after I redefine the predict function:
lr.w = tf.assign(lr.w, np.random.uniform(size=(inp_dim, out_dim)))
lr.b = tf.assign(lr.b, np.random.uniform(size=(out_dim,)))
s.run(lr.w)
s.run(lr.b)
preds = lr.get_model_graph(x)
Why is that? Isn't the computational graph for "preds" bound to lr.w and lr.b, so that to change the output of "preds" I just need to change the values of w and b?
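To make the expectation concrete: I would have expected the following variant, which only runs the assign ops and never rebinds lr.w and lr.b on the Python side, to already change the output of "preds" (the names assign_w and assign_b are mine):

assign_w = tf.assign(lr.w, np.random.uniform(size = (inp_dim, out_dim)))
assign_b = tf.assign(lr.b, np.random.uniform(size = (out_dim,)))
s.run([assign_w, assign_b])                   # write the new values into the existing variables
print s.run(preds, feed_dict = {x:X[:5]})     # expectation: output reflects the new w and b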