
I am building a neural network using stacked denoising autoencoders. I train the autoencoder and then I would like to take the weight matrix W and copy/initialize/clone its values into a new variable that is used in the supervised optimization. How can I do that?

.initialized_value() doesn't work for me :/
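To be concrete, this is the kind of pattern I mean (a rough sketch only; the shape and the W_ae / W_new names are placeholders):

import tensorflow as tf

# W_ae stands in for the weight matrix trained by the autoencoder.
W_ae = tf.Variable(tf.zeros([784, 256]), name='W_ae')

# Intended: make W_new start from W_ae's values. initialized_value() only
# reads W_ae at the time the initializers run, not after training.
W_new = tf.Variable(W_ae.initialized_value(), name='W_new')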

  • I must have an error somewhere else, because this still doesn't work :/ Commented Jan 25, 2016 at 0:45

1 Answer


Use var.assign, i.e.:

import tensorflow as tf

vara = tf.Variable(0)
varb = tf.Variable(0)
init_op = tf.initialize_all_variables()
sess = tf.InteractiveSession()
sess.run([init_op])
sess.run([vara.assign_add(1)])    # vara is now 1
print 'variable a', vara.eval()
print 'variable b', varb.eval()
sess.run([varb.assign(vara)])     # copy vara's current value into varb
print 'variable b', varb.eval()

You should see

variable a 1
variable b 0
variable b 1
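Applied to the question's setup, a minimal sketch could look like this (the layer sizes and the W_ae / W_sup names are placeholders, not code from the question):

import tensorflow as tf

n_visible, n_hidden = 784, 256    # placeholder sizes

# Weight matrix that the autoencoder trains.
W_ae = tf.Variable(tf.zeros([n_visible, n_hidden]), name='W_ae')

# Fresh variable used by the supervised network.
W_sup = tf.Variable(tf.zeros([n_visible, n_hidden]), name='W_sup')

copy_op = W_sup.assign(W_ae)

sess = tf.Session()
sess.run(tf.initialize_all_variables())
# ... run the unsupervised training of W_ae here ...
sess.run(copy_op)    # W_sup now holds W_ae's current (trained) values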

5 Comments

Consider that varA is created from a different model (in this case the autoencoder), so: varB = tf.Variable([size]); varB.assign(varA); init = tf.initialize_all_variables(); session = tf.Session(); session.run(init) ... this does nothing. varA is trained as: tf.Variable(tf.zeros([self.n_hidden], dtype=tf.float32))
sess.run([varB.assign(varA)]) will copy the value of varA to varB. Could it be that you have two different versions of "varA"? You need some way to pass the Python variable object from the place where it was created to the place where it is used.
As a sanity check, try doing "print varA.name"; that gives a unique identifier for the node in the graph.
May I contact you with more details, please?
OK, this works. My problem was different. Thank you for the clarification.
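For completeness, a small self-contained sketch of the sanity check mentioned above (the variables here are made-up examples): two separately created variables get distinct graph names even when they are requested with the same name, which is one way the "two different versions of varA" situation shows up.

import tensorflow as tf

v1 = tf.Variable(tf.zeros([3]), name='W')
v2 = tf.Variable(tf.zeros([3]), name='W')    # a second, different node

print v1.name    # 'W:0'
print v2.name    # 'W_1:0' -- a different node despite the same requested name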
