
I'm trying to run my code using TensorFlow.

init = tf.global_variables_initializer()

loaded_graph = tf.Graph()
saver = tf.train.Saver()

with tf.Session(loaded_graph) as sess:
    sess.run(init)
    ...

But I got this error:

  File "C:\Users\K451LN\My Documents\LiClipse Workspace\neuralnet\FFNN.py", line 68, in <module>
with tf.Session(loaded_graph) as sess:
AttributeError: 'Session' object has no attribute '_session'

Is there anything wrong with the tf.Graph()?

Here is my code:

for i in range(num_networks):
    print("Neural network: {0}".format(i))

    X = tf.placeholder(tf.float32)
    Y = tf.placeholder(tf.float32)

    W1 = tf.Variable(tf.random_uniform([n_input, n_hidden], -1.0, 1.0), name='W1')
    W2 = tf.Variable(tf.random_uniform([n_hidden, n_output], -1.0, 1.0), name='W2')

    b1 = tf.Variable(tf.zeros([n_hidden]), name="Bias1")
    b2 = tf.Variable(tf.zeros([n_output]), name="Bias2")

    L2 = tf.sigmoid(tf.matmul(X, W1) + b1)
    hy = tf.sigmoid(tf.matmul(L2, W2) + b2, name="op_to_restore")

    cost = tf.reduce_mean(-Y*tf.log(hy) - (1-Y)*tf.log(1-hy))
    optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)

    init = tf.global_variables_initializer()

    loaded_graph = tf.Graph()
    saver = tf.train.Saver()

    with tf.Session(loaded_graph) as sess:
        sess.run(init)
        ...

I'm adding this tf.Graph() to work around an earlier error: ValueError: At least two variables have the same name: Bias2.

1 Answer

Passing loaded_graph to tf.Session() means the session can only run ops created in that graph. All you do is create an empty graph called loaded_graph without ever adding anything to it, so sess.run(init) fails: the init op does not belong to loaded_graph.

I would guess the original Bias2 error comes from the for loop, which builds a fresh copy of every variable into the same default graph on each iteration. If you remove the for loop and don't create or pass loaded_graph, you won't get either error.

If you wish to keep the for loop, you need to create a new graph on each iteration and build the network inside it:

    g_1 = tf.Graph()
    with g_1.as_default():
        ...

so your code will look like this:

for i in range(num_networks):
    g_1 = tf.Graph()
    with g_1.as_default():
        print("Neural network: {0}".format(i))

        X = tf.placeholder(tf.float32)
        Y = tf.placeholder(tf.float32)

        W1 = tf.Variable(tf.random_uniform([n_input, n_hidden], -1.0, 1.0), name='W1')
        W2 = tf.Variable(tf.random_uniform([n_hidden, n_output], -1.0, 1.0), name='W2')

        b1 = tf.Variable(tf.zeros([n_hidden]), name="Bias1")
        b2 = tf.Variable(tf.zeros([n_output]), name="Bias2")

        L2 = tf.sigmoid(tf.matmul(X, W1) + b1)
        hy = tf.sigmoid(tf.matmul(L2, W2) + b2, name="op_to_restore")

        cost = tf.reduce_mean(-Y*tf.log(hy) - (1-Y)*tf.log(1-hy))
        optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)

        init = tf.global_variables_initializer()
        saver = tf.train.Saver()

        with tf.Session(graph=g_1) as sess:
            sess.run(init)
            ...

2 Comments

FYI, the for loop is there because I wish to create an ensemble NN. I've tried your suggestion with g_1.as_default(), but it doesn't seem to work for me either. Any other suggestions?
Sorry, I forgot: you need to use tf.Session(graph=g_1), since the first argument to tf.Session() is target, not graph. Updated my answer.
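For reference, in TF 1.x the signature is `tf.Session(target='', graph=None, config=None)`, so a graph passed positionally lands in the `target` slot and is rejected; in some TF versions this failed construction later surfaces as the `AttributeError: 'Session' object has no attribute '_session'` from the question. A small sketch (assuming `tensorflow.compat.v1`):

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

g = tf.Graph()

# Positional: the graph is interpreted as the `target` string and rejected.
try:
    tf.Session(g)
    positional_ok = True
except TypeError:
    positional_ok = False

# Keyword: the session is correctly bound to graph `g`.
sess = tf.Session(graph=g)
bound = sess.graph is g
sess.close()

print("positional accepted:", positional_ok, "| keyword bound to g:", bound)
```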
