Normally, I do something like this and everything works fine:
from keras.layers import Input, Dense
from keras.models import Model
import keras.backend as K
import numpy as np
import tensorflow as tf
from sklearn.datasets import make_blobs
X, y = make_blobs(n_samples=500, n_features=50, centers=2)
def make_network1():
    input_layer = Input((50,))
    layer1 = Dense(100, name='network1_dense1')(input_layer)
    output = Dense(50, name='network1_dense2')(layer1)
    model = Model(input_layer, output)
    return model
def make_network2():
    input_layer = Input((50,))
    layer1 = Dense(100, name='network2_dense1')(input_layer)
    output = Dense(1, name='network2_output')(layer1)
    model = Model(input_layer, output)
    return model
network1 = make_network1()
network2 = make_network2()
output = network2(network1.output)
model = Model(network1.input, output)
Now, I want to experiment with the .get_layer method and .output attribute in Keras by replacing the last line of code with:
model = Model(network1.input, network2.get_layer('network2_output').output)
Then it gives me the following error:
Graph disconnected: cannot obtain value for tensor Tensor("input_4:0", shape=(?, 50), dtype=float32) at layer "input_4". The following previous layers were accessed without issue: []
My Question
However, shouldn't output and network2.get_layer('network2_output').output be the same thing? When I print both of them, I get:
Tensor("model_14/network2_output/BiasAdd:0", shape=(?, 1), dtype=float32)
and
Tensor("network2_output_1/BiasAdd:0", shape=(?, 1), dtype=float32)
And network2 has already been connected to the output of network1, so I don't understand why the graph is disconnected. How can I make the code work with .get_layer and the .output attribute?
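For reference, here is a minimal sketch of one workaround I have seen suggested (written against tf.keras, so the exact imports differ from my versions above): instead of asking network2 for a tensor via .get_layer(...).output, which belongs to network2's own graph rooted at network2's Input, re-apply network2's layers to network1's output so every tensor in the stacked model traces back to network1.input.

```python
import numpy as np
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

def make_network1():
    input_layer = Input((50,))
    layer1 = Dense(100, name='network1_dense1')(input_layer)
    output = Dense(50, name='network1_dense2')(layer1)
    return Model(input_layer, output)

def make_network2():
    input_layer = Input((50,))
    layer1 = Dense(100, name='network2_dense1')(input_layer)
    output = Dense(1, name='network2_output')(layer1)
    return Model(input_layer, output)

network1 = make_network1()
network2 = make_network2()

# Re-apply network2's layers (skipping its own InputLayer at index 0)
# to network1's output; the resulting tensors live in the same graph
# as network1.input, so Model() can trace the full path.
x = network1.output
for layer in network2.layers[1:]:
    x = layer(x)

model = Model(network1.input, x)
```

With this, the tensor named network2_output is reachable from network1.input, whereas network2.get_layer('network2_output').output still points at the tensor produced inside network2's standalone graph, which is why Model() reports it as disconnected.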
I am using keras==2.2.4 and tensorflow-gpu==1.5.