I am trying to load a model that I trained myself in the TensorFlow Android demo app. I trained the model in Caffe and then converted it to TensorFlow; I am fairly sure the conversion itself is not the problem, because the converted model works when I test it with classify.py.
I then serialized the model into a .pb file and replaced tensorflow_inception_graph.pb with it (keeping the same name). I can build the app with Bazel, but when I install it on the phone and run it, it crashes instantly. I think the culprit is the following error:
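For reference, these are roughly the steps I follow. The paths are the ones from the TensorFlow repo's Android example; my model's original filename is just an assumption for illustration:

```shell
# Replace the demo's bundled model with my frozen graph,
# keeping the filename the demo expects (paths from the TF Android example).
cp my_model_frozen.pb tensorflow/examples/android/assets/tensorflow_inception_graph.pb

# Build the demo APK from the TensorFlow source tree.
bazel build -c opt //tensorflow/examples/android:tensorflow_demo

# Install it on the connected device.
adb install -r bazel-bin/tensorflow/examples/android/tensorflow_demo.apk
```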
F/native (26026): tensorflow_jni.cc:309 Error during inference: Invalid argument: No OpKernel was registered to support Op 'FIFOQueue' with these attrs
F/native (26026): [[Node: processed_queue = FIFOQueue[capacity=1, component_types=[DT_INT32, DT_FLOAT], container="", shapes=[[], [224,224,3]], shared_name=""]()]]
But I don't really know how to fix it.
One possibly relevant detail: the APK includes the entire protobuf library, not a stripped-down version.
Thanks for the help.