I tried to use TensorFlow Lite, but it has many limitations: it doesn't support the batch normalization operation, and even with simple operations it produced a very strange result on the same data tested with Keras. In other words, with Keras everything works, but with TensorFlow Lite the result is completely wrong. So I need a way to execute the .pb file on Android.
1 Answer
You can use the TensorFlowInferenceInterface to make predictions using a .pb file. First, place the .pb file in your app's assets folder.
- In your build.gradle (Module: app) file, add the following dependency:

implementation 'org.tensorflow:tensorflow-android:1.11.0'

- Then initialize TensorFlowInferenceInterface. If your model file's name is "model.pb", then:

TensorFlowInferenceInterface tensorFlowInferenceInterface = new TensorFlowInferenceInterface(context.getAssets(), "file:///android_asset/model.pb");

- Feed the inputs to the model, where INPUT_NAME is the name of your input layer and 1, 28, 28 are the input dimensions:

tensorFlowInferenceInterface.feed(INPUT_NAME, inputs, 1, 28, 28);

- Run inference, where OUTPUT_NAME is the name of your output layer:

tensorFlowInferenceInterface.run(new String[]{ OUTPUT_NAME });

- Fetch the predictions:

float[] outputs = new float[number_of_classes];
tensorFlowInferenceInterface.fetch(OUTPUT_NAME, outputs);
The outputs array now holds the float values predicted by your model.
Here's the full code:

TensorFlowInferenceInterface tensorFlowInferenceInterface =
        new TensorFlowInferenceInterface(context.getAssets(), "file:///android_asset/model.pb");
tensorFlowInferenceInterface.feed(INPUT_NAME, inputs, 1, 28, 28);
tensorFlowInferenceInterface.run(new String[]{ OUTPUT_NAME });
float[] outputs = new float[number_of_classes];
tensorFlowInferenceInterface.fetch(OUTPUT_NAME, outputs);
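The feed and fetch calls above work with flat float arrays. As a sketch of the surrounding glue (plain Java with no Android or TensorFlow dependencies; the 28×28 shape matches the feed dims above, while the class name and the flatten/argMax helpers are hypothetical names introduced for illustration), here is how you might flatten a 2-D grayscale image into the inputs array and turn the outputs array into a predicted class index:

```java
// Hypothetical helpers around feed()/fetch(); not part of the TensorFlow API.
public class InferenceHelpers {

    // Flatten a [28][28] image (row-major) into the float[] that feed() expects.
    public static float[] flatten(float[][] image) {
        int rows = image.length;
        int cols = image[0].length;
        float[] flat = new float[rows * cols];
        for (int r = 0; r < rows; r++) {
            System.arraycopy(image[r], 0, flat, r * cols, cols);
        }
        return flat;
    }

    // Index of the largest score in outputs = the predicted class.
    public static int argMax(float[] outputs) {
        int best = 0;
        for (int i = 1; i < outputs.length; i++) {
            if (outputs[i] > outputs[best]) {
                best = i;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        float[][] image = new float[28][28];
        image[5][7] = 1.0f;                     // a single lit pixel
        float[] inputs = flatten(image);
        System.out.println(inputs.length);      // 784

        float[] outputs = { 0.1f, 0.7f, 0.2f }; // pretend model scores
        System.out.println(argMax(outputs));    // 1
    }
}
```

The same flat layout is what you would pass as `inputs` to `tensorFlowInferenceInterface.feed(...)`, and `argMax` is one common way to interpret a classifier's `outputs` after `fetch(...)`.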