custom inference part.
TFLite should run inference based on a
.lite model file. In this example, it should accept the model:
combined with this input:
and be able to print:
```
daisy 0.7361 dandelion 0.242222 tulips 0.0185161 roses 0.0031544 sunflowers 8.00981e-06
```
Note: a still picture is used here for simplicity, but it should obviously also work on live video by pointing the camera at a flower...
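For illustration, here is a minimal sketch of how the interpreter's output tensor (a vector of class probabilities) can be turned into the ranked listing shown above. The function name and label ordering are hypothetical, and the probabilities are copied from the sample output; in a real app the vector would come from the TFLite interpreter, not be hard-coded:

```python
import numpy as np

def decode_predictions(probs, labels):
    """Pair each label with its probability, sorted descending,
    mirroring the sample output above."""
    order = np.argsort(probs)[::-1]  # indices from highest to lowest score
    return [(labels[i], float(probs[i])) for i in order]

# Hypothetical label order and scores taken from the sample output.
labels = ["daisy", "dandelion", "tulips", "roses", "sunflowers"]
probs = np.array([0.7361, 0.242222, 0.0185161, 0.0031544, 8.00981e-06])

for name, p in decode_predictions(probs, labels):
    print(name, p)
```

The same ranking step applies regardless of platform; only the interpreter API differs between the Android, iOS, and NativeScript implementations.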
- The goal of this repo is to help with the NativeScript implementation of ML Kit's on-device custom inference, whose docs can be found here.
- The /tf_files directory contains the converted retrained_graph.pb, a retrained version of the classic MobileNet that takes 224px×224px images as input. The retraining aims at recognizing flowers like the one in the same directory: /tf_files/flower_photos/daisy/3475870145_685a19116d.jpg.
- In addition, this repo contains two sample native apps, /android & /iOS, using a converted .lite model; they are nothing more than a combined version of Google's awesome codelabs on TFLite: here for Android and here for iOS.
- This standalone React Native implementation of TFLite for Android & iOS, react-native-tensorflow-lite, seems to be a good resource to check.
- Another React Native implementation, react-native-camera-tflite, with its accompanying Medium blog post, seems to give pretty good results too.
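Since the retrained MobileNet in /tf_files takes 224px×224px images as input, every client (Android, iOS, or NativeScript) has to reshape and scale the camera frame before feeding it to the interpreter. Below is a minimal sketch of that preprocessing step; the helper name is hypothetical, and it assumes the [0, 1] float scaling used by the TensorFlow for Poets retraining codelab (some MobileNet exports expect [-1, 1] instead):

```python
import numpy as np

def preprocess(rgb_image):
    """Convert a 224x224 RGB uint8 array into the float batch the
    retrained MobileNet expects: shape (1, 224, 224, 3), values in [0, 1].
    Assumption: [0, 1] scaling; adjust if your export uses [-1, 1]."""
    assert rgb_image.shape == (224, 224, 3), "resize/crop the frame first"
    x = rgb_image.astype(np.float32) / 255.0  # scale uint8 [0, 255] -> [0, 1]
    return np.expand_dims(x, axis=0)          # add the batch dimension
```

On-device, the equivalent work is done with native bitmap/CVPixelBuffer APIs, but the tensor layout and scaling must match what the model was converted with.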