TFLite in OpenMV
TensorFlow is a free and open-source machine learning library. TensorFlow Lite is a lightweight version of TensorFlow designed for embedded and mobile devices. It uses a custom memory allocator to reduce execution latency and memory load, and it stores models in the FlatBuffers-based file format. TensorFlow Lite takes existing models and converts them into an optimized .tflite file.
The tf module is capable of executing quantized TensorFlow Lite models on the OpenMV Cam. The final .tflite model can be directly loaded and run by your OpenMV Cam. The model and the model's required scratch RAM must fit within the available frame buffer stack RAM on your OpenMV Cam. Alternatively, you can load a model onto the MicroPython heap or into the OpenMV Cam frame buffer; however, this significantly limits the model size on all OpenMV Cams.
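The sketch below shows one way this typically looks in an OpenMV script: a quantized .tflite model is loaded into the frame buffer stack and run against live camera frames. The model filename is an assumption, and the exact tf module API may differ between OpenMV firmware versions, so treat this as illustrative rather than definitive.

```python
# Minimal sketch: load a quantized .tflite model and classify camera frames.
# Assumes a quantized model file named "model.tflite" is present on the SD card
# (hypothetical name); check your firmware's tf module docs for exact signatures.
import sensor
import tf

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)

# load_to_fb=True places the model in the frame buffer stack instead of the
# MicroPython heap, which lets larger models fit.
net = tf.load("model.tflite", load_to_fb=True)

while True:
    img = sensor.snapshot()
    # Each classification result exposes the model's output scores.
    for obj in net.classify(img):
        print("Scores:", obj.output())
```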