
Could this benefit 'TensorFlow Lite for Microcontrollers' models #22

Closed
NicoJuicy opened this issue Mar 22, 2022 · 4 comments
Labels
enhancement New feature or request

Comments


NicoJuicy commented Mar 22, 2022

Models on microcontrollers (e.g. the RPi Pico, which is ARM-based) are very hardware-constrained and could benefit greatly from this. And I know it's possible to convert a TF model to a TF Lite one.

Could 'nebullvm' be applied to the "TF Lite for Microcontrollers" flow to improve inference, or is this a supported use case already?

I don't see TF Lite supported currently.
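For context, the TF-to-TF-Lite conversion mentioned above can be sketched as follows. This is a minimal, hedged example assuming a standard TensorFlow install; the tiny Keras model is a hypothetical stand-in for a real one, and the optimization flag only enables TF Lite's default (size-oriented) optimizations, not the nebullvm-style acceleration this issue asks about.

```python
import tensorflow as tf

# Hypothetical stand-in for a real model: a tiny Keras network.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8,)),
    tf.keras.layers.Dense(4, activation="relu"),
])

# Convert the Keras model to a TF Lite flatbuffer (bytes).
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # default size/latency optimizations
tflite_model = converter.convert()

# The flatbuffer can then be written out for deployment on-device.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

Deploying this on a microcontroller would additionally require the TF Lite Micro runtime and typically full-integer quantization, which is beyond this sketch.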

@NicoJuicy NicoJuicy changed the title Could this benefit TensorFlow Lite for MicroControllers Could this benefit 'TensorFlow Lite for MicroControllers' models Mar 22, 2022
@diegofiori
Collaborator

Hi @NicoJuicy. We are not currently supporting TF Lite, but this can definitely be an interesting feature to include in the future! In the coming days we will draw up a roadmap for the planned releases of nebullvm and we can think about adding support for TFLite as well.

@diegofiori diegofiori added the enhancement New feature or request label Mar 23, 2022
@AvivSham

@morgoth95 thank you for this wonderful repo!
Supporting edge devices and deployment (TF Lite / Core ML) should be given top priority, since speed and reduced computational cost matter most when working with edge devices (as opposed to cloud training, which is important, but less so).

@NicoJuicy
Author

Hi, thanks for the response.

I should mention that I'm working with TensorFlow Lite Micro, which really means very low-powered devices.

But it also seems like the best match for this use case, just a guess.

@Nick-infinity

> Hi @NicoJuicy. We are not currently supporting TF Lite, but this can definitely be an interesting feature to include in the future! In the coming days we will draw up a roadmap for the planned releases of nebullvm and we can think about adding support for TFLite as well.

Hello @morgoth95, I can see a TOT commit for a TFLite backend. Are TFLite models supported now? If yes, can we please reflect the updates in the docs as well?


5 participants