
Tensorflow lite - object detection - ssd-mobilenet-v1 #14731

Closed
mpeniak opened this issue Nov 20, 2017 · 25 comments
Assignees
Labels
comp:lite TF Lite related issues type:feature Feature requests

Comments

@mpeniak commented Nov 20, 2017

Hi guys,

I have trained a custom ssd-mobilenet-v1 (300x300 input) and am currently running it via the TensorFlow Android demo (TensorFlow Mobile). I would love to convert this model to the Lite format, possibly quantize it, and run it via TensorFlow Lite to see how much the performance improves. Currently the inference takes around 400-500ms on a Google Pixel (version 1).

Could you please let me know the best way to deploy my custom model for object detection?

Thank you very much in advance!

Martin Peniak
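
For later readers, a minimal sketch of the conversion path that eventually worked for SSD models, assuming a frozen graph exported with the TF 1.x Object Detection API's export_tflite_ssd_graph.py script (the tensor names and the 300x300 input shape below come from that exporter; treat them as assumptions if your export differs):

```python
# Sketch: convert an exported SSD-MobileNet frozen graph to TFLite (TF 1.x).
# Assumes the graph was produced by the Object Detection API's
# export_tflite_ssd_graph.py, which appends the custom
# TFLite_Detection_PostProcess op.
import tensorflow as tf  # TF 1.x

converter = tf.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file="tflite_graph.pb",  # path from the exporter (assumed)
    input_arrays=["normalized_input_image_tensor"],
    output_arrays=[
        "TFLite_Detection_PostProcess",    # boxes
        "TFLite_Detection_PostProcess:1",  # class indices
        "TFLite_Detection_PostProcess:2",  # scores
        "TFLite_Detection_PostProcess:3",  # number of detections
    ],
    input_shapes={"normalized_input_image_tensor": [1, 300, 300, 3]},
)
converter.allow_custom_ops = True  # the post-processing op is a custom op

with open("detect.tflite", "wb") as f:
    f.write(converter.convert())
```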

@miaout17 miaout17 added the comp:lite TF Lite related issues label Nov 20, 2017
@andrewharp andrewharp self-assigned this Nov 20, 2017
@gargn commented Nov 28, 2017

Answered here: #14761.

@gargn gargn closed this as completed Nov 28, 2017
@andrewharp andrewharp reopened this Nov 28, 2017
@andrewharp (Contributor)

I'm currently looking into ssd-mobilenet support; I'll leave this open to track it.

@mpeniak (Author) commented Nov 29, 2017

That's awesome, thanks!

@aselle aselle added type:feature Feature requests stat:awaiting tensorflower Status - Awaiting response from tensorflower labels Nov 29, 2017
@h8907283 commented Dec 1, 2017

@andrewharp I'm more than happy to follow your lead and spend time contributing to the ssd-mobilenet support. Or, if you've got this covered and just need someone to help with testing, I can do that too. BTW, is it going to be available soon? Do you have an ETA? Thanks so very much!

@tensorflowbutler (Member)

It has been 14 days with no activity and this issue has an assignee. Please update the label and/or status accordingly.

@mpeniak (Author) commented Dec 21, 2017

So, is the ssd-mobilenet support there already, or not yet?

@tensorflowbutler (Member)

Nagging Assignee: It has been 14 days with no activity and this issue has an assignee. Please update the label and/or status accordingly.

@mpeniak (Author) commented Jan 23, 2018

Any update on this?

@tensorflowbutler (Member)

Nagging Assignee: It has been 14 days with no activity and this issue has an assignee. Please update the label and/or status accordingly.

@h8907283

Hi, it would be very helpful if we were given an update on this work. Thanks so much!

@domidataguy

@yucheeling

@andrewharp (Contributor)

This is functional internally now; should have something out in the next week or two.

@h8907283

This is awesome! Again, thanks so much!

Oh, one more thing: will this be available in both float and 8-bit?
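
Both paths turned out to be workable: float conversion as sketched above, and 8-bit weights via post-training quantization once later TF 1.x releases added it. A minimal sketch, assuming the converter object from the conversion sketch earlier:

```python
# Sketch: post-training weight quantization (TF 1.14+), assuming the
# `converter` from the earlier conversion sketch.
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # quantize weights to 8-bit

with open("detect_quant.tflite", "wb") as f:
    f.write(converter.convert())
```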

@tensorflowbutler tensorflowbutler removed the stat:awaiting tensorflower Status - Awaiting response from tensorflower label Mar 10, 2018
@madhavajay

This is awesome.
Just a side thought: is there any interest in providing TF Lite for other platforms, like a Python client for server-side inference?

It seems to me that the performance gains and memory footprint of TFLite would be welcome in many web-based inference scenarios. Not to mention, dare I say, a JavaScript WebGL-based implementation down the track (I bet my left arm you guys already have this in the works).

@madhavajay

@mpeniak Also, do you mind if I ask how many images per class you found gave good enough results for transfer learning to new categories?

@offbye commented Mar 20, 2018

@andrewharp Is ssd-mobilenet working on TF Lite now? How can I use it?

@pathwayai

@andrewharp - I'm also wondering if you have any update on when it will be released. Any information would be greatly appreciated.

@madhavajay

@andrewharp Anything we can do to help? Buy you a coffee or some pizza? 😁 Sorry, I'm sure extra @ messages aren't helping. 😬

@pathwayai

@andrewharp any update on this?

@andrewharp (Contributor)

I've got a commit porting the entire TensorFlow Android demo currently under internal review, including SSD object detection, so with any luck it should be out in the next few days!

@grewe commented Mar 29, 2018

Andrew,

Do you have the code deployed for TensorFlow Lite? I am new to this framework and want to figure out how to do object detection (both identification and bounding-box info), not just classification probabilities for the entire image. Can you advise? I am trying to understand how this changes for TensorFlow Lite, as the demo shows the TensorFlow Lite class (org.tensorflow.lite.Interpreter) being used for classification (but not localization).

@andrewharp (Contributor)

SSD object detection in TF Lite is live now! See this comment for details.
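
On the bounding-box question above: once converted, the model exposes four output tensors (boxes, class indices, scores, detection count), so localization comes out of the same Interpreter call as classification. A minimal Python sketch, assuming the detect.tflite and tensor layout from the conversion sketch earlier; the Java org.tensorflow.lite.Interpreter on Android follows the same pattern:

```python
# Sketch: run SSD detection with the TFLite Python interpreter and read the
# four post-processing outputs. Assumes the float detect.tflite from the
# conversion sketch above.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="detect.tflite")
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
outs = interpreter.get_output_details()

# Dummy 300x300 RGB frame, normalized to [-1, 1] as the exporter expects.
frame = np.random.rand(1, 300, 300, 3).astype(np.float32) * 2.0 - 1.0
interpreter.set_tensor(inp["index"], frame)
interpreter.invoke()

boxes = interpreter.get_tensor(outs[0]["index"])    # [1, N, 4]: ymin, xmin, ymax, xmax (normalized)
classes = interpreter.get_tensor(outs[1]["index"])  # [1, N]: class indices
scores = interpreter.get_tensor(outs[2]["index"])   # [1, N]: confidences
count = int(interpreter.get_tensor(outs[3]["index"])[0])

for i in range(count):
    if scores[0, i] > 0.5:
        print(int(classes[0, i]), scores[0, i], boxes[0, i])
```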

@cjr0106 commented Sep 12, 2018

How do I define my own model for object detection, like mobilenet_ssd v2?

@rash1994 commented Sep 3, 2019

Hi @mpeniak,
I trained a MobileNet-SSD-v2 Lite model; the size of the tflite model is 3 MB, which is smaller than the one used for COCO SSD. The only remaining problem is the inference time, which is 500 ms for my custom model. Could you please help me with this?

Thank you
Rashmi Sharma
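
On the 500 ms figure: it helps to time invoke() alone first, to separate model latency from pre/post-processing and camera overhead. A rough sketch, assuming the `interpreter` (with its input already set) from the inference sketch earlier:

```python
# Sketch: rough latency check, assuming `interpreter` from the earlier
# inference sketch.
import time

for _ in range(5):  # warm-up runs so one-time allocation cost isn't counted
    interpreter.invoke()

runs = 50
start = time.time()
for _ in range(runs):
    interpreter.invoke()
print("avg invoke: %.1f ms" % ((time.time() - start) / runs * 1000.0))
```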
