TensorFlow Lite - object detection - ssd-mobilenet-v1 #14731
Comments
Answered here: #14761. |
I'm currently looking into ssd-mobilenet support, will leave this open to track. |
That's awesome, thanks! |
@andrewharp I'm more than happy to follow your lead and spend time contributing to the ssd-mobilenet support. Or if you've got this covered and just need someone to help with testing, I can do that too. BTW, is it going to be available soon? Do you have an ETA? Thanks so very much! |
It has been 14 days with no activity and this issue has an assignee. Please update the label and/or status accordingly. |
Yes, is the ssd-mobilenet support there already or not yet? |
Any update on this? |
Hi, it would be very helpful if you could give us an update on this work. Thanks so much! |
@yucheeling |
This is functional internally now; should have something out in the next week or two. |
This is awesome! Again, thanks so much! Oh! one more thing, will this be in float and 8-bit? |
This is awesome. It seems to me that the performance gains and memory footprint of TFLite would be welcome in many web-based inference scenarios. Not to mention, dare I say, a JavaScript/WebGL-based implementation down the track (I bet my left arm you guys already have this in the works). |
@mpeniak Also do you mind if I ask how many images per class did you find gave you good enough results for transfer learning to new categories? |
@andrewharp Is ssd-mobilenet working on TF Lite now? How can I use it? |
@andrewharp - I'm also wondering if you have any update on when and where it is being released? Any information would be greatly appreciated. |
@andrewharp Anything we can do to help? Buy you a coffee or some pizza? 😁 Sorry, I'm sure extra @ messages aren't helping. 😬 |
@andrewharp any update on this? |
I've got a commit porting the entire TensorFlow Android demo, including SSD object detection, currently under internal review, so with any luck it should be out in the next few days! |
Andrew, do you have the code deployed for TensorFlow Lite? I am new to this framework and want to figure out how to do object detection (both identification and bounding-box info), not just identification probabilities for the entire image. Can you advise? I am trying to understand how this changes for TensorFlow Lite, since the demo shows the TensorFlow Lite class (org.tensorflow.lite.Interpreter) being used for classification (but not localization). |
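For anyone with the same question: the difference from classification is mostly in how the outputs are read. A detection model produces box coordinates, class ids, scores, and a detection count rather than a single probability vector. Below is a minimal sketch in Python (not the Android Java API) of filtering those four arrays; the shapes follow the common TF Lite detection output convention, and the helper name `parse_detections` is a hypothetical example, not an API from this issue:

```python
import numpy as np

# Assumed post-processed detection outputs (common TF Lite convention):
#   boxes:  [1, N, 4] as (ymin, xmin, ymax, xmax), normalized to [0, 1]
#   classes: [1, N] class ids, scores: [1, N], num_detections: [1]
def parse_detections(boxes, classes, scores, num_detections, threshold=0.5):
    results = []
    n = int(num_detections[0])
    for i in range(n):
        # Keep only detections above the confidence threshold.
        if scores[0][i] >= threshold:
            results.append({
                "box": boxes[0][i].tolist(),
                "class_id": int(classes[0][i]),
                "score": float(scores[0][i]),
            })
    return results

# Toy data standing in for interpreter output tensors.
boxes = np.array([[[0.1, 0.1, 0.5, 0.5], [0.2, 0.2, 0.9, 0.9]]])
classes = np.array([[1.0, 2.0]])
scores = np.array([[0.9, 0.3]])
num = np.array([2.0])
print(parse_detections(boxes, classes, scores, num))
```

The same loop translates directly to Java over the output arrays filled by `Interpreter.runForMultipleInputsOutputs`.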
SSD object detection in TF Lite is live now! See this comment for details. |
How do I use my own model for object detection, e.g. mobilenet_ssd v2? |
Hi @mpeniak, thank you |
Hi guys,
I have trained a custom ssd-mobilenet-v1 (300x300 input) and am currently running it via the TensorFlow Android demo (TensorFlow Mobile). I would love to convert this model to the Lite format, possibly quantize it, and run it via TensorFlow Lite to see how much the performance has improved. Currently the inference takes around 400-500ms on a Google Pixel (version 1).
Could you please let me know what's the best way to deploy my custom model for object detection?
Thank you very much in advance!
Martin Peniak
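One possible conversion path for a custom SSD-MobileNet like the one described above is the TensorFlow Object Detection API's TF Lite export script followed by the converter CLI. This is a hedged sketch, not an official recipe: the flag names are from the TF 1.x tooling and may differ in your version, and the config filename, checkpoint number placeholder, and output paths are illustrative assumptions:

```shell
# Export a TF Lite-compatible frozen graph from a trained SSD checkpoint
# (script from the tensorflow/models object_detection directory; paths
# and the ckpt-XXXX placeholder are illustrative).
python object_detection/export_tflite_ssd_graph.py \
  --pipeline_config_path=ssd_mobilenet_v1.config \
  --trained_checkpoint_prefix=model.ckpt-XXXX \
  --output_directory=tflite_export \
  --add_postprocessing_op=true

# Convert the exported graph to a .tflite flatbuffer. This produces a
# float model; quantization requires additional flags and stats.
tflite_convert \
  --graph_def_file=tflite_export/tflite_graph.pb \
  --output_file=detect.tflite \
  --input_arrays=normalized_input_image_tensor \
  --input_shapes=1,300,300,3 \
  --output_arrays='TFLite_Detection_PostProcess','TFLite_Detection_PostProcess:1','TFLite_Detection_PostProcess:2','TFLite_Detection_PostProcess:3' \
  --allow_custom_ops
```

`--allow_custom_ops` is needed because the detection post-processing op is a custom TF Lite op rather than a builtin.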