Errors loading inception v3 in iOS example #3480
Comments
Hi Piso. I also have been trying to get the iOS sample to work with Inception v3, but I'm getting the same prediction no matter what image is captured by the camera. So if you manage to get it running, could you please help me with my issue #3446? Thanks
Here is the full code you need to use
@Piso try running this from the root of your tensorflow repo: …
Then change the name of the graph being loaded in … to …
@Tugees @shrutisharmavsco and @Piso
Thanks,
Thanks @shrutisharmavsco - I used the same command you posted (actually I tried both …). So @shrutisharmavsco, you can run the stripped V3 model on an actual iOS device with recognition results generated? Anything you did differently from what I did above? Thanks!
@jeffxtang that all sounds like the right steps to me. I don't think I did anything differently. I was running it on an iPhone 5C. I do get memory warnings generated, but the app does not crash - it still generates the tags. I haven't quantized the model yet, though.
Thanks @shrutisharmavsco for the info. When you have the chance, can you try quantizing the model and see if the memory warnings disappear running on your iPhone 5C, or if you'll hit a problem similar to #3619?
@jeffxtang running the quantized model made the memory warnings disappear on the iPhone 5C.
@shrutisharmavsco cool. I'm able to run the quantized model on my iPhone 6 too, without memory warnings or other problems.
@jeffxtang are you noticing a speed decrease running the quantized model vs the non-quantized one?
No @shrutisharmavsco, the performance seems the same to me. You can check out my recently released iOS app, which uses a quantized model, for comparison: https://itunes.apple.com/us/app/dog-breeds-recognition-powered/id1150923794?mt=8
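For context on why quantization shrinks memory so much: eight-bit quantization stores one byte per weight instead of four, plus a small amount of range metadata, by mapping each float32 weight linearly onto 0..255. A minimal sketch of that mapping, as a simplification of the idea behind TensorFlow's eightbit quantization mode (the function names here are illustrative, not TensorFlow APIs):

```python
def quantize_eightbit(weights):
    """Map a list of float weights onto 0..255 plus (min, max) range metadata.

    Stores one byte per weight instead of four bytes, which is where the
    roughly 4x size reduction comes from.
    """
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / 255.0 if hi > lo else 1.0
    quantized = [round((w - lo) / scale) for w in weights]
    return quantized, lo, hi

def dequantize_eightbit(quantized, lo, hi):
    """Recover approximate float weights from the quantized bytes."""
    scale = (hi - lo) / 255.0 if hi > lo else 1.0
    return [q * scale + lo for q in quantized]
```

The round trip loses at most about half a quantization step per weight, which is why inference accuracy is usually barely affected while the model file shrinks dramatically.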
For those of you who may be interested, I just posted a blog documenting the whole process of using a retrained Inception v3 model for my app above at http://jeffxtang.github.io
I've got a tutorial explaining how to get this running up at https://petewarden.com/2016/09/27/tensorflow-for-mobile-poets/ now, so I'm going to close this bug. Please open new bugs if there are issues with the process. Thanks @jeffxtang for your work on this too!
Great tutorial @petewarden! I noticed you use …
Yes, those are the recommended approaches. I'm working on a general description of the process, I'll email you a draft. |
Environment info
Operating System: Mac OS X / iOS
If installed from source, provide:
git rev-parse HEAD: fc91629
bazel version: Build label: 0.3.0-homebrew

Steps to reproduce
Using a retrained Inception v3 graph in the camera iOS project fails with:

Running model failed: Invalid argument: Session was not created with a graph before Run()!
Running model failed: Invalid argument: No OpKernel was registered to support Op 'DecodeJpeg' with these attrs [[Node: DecodeJpeg = DecodeJpeg[acceptable_fraction=1, channels=3, fancy_upscaling=true, ratio=1, try_recover_truncated=false](DecodeJpeg/contents)]]
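The DecodeJpeg error occurs because the mobile build omits that op's kernel; the strip_unused tool works around it by keeping only the nodes reachable backwards from the requested outputs and turning the named inputs (here Mul) into placeholders, so the JPEG-decoding chain is dropped. A toy sketch of that reachability pruning, using a plain dict as a stand-in graph (the function, dict layout, and node names below are illustrative, not TensorFlow's actual API):

```python
def strip_unused(graph, input_names, output_names):
    """Keep only nodes reachable from output_names, stopping at input_names.

    `graph` maps node name -> list of input node names. Nodes listed in
    `input_names` are kept but their own inputs are cut, mimicking how
    strip_unused replaces them with placeholders.
    """
    kept = {}
    stack = list(output_names)
    while stack:
        name = stack.pop()
        if name in kept:
            continue
        if name in input_names:
            kept[name] = []          # becomes a placeholder
            continue
        inputs = graph[name]
        kept[name] = inputs
        stack.extend(inputs)
    return kept

# A miniature retrained-Inception-like graph: the JPEG decoding chain
# feeds Mul, which feeds the network body up to final_result.
toy = {
    "DecodeJpeg/contents": [],
    "DecodeJpeg": ["DecodeJpeg/contents"],
    "Cast": ["DecodeJpeg"],
    "Mul": ["Cast"],
    "conv": ["Mul"],
    "final_result": ["conv"],
}
stripped = strip_unused(toy, {"Mul"}, ["final_result"])
```

After pruning, the DecodeJpeg nodes are gone and Mul has no inputs, which is why the stripped graph loads on a mobile build that lacks the DecodeJpeg kernel.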
What have you tried?
bazel build tensorflow/python/tools:strip_unused && \
bazel-bin/tensorflow/python/tools/strip_unused \
  --input_graph=your_retrained_graph.pb \
  --output_graph=stripped_graph.pb \
  --input_node_names=Mul \
  --output_node_names=final_result \
  --input_binary=true
However, I receive the following error:
File "/tensorflow/bazel-bin/tensorflow/python/tools/strip_unused.runfiles/org_tensorflow/tensorflow/python/framework/graph_util.py", line 156, in extract_sub_graph
    assert d in name_to_node_map, "%s is not in graph" % d
AssertionError: final_result is not in graph
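This assertion fires when a requested output node name simply does not exist in the graph being stripped, for example when the .pb passed in is the stock pretrained Inception v3 graph (whose output node is named softmax) rather than a graph produced by retrain.py (which names its output final_result). A minimal mimic of the failing check, with hypothetical node-name sets for illustration (this is not TensorFlow code, just a reproduction of the logic in graph_util.extract_sub_graph):

```python
def check_outputs_exist(node_names, requested_outputs):
    """Reproduce the check from graph_util.extract_sub_graph:
    every requested output must name an existing node in the graph."""
    for d in requested_outputs:
        assert d in node_names, "%s is not in graph" % d

# Hypothetical node-name sets for the two graphs someone might pass in:
stock_graph_nodes = {"DecodeJpeg", "Mul", "softmax"}       # pretrained graph
retrained_graph_nodes = {"DecodeJpeg", "Mul", "final_result"}  # retrain.py output
```

Running check_outputs_exist(stock_graph_nodes, ["final_result"]) raises exactly the "final_result is not in graph" error above, so the fix is to point --input_graph at the graph retrain.py actually wrote, or to set --output_node_names to the name that graph really uses.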