Not able to decode #4
Comments
Sorry, my own model doesn't open at all, so could you tell me how you got as far as this error? On Android my model couldn't be read; it says there is no inference graph at this path, or that it couldn't be read. Also, I can't find the WORKSPACE lines with the SDK and NDK paths needed for the bazel build.
If I have understood you correctly, you trained your model, ran export_lstm_pb.py to generate the .pb file, and are stuck after that?
Yes. I also tried training WaveNet and used export_wave_pb.py, with the same error: the model couldn't be read. I never get as far as the MFCC step.
OK, did you run the bazel commands mentioned in the tutorial? After I ran them, I got a .so and a .jar file, which I pasted into the libs folder, and then ran the code.
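For reference, the build step described above looked roughly like the following in TensorFlow 1.x checkouts. The target names and flags are assumptions based on the contrib-era Android tutorial, not taken from this thread, so verify them against your own checkout:

```shell
# Sketch of the contrib-era build commands (target names and flags are
# assumptions; check them against your TensorFlow checkout).
bazel build -c opt //tensorflow/contrib/android:libtensorflow_inference.so \
    --crosstool_top=//external:android/crosstool \
    --host_crosstool_top=@bazel_tools//tools/cpp:toolchain \
    --cpu=armeabi-v7a

bazel build //tensorflow/contrib/android:android_tensorflow_inference_java

# Expected outputs to copy into the app's libs folder:
#   bazel-bin/tensorflow/contrib/android/libtensorflow_inference.so
#   bazel-bin/tensorflow/contrib/android/libandroid_tensorflow_inference_java.jar
```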
I had a problem with bazel: when I tried to build TensorFlow I couldn't find the NDK and SDK lines in the WORKSPACE file, and my bazel version was 19, not 0.5.4. Which versions of TensorFlow and bazel did you use, and what did you put in the WORKSPACE?
My bazel version was also 19. I'm not sure yet what exactly your error is; maybe you can share a screenshot of it?
One more question while I get a screenshot of the error: which NDK version did you use, where did you get TensorFlow, and finally, how do you add the ops in the WORKSPACE?
I guess it's version 11.
Does anyone have a solution to my problem?
How were you able to edit the WORKSPACE file? I didn't find any NDK or SDK lines to uncomment. |
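If the commented-out lines are missing, they can be added by hand. Below is a minimal sketch of the WORKSPACE stanzas, assuming the standard `android_sdk_repository` and `android_ndk_repository` rules; every path, API level, and build-tools version here is a placeholder to replace with your own values:

```python
# Hypothetical WORKSPACE additions; paths and versions are placeholders,
# not values from this thread.
android_sdk_repository(
    name = "androidsdk",
    api_level = 23,
    build_tools_version = "26.0.1",
    path = "/path/to/Android/Sdk",
)

android_ndk_repository(
    name = "androidndk",
    api_level = 21,
    path = "/path/to/android-ndk",
)
```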
Hi,
I am trying to fit my own model in this app. When I speak into the app, it extracts & prints the MFCC features, but crashes afterwards giving the following error:
04-03 12:56:16.754 24654-24815/org.tensorflow.demo E/TensorFlowInferenceInterface: Failed to run TensorFlow inference with inputs:[SeqLen], outputs:[SparseToDense]
04-03 12:56:16.755 24654-24815/org.tensorflow.demo E/AndroidRuntime: FATAL EXCEPTION: Thread-7556
Process: org.tensorflow.demo, PID: 24654
java.lang.IllegalArgumentException: Expects arg[0] to be int32 but float is provided
at org.tensorflow.Session.run(Native Method)
at org.tensorflow.Session.access$100(Session.java:48)
at org.tensorflow.Session$Runner.runHelper(Session.java:314)
at org.tensorflow.Session$Runner.run(Session.java:264)
at org.tensorflow.contrib.android.TensorFlowInferenceInterface.run(TensorFlowInferenceInterface.java:228)
at org.tensorflow.contrib.android.TensorFlowInferenceInterface.run(TensorFlowInferenceInterface.java:197)
at org.tensorflow.contrib.android.TensorFlowInferenceInterface.run(TensorFlowInferenceInterface.java:187)
at org.tensorflow.demo.SpeechActivity.recognize(SpeechActivity.java:229)
at org.tensorflow.demo.SpeechActivity.access$100(SpeechActivity.java:48)
at org.tensorflow.demo.SpeechActivity$3.run(SpeechActivity.java:193)
at java.lang.Thread.run(Thread.java:818)
Can someone please help me fix this?
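The exception is a dtype mismatch: the graph's SeqLen placeholder expects int32, but the app feeds it a float tensor. In the contrib Android API, `TensorFlowInferenceInterface.feed()` derives the tensor dtype from the Java array type of the overload it is called with, so building the SeqLen feed from an `int[]` instead of a `float[]` should satisfy the placeholder. The self-contained sketch below uses a hypothetical class that mimics only this overload-selection behaviour (it is not the real TensorFlow API) to illustrate the mechanism:

```java
public class FeedDtypeDemo {
    // Mimics how TensorFlowInferenceInterface.feed() chooses a tensor
    // dtype: the overload matching the Java array type decides it.
    static String feed(String inputName, float[] values) {
        return inputName + ":float";   // float[] -> a float tensor
    }

    static String feed(String inputName, int[] values) {
        return inputName + ":int32";   // int[] -> an int32 tensor
    }

    public static void main(String[] args) {
        // Feeding SeqLen from a float[] yields a float tensor, which is
        // what triggers "Expects arg[0] to be int32 but float is provided".
        System.out.println(feed("SeqLen", new float[] { 73f }));

        // Feeding an int[] yields int32 and matches the placeholder.
        System.out.println(feed("SeqLen", new int[] { 73 }));
    }
}
```

In SpeechActivity.recognize(), the corresponding change would be to build the SeqLen feed from an int array before calling `inferenceInterface.run()`.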