Run times far from expected #41

Open · aquatom opened this issue Feb 14, 2019 · 1 comment
aquatom commented Feb 14, 2019

Running the notebook with Python 3 on a TX2 with JetPack 3.3. I followed the instructions, and I am measuring the inference time as follows:

```python
from time import time

start = time()
output = tf_sess.run(tf_output, feed_dict={
    tf_input: image[None, ...]  # add a batch dimension
})
end = time()
print("Inference time: {}s".format(end - start))
scores = output[0]
```

Using the same examples as the notebook (inception_v1 etc.), I got an inference time of 0.8 seconds, pretty far from the 7 ms described.

I also ran:

```sh
sudo nvpmodel -m 0
sudo ~/jetson_clocks.sh
```

mosheliv commented May 8, 2019

Don't know if this is still relevant, but the first inference is painfully slow (warmup). Usually you discard the first inference and average the next ten.
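A minimal sketch of that timing pattern, reusing the `tf_sess`, `tf_output`, `tf_input`, and `image` variables from the snippet above:

```python
from time import time

# Warm-up run: the first sess.run() pays one-time costs
# (graph optimization, CUDA kernel loading), so its latency
# is not representative of steady-state inference.
tf_sess.run(tf_output, feed_dict={tf_input: image[None, ...]})

# Time the next ten inferences and report the average.
n_runs = 10
start = time()
for _ in range(n_runs):
    output = tf_sess.run(tf_output, feed_dict={tf_input: image[None, ...]})
end = time()
print("Average inference time: {:.1f} ms".format((end - start) / n_runs * 1000))
```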
