Running the notebook with Python 3 on a TX2 with JetPack 3.3. I followed the instructions and I am measuring the inference time as follows:

from time import time
start = time()
output = tf_sess.run(tf_output, feed_dict={
    tf_input: image[None, ...]
})
end = time()
print("Inference time: {}s".format(end - start))
scores = output[0]

Using the same examples as the notebook (inception_v1 etc.), I get an inference time of 0.8 seconds, far from the 7 ms described.

I also ran:

sudo nvpmodel -m 0
sudo ~/jetson_clocks.sh
Don't know if this is still relevant, but the first inference is painfully slow (warm-up). Usually you discard the first inference and average the next ten.
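A timing pattern like the one described above can be sketched as follows. This is a minimal sketch, not code from the notebook: the `benchmark` helper and `run_inference` callable are hypothetical names, and on the TX2 the discarded first run also absorbs one-time costs such as TensorRT engine building and CUDA initialization.

```python
from time import perf_counter  # monotonic, higher resolution than time()

def benchmark(run_inference, warmup=1, iterations=10):
    """Average the latency of an inference callable, discarding warm-up runs.

    run_inference: zero-argument callable that performs one inference.
    warmup: initial runs to discard (they include one-time setup cost).
    iterations: number of timed runs to average.
    """
    for _ in range(warmup):
        run_inference()  # discarded: pays graph/engine build and init cost
    times = []
    for _ in range(iterations):
        start = perf_counter()
        run_inference()
        times.append(perf_counter() - start)
    return sum(times) / len(times)

# In the notebook's setting, the callable could wrap the session run, e.g.:
# avg = benchmark(lambda: tf_sess.run(tf_output,
#                                     feed_dict={tf_input: image[None, ...]}))
# print("Average inference time: {:.1f} ms".format(avg * 1e3))
```

With this, the reported number reflects steady-state latency rather than the one-off warm-up cost that dominates a single timed call.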