Inference Time is too high #19
Hi, I have directly taken your code, modified only the dataset paths, and then ran evaluate.py to measure the inference time. I am getting an inference time of 4 s (as opposed to ~0.04 s) on a Tesla X GPU. Could you please point out what could be causing this?
Thanks,
Sudhir

Comments
Hey @Sudhir11292rt, can you tell me how you ran the evaluation code? You can refer to issue #4 first. Perhaps you calculated the time including model initialization?
Thanks for the response @hellochick. I have looked at #4, but I am still getting 'average inference time: 2.80006708418'. In evaluate.py I have changed …
Thanks,
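For reference, here is a minimal, self-contained sketch of the kind of per-run timing being discussed, assuming the TensorFlow 1.x style this repo uses; the tiny matmul graph is a hypothetical stand-in for the ICNet forward pass, not the repo's actual code:

```python
import time

import numpy as np
import tensorflow as tf

# Hypothetical stand-in graph for the network output op in evaluate.py.
x = tf.placeholder(tf.float32, shape=[1024, 1024])
y = tf.matmul(x, x)

inp = np.random.rand(1024, 1024).astype(np.float32)

with tf.Session() as sess:
    # Time a single forward pass; note that this first run also pays
    # one-time setup costs (see the warm-up discussion below).
    start = time.time()
    sess.run(y, feed_dict={x: inp})
    print('inference time: {:.4f}s'.format(time.time() - start))
```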
@Sudhir11292rt, can you try …
@hellochick Here is the output for the command 'python inference.py --model train --img-path ~/aachen_000087_000019_leftImg8bit.png'. I am not sure if I am making a mistake somewhere.
@Sudhir11292rt, I have no idea about this situation. Did any warnings occur when you ran the code, such as OutOfMemory?
Yeah, it seems like it's a warm-up issue: ~2 s on the first inference, ~0.04 s on the second and subsequent ones (1080 Ti).
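To make the warm-up effect concrete, here is a sketch of excluding warm-up iterations when averaging inference time, again assuming TensorFlow 1.x; the graph and iteration counts are hypothetical stand-ins. The first sess.run pays one-time costs (graph optimization, cuDNN autotuning, GPU memory allocation), so only steady-state runs should be averaged:

```python
import time

import numpy as np
import tensorflow as tf

# Hypothetical stand-in graph for the ICNet forward pass.
x = tf.placeholder(tf.float32, shape=[1024, 1024])
y = tf.matmul(x, x)
inp = np.random.rand(1024, 1024).astype(np.float32)

WARMUP, RUNS = 5, 50  # assumed counts; any small warm-up works

with tf.Session() as sess:
    # Discard warm-up runs: they include graph optimization, cuDNN
    # autotuning, and memory-allocation costs that dominate the first call.
    for _ in range(WARMUP):
        sess.run(y, feed_dict={x: inp})

    # Time only the steady-state runs.
    start = time.time()
    for _ in range(RUNS):
        sess.run(y, feed_dict={x: inp})
    print('average inference time: {:.4f}s'.format((time.time() - start) / RUNS))
```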