
Inference Time is too high #19

Closed
skrya opened this issue Dec 21, 2017 · 6 comments

Comments

@skrya

skrya commented Dec 21, 2017

Hi, I took your code directly, only modified the dataset paths, and then ran evaluate.py to measure the inference time. I am getting an inference time of ~4 s (as opposed to ~0.04 s) on a Tesla X GPU. Could you point out what might be causing this?

Thanks,
Sudhir

@hellochick
Owner

Hey @Sudhir11292rt, can you tell me how you ran the evaluation code? You can refer to issue #4 first. Are you including model initialization in the time you measure?

@skrya
Author

skrya commented Dec 23, 2017

Thanks for the response @hellochick. I have looked at #4, but I am still getting 'average inference time: 2.80006708418'. In evaluate.py I changed:

1. DATA_DIRECTORY and DATA_LIST_PATH to point to my local cityscapes dataset and cityscapes_val_lists.txt.
2. I downloaded your provided 'icnet_cityscapes_train_30k.npy' pretrained weights and changed model_train30k accordingly.

Then I ran python evaluate.py --model train --measure-time.

Thanks,
Sudhir

@hellochick
Owner

@Sudhir11292rt, can you try inference.py and time each run?

import time

for i in range(5):
    start = time.time()
    preds = sess.run(pred)  # sess and pred come from inference.py
    print(time.time() - start)

@skrya
Author

skrya commented Dec 23, 2017

@hellochick Here is the output for command 'python inference.py --model train --img-path ~/aachen_000087_000019_leftImg8bit.png'
3.84351301193
2.9684278965
3.43640303612
3.59244203568
3.2074649334

I am not sure if I am making a mistake somewhere?

@hellochick
Owner

@Sudhir11292rt, I have no idea about this situation. Did any warnings occur when you ran the code, such as OutOfMemory, etc.?

@seovchinnikov

seovchinnikov commented May 23, 2018

Yeah, it seems like it's a warm-up issue: ~2 s on the 1st inference, 0.04 s on the second one and onwards (1080 Ti).
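For anyone measuring this themselves: the first run pays one-time costs (graph setup, kernel loading), so discard it and average the rest. A minimal sketch of that benchmarking pattern; `run_inference` here is a hypothetical stand-in for the `sess.run(pred)` call in inference.py, with an artificial sleep simulating the one-time warm-up cost:

```python
import time

def run_inference(_warmed_up=[False]):
    """Hypothetical stand-in for sess.run(pred); first call is slow."""
    if not _warmed_up[0]:
        time.sleep(0.05)  # simulate one-time warm-up cost
        _warmed_up[0] = True
    return "preds"

# Time 6 runs, but drop the first (warm-up) iteration before averaging.
timings = []
for i in range(6):
    start = time.time()
    run_inference()
    elapsed = time.time() - start
    if i > 0:  # skip the warm-up run
        timings.append(elapsed)

avg = sum(timings) / len(timings)
print("average inference time: %.4f s" % avg)
```

With this split, the reported average reflects steady-state inference rather than the first-call overhead that skewed the numbers above.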
