
Strange surface of inference results #11

Open
IridescentJiang opened this issue Mar 2, 2024 · 1 comment

Comments

@IridescentJiang

Thanks for your great work!
I encountered some problems during inference.
Would you please help me?
My inference results have strange surfaces, just like in #7.

I noticed that an error occurred, although it didn't stop the inference:

Resume MLP weights from ./data/ckpt/GTA.ckpt
Resume normal model from ./data/ckpt/normal.ckpt
Using pixie as HPS Estimator

Dataset Size: 5
  0%|          | 0/5 [00:00<?, ?it/s]
2024-03-02 16:02:28.809516226 [W:onnxruntime:Default, onnxruntime_pybind_state.cc:515 CreateExecutionProviderInstance] Failed to create TensorrtExecutionProvider.
Please reference https://onnxruntime.ai/docs/execution-providers/TensorRT-ExecutionProvider.html#requirements to ensure all dependencies are met.
1eca7a73c3c61d9debde493de37c7d99:   0%|          | 0/5 [00:06<?, ?it/s]
Body Fitting --- normal: 0.089 | silhouette: 0.043 | Total: 0.132:  12%|█▎        | 12/100 [00:01<00:13,  6.32it/s]
1eca7a73c3c61d9debde493de37c7d99:   0%|          | 0/5 [00:08<?, ?it/s]

Is it normal for this error to occur during inference?

I tried changing the onnxruntime-gpu and TensorRT versions, but it didn't help.

My environment is:
CUDA 11.7
PyTorch 1.13.1
onnxruntime-gpu 1.14
TensorRT 8.5.3.1

@River-Zhang
Owner

This should not happen: body fitting should run all 100 steps before moving to the next stage. We haven't encountered this problem before. You could debug the body fitting process to see why it stopped early.
