Same error messages every time for videos (doesn't happen when making a cropped image) #366

Open
Nugget2920 opened this issue Dec 22, 2022 · 5 comments

Comments

@Nugget2920

File "D:\SimSwap\SimSwap\test_video_swapsingle.py", line 58, in
app = Face_detect_crop(name='antelope', root='./insightface_func/models')
File "D:\SimSwap\SimSwap\insightface_func\face_detect_crop_single.py", line 40, in init
model = model_zoo.get_model(onnx_file)
File "C:\Users\chick\AppData\Local\Programs\Python\Python310\lib\site-packages\insightface\model_zoo\model_zoo.py", line 56, in get_model
model = router.get_model()
File "C:\Users\chick\AppData\Local\Programs\Python\Python310\lib\site-packages\insightface\model_zoo\model_zoo.py", line 23, in get_model
session = onnxruntime.InferenceSession(self.onnx_file, None)
File "C:\Users\chick\AppData\Local\Programs\Python\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 347, in init
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "C:\Users\chick\AppData\Local\Programs\Python\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 375, in _create_inference_session
raise ValueError(
ValueError: This ORT build has ['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'], ...)
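
The error message itself points at the fix: since onnxruntime 1.9, InferenceSession must be given an explicit providers list. A minimal sketch of what it expects (the model path below is a placeholder, not an actual SimSwap file):

import onnxruntime

# ORT >= 1.9 refuses to choose execution providers implicitly; they are
# listed in priority order. Keeping CPUExecutionProvider last gives a
# fallback on machines where TensorRT/CUDA are unavailable.
session = onnxruntime.InferenceSession(
    "path/to/model.onnx",  # placeholder path
    providers=['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'],
)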

@Nugget2920
Author

Also, pretty much every other time I use it, this error pops up:

raise ValueError(
ValueError: This ORT build has ['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'], ...)

@Nugget2920 Nugget2920 changed the title Same error messages every time for videos (doesn't happen when making an image) Same error messages every time for videos (doesn't happen when making a cropped image) Dec 22, 2022
@CoderDudeBrent

I'm getting this error with my Anaconda setup when trying to run this, and I can't figure it out either. There is a problem with the Colab version as well.

@strider1716

strider1716 commented Mar 8, 2023

Here is how I solved this issue.

Go to "C:\Users\chick\AppData\Local\Programs\Python\Python310\lib\site-packages\insightface\model_zoo\model_zoo.py" and edit it at line 56 or just above it.

I did this in conda, so I edited the file in "C:\Users\Jatin\anaconda3\envs\simswap\Lib\site-packages\insightface\model_zoo". In model_zoo.py I replaced

def get_model(self):
    session = onnxruntime.InferenceSession(self.onnx_file, None)

with

def get_model(self):
    session = onnxruntime.InferenceSession(self.onnx_file, providers=['TensorrtExecutionProvider', 'CUDAExecutionProvider'])

After this I had errors about np.float in the reverse2original file. I changed np.float to np.float64 on three lines in that Python file.
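
For anyone wondering about that second error: np.float was a deprecated alias for the builtin float and was removed in NumPy 1.24, so any dtype written as np.float now raises an AttributeError. The change is mechanical; here is a sketch of the before/after (the variable name is illustrative, not an actual line from reverse2original.py):

import numpy as np

# Before (breaks on NumPy >= 1.24):
#     mask = mask.astype(np.float)
# After (same values, explicit 64-bit dtype; the builtin float also works):
mask = np.ones((256, 256))      # illustrative array
mask = mask.astype(np.float64)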

@gillesvandevoorde

Solved it for me too, thanks!

Running an RTX 3060 Ti with CUDA 12.0 on Windows 11:

  • using Anaconda and the preparation steps
  • Python 3.8
  • conda install pytorch torchvision torchaudio pytorch-cuda=11.8 -c pytorch -c nvidia
  • the changes highlighted above by strider1716

@Liam6666

mark
