Same error messages every time for videos (doesn't happen when making a cropped image) #366
Also, pretty much every other time I use it, this error pops up: raise ValueError(
I'm getting this error in my Anaconda setup when trying to run this, and I can't figure it out either. There is a problem with the Colab version as well.
Here is how I solved this issue. Go to "C:\Users\chick\AppData\Local\Programs\Python\Python310\lib\site-packages\insightface\model_zoo\model_zoo.py" (I did this in conda, so I edited the file in "C:\Users\Jatin\anaconda3\envs\simswap\Lib\site-packages\insightface\model_zoo"). Inside def get_model(self):, change the onnxruntime.InferenceSession call so the execution providers are passed explicitly, as the error message asks. After this I had errors about np.float in the reverse2original file. I changed np.float to np.float64 in three lines in that Python file.
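A minimal sketch of that edit, based on the call shown in the traceback below; the providers list is taken from the error message itself, and the np.float line is illustrative rather than a quote of reverse2original.py:

```python
# insightface/model_zoo/model_zoo.py -- inside get_model (reached via
# router.get_model() in the traceback below). The old call was:
#     session = onnxruntime.InferenceSession(self.onnx_file, None)
# Since ONNX Runtime 1.9, the execution providers must be named explicitly.
session = onnxruntime.InferenceSession(
    self.onnx_file,
    providers=['CUDAExecutionProvider', 'CPUExecutionProvider'],
)
# ...rest of get_model unchanged.

# util/reverse2original.py -- NumPy 1.20+ removed the np.float alias, which
# causes the follow-up errors. Apply the same substitution on each offending
# line (this line is illustrative, not a quote of the file):
img_mask = img_mask.astype(np.float64)  # was: .astype(np.float)
```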
Solved it for me too, thanks! Running an RTX 3060 Ti with CUDA 12.0 on Windows 11.

mark
File "D:\SimSwap\SimSwap\test_video_swapsingle.py", line 58, in
app = Face_detect_crop(name='antelope', root='./insightface_func/models')
File "D:\SimSwap\SimSwap\insightface_func\face_detect_crop_single.py", line 40, in init
model = model_zoo.get_model(onnx_file)
File "C:\Users\chick\AppData\Local\Programs\Python\Python310\lib\site-packages\insightface\model_zoo\model_zoo.py", line 56, in get_model
model = router.get_model()
File "C:\Users\chick\AppData\Local\Programs\Python\Python310\lib\site-packages\insightface\model_zoo\model_zoo.py", line 23, in get_model
session = onnxruntime.InferenceSession(self.onnx_file, None)
File "C:\Users\chick\AppData\Local\Programs\Python\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 347, in init
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "C:\Users\chick\AppData\Local\Programs\Python\Python310\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 375, in _create_inference_session
raise ValueError(
ValueError: This ORT build has ['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'], ...)
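For verification, here is a standalone sketch of the post-1.9 API that the error message describes; the model path is hypothetical and just stands in for one of the antelope pack's .onnx files:

```python
import onnxruntime

# Hypothetical path, standing in for one of the antelope pack's models.
onnx_file = './insightface_func/models/antelope/scrfd_10g_bnkps.onnx'

print(onnxruntime.get_available_providers())  # providers this ORT build supports

# Pre-1.9 style -- raises the ValueError above on ORT >= 1.9:
#     session = onnxruntime.InferenceSession(onnx_file, None)

# Post-1.9 style: list the providers explicitly, in preference order.
session = onnxruntime.InferenceSession(
    onnx_file,
    providers=['CUDAExecutionProvider', 'CPUExecutionProvider'],
)
print(session.get_providers())  # which providers actually loaded
```

If the CUDA provider fails to initialize, ORT falls back to the next provider in the list, so keeping CPUExecutionProvider last is a safe default.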