No ONNX file found. Exporting ONNX… #54
Comments
It is accurate, watch this.
Sorry, I'm not paying to join your Patreon.
This doesn't look like an error; it's notifying you that this checkpoint doesn't have an ONNX file yet. What is the rest of the console output?
I just had this issue myself, but after some troubleshooting it seems that, for whatever reason, the command-line argument --medvram was causing the problem. After removing it and restarting, I was able to generate the defaults. Not sure if you have that flag enabled or not, but it's worth a look.
That's all that was displayed in the SD output. In the command prompt, it was throwing an error about "tensors on multiple devices". Looks like it was related to using --medvram, because it did export correctly when I restarted SD without that flag.
Ha, you beat me to posting that by literally one second. :)
Sorry for the confusion; we have added the --medvram flag to the known issues in the README. Are we good to close?
I think so. Thanks for checking on this.
Which file is that flag set in? Can you show me how to change it?
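For anyone with the same question: in the AUTOMATIC1111 Stable Diffusion web UI, launch flags normally live in `webui-user.bat` (Windows) or `webui-user.sh` (Linux/macOS) next to the launcher. A minimal sketch of removing `--medvram` there, assuming the stock file layout (the exact flags in your file will differ):

```shell
# webui-user.sh (Linux/macOS launcher for stable-diffusion-webui).
# On Windows, the equivalent line in webui-user.bat is:
#   set COMMANDLINE_ARGS=...
#
# If the line currently reads, for example:
#   export COMMANDLINE_ARGS="--medvram"
# delete --medvram (keep any other flags you use) and restart the web UI:
export COMMANDLINE_ARGS=""
```

With `--medvram` removed, the model stays on a single device during the ONNX export, which avoids the "tensors on multiple devices" error described above.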
Can't "export default engine". I've tried 20 different checkpoints, all return the same error immediately: