While trying to run the models server, I ran into the following error:
```
raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model models/openai/whisper-base with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCTC'>, <class 'transformers.models.auto.modeling_auto.AutoModelForSpeechSeq2Seq'>, <class 'transformers.models.whisper.modeling_whisper.WhisperForConditionalGeneration'>).
```
I tried redoing all the installation steps, removing and recreating the conda env, and making sure that all Python dependencies are installed and that both API tokens are configured. What could be going wrong?