Created RunPod template for easy deploy #58
Comments
Hi @kodxana, thanks for the blog post! The directory for storing/caching models is specified by an environment variable. Here are some recommended environment variables to include in the template:
For a complete list of environment variables, please refer to the Dockerfile.
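As an illustration, a template could pin settings along these lines. The model name below is only a placeholder, and the variable names should be confirmed against Basaran's Dockerfile before relying on them:

```shell
# Sketch of template environment settings — confirm the variable names against
# Basaran's Dockerfile before relying on them.
MODEL=bigscience/bloomz-560m   # placeholder; any HF Transformers-based model should work
PORT=80                        # port the API and web UI listen on
```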
Updated the template to include the recommended settings. Question: is the GPT-Nano model compatible, by chance?
I haven't tried it yet, but theoretically Basaran supports all 🤗 Transformers-based models. If the model is not available on the HF Hub, you can load it from a local path; the relevant settings are listed in the Dockerfile.
Great to hear it's working well for you, @kodxana!
First of all, I want to say I love Basaran; it's more OpenAI than OpenAI itself :)
I decided to give it a go and made a template for Basaran to run on the RunPod GPU service, and it works well, including the UI and API endpoints.
Hopefully this helps users without their own GPU to enjoy it as well.
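Since the API is OpenAI-compatible, a client can stream completions from the pod the same way it would from OpenAI's endpoint, i.e. as server-sent events. A minimal sketch of parsing one streamed chunk; the payload values are illustrative, not taken from the template:

```python
import json

def parse_sse_chunk(line: str):
    """Decode one 'data: ...' server-sent-event line from a streaming
    completions response. Returns None for non-data lines and for the
    [DONE] sentinel that ends an OpenAI-style stream."""
    if not line.startswith("data: "):
        return None
    payload = line[len("data: "):]
    if payload.strip() == "[DONE]":
        return None
    return json.loads(payload)

# Illustrative chunk shaped like an OpenAI completion event:
chunk = 'data: {"choices": [{"text": "Hello", "index": 0}]}'
event = parse_sse_chunk(chunk)
print(event["choices"][0]["text"])  # Hello
```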
https://runpod.io/gsc?template=7ito7h393l&ref=vfker49t
I will be publishing a blog post soon, so I will share the link later.
I also have a question about the location of the models. For now the container saves models to temporary storage; if you let me know where models are saved, I will adjust the template to allow saving to volume storage so users can avoid re-downloading models every time.
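One way this could work, assuming Basaran honors the standard Hugging Face cache variables (an assumption; the project's Dockerfile has the authoritative setting), is to point the cache at RunPod's persistent volume, which is mounted at /workspace by default:

```shell
# Sketch: cache models on the persistent volume instead of container-local storage.
# HF_HOME is the standard Hugging Face cache variable; Basaran's Dockerfile may
# expose its own cache setting instead.
VOLUME_DIR="${VOLUME_DIR:-/workspace}"   # RunPod mounts the volume at /workspace by default
export HF_HOME="$VOLUME_DIR/huggingface"
echo "model cache: $HF_HOME"
```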
Thank you again for the amazing work :)