How to replace the default LLaMA3 model with my fine-tuned LLaMA3? #220
Comments
@win4r The tutorial you linked demonstrates running llama3 using Ollama. You can use any model that's available on Ollama with Phidata by passing the model id as a param to the Ollama class. Can you share more details about your fine-tuned llama3 model? How are you hosting/running the model?
After swapping in my fine-tuned model, I received an error: ResponseError: model 'llama3' not found, try pulling it first
I have the same problem. Marking this issue and waiting for an answer.
If I follow the steps outlined at https://github.com/phidatahq/phidata/tree/main/cookbook/llms/ollama/rag, can I replace the default LLaMA3 model with my fine-tuned LLaMA3?
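For context on the "model 'llama3' not found" error reported above: that message usually means the Ollama server has no model registered under the requested id. A hedged sketch of how a fine-tuned model is typically registered, assuming the weights have been exported to GGUF (the file path and model id below are placeholders, not names from this thread): write a Modelfile, then run `ollama create my-finetuned-llama3 -f Modelfile`.

```
# Modelfile (sketch) — the GGUF path is a placeholder for your
# exported fine-tuned weights.
FROM ./my-finetuned-llama3.gguf

# Optional generation parameter; adjust to taste.
PARAMETER temperature 0.7
```

Once `ollama create` succeeds, `ollama run my-finetuned-llama3` should work from the CLI, and the same id can be passed as the model param wherever the cookbook uses "llama3".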