- You'd have to follow the documentation for "Other LLM".
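The "Other LLM" route boils down to implementing a small connector: three message helpers plus a `submit_prompt()` method, in the shape Vanna's custom-LLM docs describe. Here is a minimal sketch of that shape. To stay runnable without the 7B weights, the model call is injected as a callable; in real use you would pass a function wrapping a Hugging Face `transformers` text-generation pipeline over `CodeLlama-7b-Instruct-hf` and subclass `vanna.base.VannaBase` instead of `object`. The class name, the injected-callable design, and the prompt-flattening logic are illustrative assumptions, not Vanna's exact interface, so check the current docs for the precise method signatures.

```python
# Sketch of a custom local-LLM connector following the pattern in
# Vanna's "Other LLM" docs (message helpers + submit_prompt).
# ASSUMPTION: the generate callable stands in for a real model call,
# e.g. a transformers pipeline over CodeLlama-7b-Instruct-hf.
from typing import Callable, Dict, List


class LocalLLMConnector:  # in real use: subclass vanna.base.VannaBase
    def __init__(self, generate: Callable[[str], str]):
        # generate: takes the flattened prompt text, returns model output.
        self.generate = generate

    # Vanna builds prompts as a list of role-tagged messages.
    def system_message(self, message: str) -> Dict[str, str]:
        return {"role": "system", "content": message}

    def user_message(self, message: str) -> Dict[str, str]:
        return {"role": "user", "content": message}

    def assistant_message(self, message: str) -> Dict[str, str]:
        return {"role": "assistant", "content": message}

    def submit_prompt(self, prompt: List[Dict[str, str]], **kwargs) -> str:
        # Flatten the role-tagged messages into one instruct-style prompt.
        text = "\n".join(f"{m['role']}: {m['content']}" for m in prompt)
        return self.generate(text)


# Smoke test with a stand-in "model" that echoes the last prompt line.
echo = LocalLLMConnector(generate=lambda text: f"SQL for: {text.splitlines()[-1]}")
msgs = [echo.system_message("You write SQL."), echo.user_message("count users")]
print(echo.submit_prompt(msgs))  # → SQL for: user: count users
```

In real use the lambda would be replaced by something like `pipeline("text-generation", model="codellama/CodeLlama-7b-Instruct-hf")` with the output text extracted from the pipeline's return value.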
-
Describe the bug
I cannot find documentation explaining how to access a local LLM without Ollama.
To Reproduce
I have downloaded CodeLlama-7b-Instruct-hf on my Mac, but the Vanna.ai documentation has no example describing how to use this kind of LLM.
Expected behavior
Is there any way to access this kind of LLM? Thanks in advance.