ollama: 'llama2' not found, try pulling it first #311
Comments
It's defaulting to using OpenAI for the core model. Can you set
STEP 99 Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
Traceback (most recent call last): ...
During handling of the above exception, another exception occurred:
Traceback (most recent call last): ...
It exited before finishing, but when I run ollama list it gives: r3versein@DESKTOP-IL31CM9:~$ ollama list
Oh, I bet it's related to this: #285
Actually, according to the docs, you should be fine: https://docs.litellm.ai/docs/providers/ollama
So it seems like an ollama issue...
It looks like litellm thinks the model name is just
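The "model name" confusion above comes from how litellm routes requests: the provider is chosen by the prefix before the slash in the model string, and a bare name like `llama2` falls back to the default OpenAI provider, which is what triggers the `OPENAI_API_KEY` error below. An illustrative sketch of that routing convention (not litellm's actual code):

```python
def split_provider(model: str) -> tuple[str, str]:
    """Illustrative sketch of litellm-style "provider/model" routing.

    A bare model name (no "provider/" prefix) falls back to the
    default OpenAI provider, which is why "llama2" alone raises an
    OPENAI_API_KEY error instead of hitting the local ollama server.
    """
    if "/" in model:
        provider, name = model.split("/", 1)
        return provider, name
    return "openai", model


print(split_provider("ollama/llama2"))  # ('ollama', 'llama2')
print(split_provider("llama2"))         # ('openai', 'llama2')
```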
Yes, both done. Still the same issue.
I got it running using
@R3verseIN can you check your open ports to see if
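One quick way to check whether anything is actually listening on ollama's default port (11434 per ollama's docs; the helper below is just a sketch):

```python
import socket


def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# ollama listens on 11434 by default
print(port_open("127.0.0.1", 11434))
```

If this prints `False`, the server isn't up and litellm's requests have nowhere to go.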
Fixed the issue. |
I'm running llama2 in LMStudio and I'm running these:
And I get this error:
If anyone has ideas on what to do, I'm all ears. So far, everything works until I send the command in the UI, where I get this: And I get this in LMStudio:
For
Try running
Otherwise, it seems like your ollama server isn't behaving as expected. LiteLLM expects to get results from the endpoint. I'm not familiar with LMStudio, but my guess is that you have to run ollama without LMStudio for it to work.
I think I found out why here: llama 2 does not support embeddings?
Hmm... I think we're using llamaindex for the embeddings, not litellm.
I get this error now: I started from scratch with LMStudio (running nitsuai/llama-2-70b-Guanaco-QLoRA-GGUF/llama-2-70b-guanaco-qlora.Q3_K_S.gguf) and these parameters for LiteLLM: I won't take any more of your time, since I don't have sufficient knowledge to make a constructive contribution to the project, but if you wish to know more about my setup, don't hesitate. Thanks for the help, and great project!
This is an ollama issue; it means the model has not started running on the ollama server.
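If that's the case, pulling and starting the model on the ollama side usually resolves it. A sketch using ollama's standard CLI (the model name `llama2` is assumed from this thread):

```shell
# download the model weights if they aren't present yet
ollama pull llama2

# verify the model shows up locally
ollama list

# run the model once so the server has it loaded
ollama run llama2 "hello"
```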
@10htts, LMStudio is already OpenAI-compatible, so pass model=
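Since LM Studio exposes an OpenAI-compatible server, the call can be routed through litellm's `openai/` provider pointed at LM Studio's base URL. A hedged sketch of the settings involved; the port 1234 (LM Studio's usual default), the model name, and the dummy key are all assumptions, not values from this thread:

```python
# Hypothetical settings for pointing litellm at LM Studio's
# OpenAI-compatible local server.
lmstudio_kwargs = {
    "model": "openai/local-model",           # "openai/" prefix -> OpenAI provider
    "api_base": "http://localhost:1234/v1",  # LM Studio default port (assumed)
    "api_key": "not-needed-but-required",    # client rejects an empty key
}

# The actual call would look like this (requires litellm and a running server):
# import litellm
# resp = litellm.completion(
#     messages=[{"role": "user", "content": "hi"}],
#     **lmstudio_kwargs,
# )

print(lmstudio_kwargs["model"])
```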
Closing in favor of #417
STEP 3
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True`.
Traceback (most recent call last):
File "/home/r3versein/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 376, in completion
raise e
File "/home/r3versein/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 312, in completion
openai_client = OpenAI(
File "/home/r3versein/.local/lib/python3.10/site-packages/openai/_client.py", line 98, in __init__
raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/r3versein/.local/lib/python3.10/site-packages/litellm/main.py", line 989, in completion
raise e
File "/home/r3versein/.local/lib/python3.10/site-packages/litellm/main.py", line 962, in completion
response = openai_chat_completions.completion(
File "/home/r3versein/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 384, in completion
raise OpenAIError(status_code=500, message=traceback.format_exc())
litellm.llms.openai.OpenAIError: Traceback (most recent call last):
File "/home/r3versein/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 376, in completion
raise e
File "/home/r3versein/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 312, in completion
openai_client = OpenAI(
File "/home/r3versein/.local/lib/python3.10/site-packages/openai/_client.py", line 98, in __init__
raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/r3versein/OpenDevin/opendevin/controller/__init__.py", line 85, in step
action = self.agent.step(state)
File "/home/r3versein/OpenDevin/agenthub/langchains_agent/langchains_agent.py", line 172, in step
resp = self.llm.completion(messages=messages)
File "/home/r3versein/.local/lib/python3.10/site-packages/litellm/utils.py", line 2796, in wrapper
raise e
File "/home/r3versein/.local/lib/python3.10/site-packages/litellm/utils.py", line 2693, in wrapper
result = original_function(*args, **kwargs)
File "/home/r3versein/.local/lib/python3.10/site-packages/litellm/main.py", line 2093, in completion
raise exception_type(
File "/home/r3versein/.local/lib/python3.10/site-packages/litellm/utils.py", line 8283, in exception_type
raise e
File "/home/r3versein/.local/lib/python3.10/site-packages/litellm/utils.py", line 7069, in exception_type
raise AuthenticationError(
litellm.exceptions.AuthenticationError: OpenAIException - Traceback (most recent call last):
File "/home/r3versein/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 376, in completion
raise e
File "/home/r3versein/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 312, in completion
openai_client = OpenAI(
File "/home/r3versein/.local/lib/python3.10/site-packages/openai/_client.py", line 98, in __init__
raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
==============
STEP 4 (identical traceback to STEP 3)
The executed commands are:
export LLM_EMBEDDING_MODEL="llama2"
export LLM_BASE_URL="http://localhost:11434"
export LLM_API_KEY=""
export WORKSPACE_DIR="/home/r3versein/work/"
uvicorn opendevin.server.listen:app --port 3000
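Note that these exports set only LLM_EMBEDDING_MODEL, so the core model falls back to OpenAI, matching the authentication error in the traceback. A sketch of an environment that routes the core model through ollama instead (the LLM_MODEL variable name and the `ollama/` model prefix are assumptions drawn from the discussion above, not confirmed config):

```shell
# Route the core model through ollama instead of the OpenAI default.
# LLM_MODEL and the "ollama/" prefix are assumed from this thread.
export LLM_MODEL="ollama/llama2"
export LLM_EMBEDDING_MODEL="llama2"
export LLM_BASE_URL="http://localhost:11434"
export WORKSPACE_DIR="/home/r3versein/work/"
uvicorn opendevin.server.listen:app --port 3000
```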