
Can't get connection to Oobabooga with the OpenAI extension to work. #1434

@kjekro

Description


Describe the bug

I get a LiteLLM error when I try to connect to Oobabooga's OpenAI-compatible API.
Connecting to the same API from SillyTavern works fine.

On a tangent: LiteLLM apparently requires an API key to be set or it raises a separate error, so I have simply set "x" as the API key; that dummy key also works fine in SillyTavern.
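
For illustration, a minimal sketch of the dummy-key setup when talking to a local OpenAI-compatible endpoint directly with the openai client. The /v1 suffix and a server listening on port 5000 are assumptions based on how Oobabooga's OpenAI extension usually exposes its routes:

```python
# Sketch, not from the report: local OpenAI-compatible servers typically
# accept any placeholder API key, but the client library insists one is set.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:5000/v1",  # assumed /v1 route of the extension
    api_key="x",                          # dummy value; the server ignores it
)

response = client.chat.completions.create(
    model="Tiger-Gemma-9B-v2s-Q5_K_M.gguf",  # model name from the report
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```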

Reproduce

(open-interpreter) PS D:\ai\open-interpreter\open-interpreter> interpreter --api_base "http://localhost:5000" --api_key "x" --model openai/Tiger-Gemma-9B-v2s-Q5_K_M.gguf --no-llm_supports_functions

▌ A new version of Open Interpreter is available.

▌ Please run: pip install --upgrade open-interpreter

──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────

▌ Model set to openai/Tiger-Gemma-9B-v2s-Q5_K_M.gguf

Open Interpreter will require approval before running code.

Use interpreter -y to bypass this.

Press CTRL-C to exit.

d

We were unable to determine the context window of this model. Defaulting to 8000.

If your model can handle more, run interpreter --context_window {token limit} --max_tokens {max tokens per response}.

Continuing...

Traceback (most recent call last):
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\litellm\llms\openai.py", line 1035, in completion
raise e
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\litellm\llms\openai.py", line 912, in completion
return self.streaming(
^^^^^^^^^^^^^^^
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\litellm\llms\openai.py", line 1173, in streaming
headers, response = self.make_sync_openai_chat_completion_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\litellm\llms\openai.py", line 818, in make_sync_openai_chat_completion_request
raise e
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\litellm\llms\openai.py", line 807, in make_sync_openai_chat_completion_request
raw_response = openai_client.chat.completions.with_raw_response.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\openai_legacy_response.py", line 350, in wrapped
return cast(LegacyAPIResponse[R], func(*args, **kwargs))
^^^^^^^^^^^^^^^^^^^^^
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\openai_utils_utils.py", line 274, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\openai\resources\chat\completions.py", line 668, in create
return self._post(
^^^^^^^^^^^
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\openai_base_client.py", line 1260, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\openai_base_client.py", line 937, in request
return self._request(
^^^^^^^^^^^^^^
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\openai_base_client.py", line 1041, in _request
raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'detail': 'Not Found'}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\litellm\main.py", line 1360, in completion
raise e
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\litellm\main.py", line 1333, in completion
response = openai_chat_completions.completion(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\litellm\llms\openai.py", line 1042, in completion
raise OpenAIError(
litellm.llms.openai.OpenAIError: Error code: 404 - {'detail': 'Not Found'}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "", line 198, in _run_module_as_main
File "", line 88, in run_code
File "D:\ai\open-interpreter\open-interpreter\Scripts\interpreter.exe_main
.py", line 7, in
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\interpreter\terminal_interface\start_terminal_interface.py", line 588, in main
start_terminal_interface(interpreter)
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\interpreter\terminal_interface\start_terminal_interface.py", line 554, in start_terminal_interface
interpreter.chat()
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\interpreter\core\core.py", line 191, in chat
for _ in self._streaming_chat(message=message, display=display):
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\interpreter\core\core.py", line 223, in _streaming_chat
yield from terminal_interface(self, message)
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\interpreter\terminal_interface\terminal_interface.py", line 155, in terminal_interface
for chunk in interpreter.chat(message, display=False, stream=True):
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\interpreter\core\core.py", line 259, in _streaming_chat
yield from self._respond_and_store()
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\interpreter\core\core.py", line 318, in _respond_and_store
for chunk in respond(self):
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\interpreter\core\respond.py", line 87, in respond
for chunk in interpreter.llm.run(messages_for_llm):
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\interpreter\core\llm\llm.py", line 319, in run
yield from run_text_llm(self, params)
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\interpreter\core\llm\run_text_llm.py", line 20, in run_text_llm
for chunk in llm.completions(**params):
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\interpreter\core\llm\llm.py", line 459, in fixed_litellm_completions
raise first_error # If all attempts fail, raise the first error
^^^^^^^^^^^^^^^^^
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\interpreter\core\llm\llm.py", line 436, in fixed_litellm_completions
yield from litellm.completion(**params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\litellm\utils.py", line 1082, in wrapper
raise e
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\litellm\utils.py", line 970, in wrapper
result = original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\litellm\main.py", line 2782, in completion
raise exception_type(
^^^^^^^^^^^^^^^
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\litellm\utils.py", line 8525, in exception_type
raise e
File "D:\ai\open-interpreter\open-interpreter\Lib\site-packages\litellm\utils.py", line 6875, in exception_type
raise NotFoundError(
litellm.exceptions.NotFoundError: litellm.NotFoundError: NotFoundError: OpenAIException - Error code: 404 - {'detail': 'Not Found'}
(open-interpreter) PS D:\ai\open-interpreter\open-interpreter>
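
For reference, a minimal litellm-only sketch (my assumptions, not part of the report) to test the endpoint outside Open Interpreter. The openai client posts to {api_base}/chat/completions, so api_base "http://localhost:5000" resolves to http://localhost:5000/chat/completions; if Oobabooga's OpenAI extension serves its routes under /v1, that would explain the 404 above:

```python
# Sketch under the assumption that the extension serves /v1/chat/completions.
import litellm

response = litellm.completion(
    model="openai/Tiger-Gemma-9B-v2s-Q5_K_M.gguf",  # model from the report
    api_base="http://localhost:5000/v1",  # note /v1; plain :5000 404s as above
    api_key="x",                          # dummy key, as in the report
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```

If that succeeds, the same base should work from the CLI: interpreter --api_base "http://localhost:5000/v1" --api_key "x" --model openai/Tiger-Gemma-9B-v2s-Q5_K_M.gguf --no-llm_supports_functions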

Expected behavior

To work, I guess. Why is this so hard?

Screenshots

(screenshot attached)

Open Interpreter version

0.3.8

Python version

3.12.4

Operating System name and version

Windows 10

Additional context

I installed OI in a venv via the PowerShell method, in case that matters.
