Run with litellm #10

Closed
superuben opened this issue Dec 4, 2023 · 8 comments
Labels
solution provided: Solution provided, waiting for user feedback before closing it.

Comments

@superuben

/home/user/TaskWeaver/project/taskweaver_config.json
{
"llm.api_base": "http://0.0.0.0:8000/v1",
"llm.api_key": "",
"llm.model": "gpt-3.5-turbo"
}

TaskWeaver$ python -m taskweaver -p ./project/


[TaskWeaver ASCII-art banner]

TaskWeaver: I am TaskWeaver, an AI assistant. To get started, could you please enter your request?
Human: heya
Error: Cannot process your request due to Exception: OpenAI API request failed to connect: Connection error.
Traceback (most recent call last):
File "/home/user/TaskWeaver/taskweaver/llm/init.py", line 146, in wrapper
raise Exception("cassette_mode or cassette_path is not set")
Exception: cassette_mode or cassette_path is not set

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_exceptions.py", line 10, in map_exceptions
yield
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/http11.py", line 142, in _send_request_headers
event = h11.Request(
^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/h11/_events.py", line 96, in init
self, "headers", normalize_and_validate(headers, _parsed=_parsed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/h11/_headers.py", line 164, in normalize_and_validate
validate(_field_value_re, value, "Illegal header value {!r}", value)
File "/home/user/.local/lib/python3.11/site-packages/h11/_util.py", line 91, in validate
raise LocalProtocolError(msg)
h11._util.LocalProtocolError: Illegal header value b'Bearer '

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/user/.local/lib/python3.11/site-packages/httpx/_transports/default.py", line 66, in map_httpcore_exceptions
yield
File "/home/user/.local/lib/python3.11/site-packages/httpx/_transports/default.py", line 228, in handle_request
resp = self._pool.handle_request(req)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py", line 268, in handle_request
raise exc
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py", line 251, in handle_request
response = connection.handle_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 103, in handle_request
return self._connection.handle_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/http11.py", line 133, in handle_request
raise exc
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/http11.py", line 92, in handle_request
self._send_request_headers(**kwargs)
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/http11.py", line 141, in _send_request_headers
with map_exceptions({h11.LocalProtocolError: LocalProtocolError}):
File "/home/user/miniconda3/lib/python3.11/contextlib.py", line 155, in exit
self.gen.throw(typ, value, traceback)
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
raise to_exc(exc) from exc
httpcore.LocalProtocolError: Illegal header value b'Bearer '

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/user/miniconda3/lib/python3.11/site-packages/openai/_base_client.py", line 882, in _request
response = self._client.send(
^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpx/_client.py", line 901, in send
response = self._send_handling_auth(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpx/_client.py", line 929, in _send_handling_auth
response = self._send_handling_redirects(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpx/_client.py", line 966, in _send_handling_redirects
response = self._send_single_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpx/_client.py", line 1002, in _send_single_request
response = transport.handle_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpx/_transports/default.py", line 227, in handle_request
with map_httpcore_exceptions():
File "/home/user/miniconda3/lib/python3.11/contextlib.py", line 155, in exit
self.gen.throw(typ, value, traceback)
File "/home/user/.local/lib/python3.11/site-packages/httpx/_transports/default.py", line 83, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.LocalProtocolError: Illegal header value b'Bearer '

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_exceptions.py", line 10, in map_exceptions
yield
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/http11.py", line 142, in _send_request_headers
event = h11.Request(
^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/h11/_events.py", line 96, in init
self, "headers", normalize_and_validate(headers, _parsed=_parsed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/h11/_headers.py", line 164, in normalize_and_validate
validate(_field_value_re, value, "Illegal header value {!r}", value)
File "/home/user/.local/lib/python3.11/site-packages/h11/_util.py", line 91, in validate
raise LocalProtocolError(msg)
h11._util.LocalProtocolError: Illegal header value b'Bearer '

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/user/.local/lib/python3.11/site-packages/httpx/_transports/default.py", line 66, in map_httpcore_exceptions
yield
File "/home/user/.local/lib/python3.11/site-packages/httpx/_transports/default.py", line 228, in handle_request
resp = self._pool.handle_request(req)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py", line 268, in handle_request
raise exc
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py", line 251, in handle_request
response = connection.handle_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 103, in handle_request
return self._connection.handle_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/http11.py", line 133, in handle_request
raise exc
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/http11.py", line 92, in handle_request
self._send_request_headers(**kwargs)
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/http11.py", line 141, in _send_request_headers
with map_exceptions({h11.LocalProtocolError: LocalProtocolError}):
File "/home/user/miniconda3/lib/python3.11/contextlib.py", line 155, in exit
self.gen.throw(typ, value, traceback)
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
raise to_exc(exc) from exc
httpcore.LocalProtocolError: Illegal header value b'Bearer '

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/user/miniconda3/lib/python3.11/site-packages/openai/_base_client.py", line 882, in _request
response = self._client.send(
^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpx/_client.py", line 901, in send
response = self._send_handling_auth(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpx/_client.py", line 929, in _send_handling_auth
response = self._send_handling_redirects(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpx/_client.py", line 966, in _send_handling_redirects
response = self._send_single_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpx/_client.py", line 1002, in _send_single_request
response = transport.handle_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpx/_transports/default.py", line 227, in handle_request
with map_httpcore_exceptions():
File "/home/user/miniconda3/lib/python3.11/contextlib.py", line 155, in exit
self.gen.throw(typ, value, traceback)
File "/home/user/.local/lib/python3.11/site-packages/httpx/_transports/default.py", line 83, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.LocalProtocolError: Illegal header value b'Bearer '

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_exceptions.py", line 10, in map_exceptions
yield
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/http11.py", line 142, in _send_request_headers
event = h11.Request(
^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/h11/_events.py", line 96, in init
self, "headers", normalize_and_validate(headers, _parsed=_parsed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/h11/_headers.py", line 164, in normalize_and_validate
validate(_field_value_re, value, "Illegal header value {!r}", value)
File "/home/user/.local/lib/python3.11/site-packages/h11/_util.py", line 91, in validate
raise LocalProtocolError(msg)
h11._util.LocalProtocolError: Illegal header value b'Bearer '

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/user/.local/lib/python3.11/site-packages/httpx/_transports/default.py", line 66, in map_httpcore_exceptions
yield
File "/home/user/.local/lib/python3.11/site-packages/httpx/_transports/default.py", line 228, in handle_request
resp = self._pool.handle_request(req)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py", line 268, in handle_request
raise exc
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py", line 251, in handle_request
response = connection.handle_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 103, in handle_request
return self._connection.handle_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/http11.py", line 133, in handle_request
raise exc
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/http11.py", line 92, in handle_request
self._send_request_headers(**kwargs)
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_sync/http11.py", line 141, in _send_request_headers
with map_exceptions({h11.LocalProtocolError: LocalProtocolError}):
File "/home/user/miniconda3/lib/python3.11/contextlib.py", line 155, in exit
self.gen.throw(typ, value, traceback)
File "/home/user/.local/lib/python3.11/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
raise to_exc(exc) from exc
httpcore.LocalProtocolError: Illegal header value b'Bearer '

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/user/miniconda3/lib/python3.11/site-packages/openai/_base_client.py", line 882, in _request
response = self._client.send(
^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpx/_client.py", line 901, in send
response = self._send_handling_auth(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpx/_client.py", line 929, in _send_handling_auth
response = self._send_handling_redirects(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpx/_client.py", line 966, in _send_handling_redirects
response = self._send_single_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpx/_client.py", line 1002, in _send_single_request
response = transport.handle_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/.local/lib/python3.11/site-packages/httpx/_transports/default.py", line 227, in handle_request
with map_httpcore_exceptions():
File "/home/user/miniconda3/lib/python3.11/contextlib.py", line 155, in exit
self.gen.throw(typ, value, traceback)
File "/home/user/.local/lib/python3.11/site-packages/httpx/_transports/default.py", line 83, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.LocalProtocolError: Illegal header value b'Bearer '

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/user/TaskWeaver/taskweaver/llm/init.py", line 423, in chat_completion
res: Any = client.chat.completions.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/miniconda3/lib/python3.11/site-packages/openai/_utils/_utils.py", line 301, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/user/miniconda3/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 598, in create
return self._post(
^^^^^^^^^^^
File "/home/user/miniconda3/lib/python3.11/site-packages/openai/_base_client.py", line 1096, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/miniconda3/lib/python3.11/site-packages/openai/_base_client.py", line 856, in request
return self._request(
^^^^^^^^^^^^^^
File "/home/user/miniconda3/lib/python3.11/site-packages/openai/_base_client.py", line 929, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/home/user/miniconda3/lib/python3.11/site-packages/openai/_base_client.py", line 966, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/home/user/miniconda3/lib/python3.11/site-packages/openai/_base_client.py", line 929, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/home/user/miniconda3/lib/python3.11/site-packages/openai/_base_client.py", line 966, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/home/user/miniconda3/lib/python3.11/site-packages/openai/_base_client.py", line 938, in _request
raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/user/TaskWeaver/taskweaver/session/session.py", line 124, in send_message
post = _send_message(post.send_to, post)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/TaskWeaver/taskweaver/session/session.py", line 96, in _send_message
reply_post = self.planner.reply(
^^^^^^^^^^^^^^^^^^^
File "/home/user/TaskWeaver/taskweaver/planner/planner.py", line 179, in reply
llm_output = self.llm_api.chat_completion(chat_history, use_backup_engine=use_back_up_engine)["content"]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/TaskWeaver/taskweaver/llm/init.py", line 160, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/home/user/TaskWeaver/taskweaver/llm/init.py", line 453, in chat_completion
raise Exception(f"OpenAI API request failed to connect: {e}")
Exception: OpenAI API request failed to connect: Connection error.

Human:

@liqul
Contributor

liqul commented Dec 4, 2023

@superuben From the error messages you posted, it looks like an empty api key was configured. Could you please check if you have successfully configured your OpenAI api key?
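
For context, the h11 error in the trace ("Illegal header value b'Bearer '") is exactly what an empty llm.api_key produces: the client builds the Authorization header as "Bearer " plus the key, and with an empty key the trailing space makes the value illegal. A minimal config sketch that avoids this, assuming the litellm proxy does not actually validate the key (the placeholder value is arbitrary):

{
"llm.api_base": "http://0.0.0.0:8000/v1",
"llm.api_key": "sk-placeholder",
"llm.model": "gpt-3.5-turbo"
}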

@superuben
Author

superuben commented Dec 4, 2023

Look all the way at the top of my message: the "llm.api_key": "" config is my own input. I run litellm on port 8000 with an OpenAI-compatible endpoint, so no key is needed. Curling it works fine:
[image attachment]
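
(For reference, the equivalent check through the openai Python client, which is the path TaskWeaver takes, would look roughly like the sketch below; the non-empty placeholder key is an assumption, since an empty key is what triggers the Bearer header error above.)

from openai import OpenAI

# Sketch: exercise the local litellm proxy through the openai 1.x client,
# the same client TaskWeaver uses under the hood.
client = OpenAI(base_url="http://0.0.0.0:8000/v1", api_key="sk-placeholder")
resp = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "heya"}],
)
print(resp.choices[0].message.content)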

@superuben
Author

from another terminal:

python -m taskweaver -p ./project/
Traceback (most recent call last):
File "", line 198, in _run_module_as_main
File "", line 88, in _run_code
File "/home/user/TaskWeaver/taskweaver/main.py", line 1, in
from .cli import main
File "/home/user/TaskWeaver/taskweaver/cli/main.py", line 1, in
from .cli import taskweaver
File "/home/user/TaskWeaver/taskweaver/cli/cli.py", line 3, in
from ..app import TaskWeaverApp
File "/home/user/TaskWeaver/taskweaver/app/init.py", line 1, in
from .app import TaskWeaverApp
File "/home/user/TaskWeaver/taskweaver/app/app.py", line 12, in
from taskweaver.session.session import Session
File "/home/user/TaskWeaver/taskweaver/session/init.py", line 1, in
from .session import Session
File "/home/user/TaskWeaver/taskweaver/session/session.py", line 7, in
from taskweaver.code_interpreter import CodeInterpreter
File "/home/user/TaskWeaver/taskweaver/code_interpreter/init.py", line 1, in
from .code_interpreter import CodeInterpreter
File "/home/user/TaskWeaver/taskweaver/code_interpreter/code_interpreter.py", line 6, in
from taskweaver.code_interpreter.code_generator import (
File "/home/user/TaskWeaver/taskweaver/code_interpreter/code_generator/init.py", line 1, in
from .code_generator import CodeGenerator, CodeGeneratorConfig, format_code_revision_message
File "/home/user/TaskWeaver/taskweaver/code_interpreter/code_generator/code_generator.py", line 8, in
from taskweaver.llm import LLMApi
File "/home/user/TaskWeaver/taskweaver/llm/init.py", line 6, in
from openai import AzureOpenAI, OpenAI
ImportError: cannot import name 'AzureOpenAI' from 'openai' (/home/user/.venv/lib/python3.11/site-packages/openai/init.py)
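
(Note: this second traceback is a separate problem. The openai package installed in /home/user/.venv predates the 1.x client that introduced AzureOpenAI, so the import in taskweaver/llm/__init__.py fails before any request is made. Upgrading the openai package in that environment, e.g. with pip install --upgrade openai, should clear the ImportError, assuming TaskWeaver targets the 1.x client as the first traceback suggests.)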

@liqul
Contributor

liqul commented Dec 4, 2023

We haven't tested with litellm, so we are not sure whether it would work. Previously, we actually ignored the api_base configuration when api_type is openai. We will fix this later.

@zhangxu0307
Contributor

Fixed the api_base issue for the openai api_type. Please give it a try.

@ShilinHe added the "solution provided" label on Dec 4, 2023
@superuben
Author

TaskWeaver: I am TaskWeaver, an AI assistant. To get started, could you please enter your request?
Human: heya
Error: Cannot process your request due to Exception: 'NoneType' object is not subscriptable
Traceback (most recent call last):
File "/home/user/TaskWeaver/taskweaver/session/session.py", line 124, in send_message
post = _send_message(post.send_to, post)
File "/home/user/TaskWeaver/taskweaver/session/session.py", line 96, in _send_message
reply_post = self.planner.reply(
File "/home/user/TaskWeaver/taskweaver/planner/planner.py", line 179, in reply
llm_output = self.llm_api.chat_completion(chat_history, use_backup_engine=use_back_up_engine)["content"]
File "/home/user/TaskWeaver/taskweaver/llm/init.py", line 292, in chat_completion
oai_response = res.choices[0].message
TypeError: 'NoneType' object is not subscriptable

===========

litellm debug:
An error occurred: generate_text() got an unexpected keyword argument 'response_format'

Debug this by setting --debug, e.g. litellm --model gpt-3.5-turbo --debug

Custom Logger - final response object: None

@zhangxu0307
Contributor


The issue appears to be caused by an incompatibility between litellm and OpenAI's 'response_format' argument. The latest OpenAI API supports both 'json_object' and 'text' modes, but it is unclear whether litellm can adapt to this. If you are experiencing a similar problem, feel free to submit a new issue for tracking purposes.
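
As a rough illustration of the mismatch (not TaskWeaver's actual code; the support flag is a hypothetical knob the caller would have to provide): response_format is only safe to send when the backend behind the proxy understands OpenAI's JSON mode, otherwise it has to be dropped.

from openai import OpenAI

client = OpenAI(base_url="http://0.0.0.0:8000/v1", api_key="sk-placeholder")
backend_supports_json_mode = False  # hypothetical flag; False for the backend behind litellm here

kwargs = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "heya"}],
}
if backend_supports_json_mode:
    # OpenAI's JSON mode; this is the parameter the litellm backend rejected above.
    kwargs["response_format"] = {"type": "json_object"}
res = client.chat.completions.create(**kwargs)
print(res.choices[0].message.content)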

@ShilinHe changed the title from "cassette_mode or cassette_path is not set" to "Run with litellm" on Dec 7, 2023
@ShilinHe
Collaborator

ShilinHe commented Jan 2, 2024

LiteLLM is now supported in TaskWeaver; please follow the docs for more information.
Closing this issue as there has been no further activity.
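
(In short, based on this thread: run the litellm proxy, point llm.api_base at it with the openai api_type, set a non-empty llm.api_key so the Authorization header is valid, and make sure the backend tolerates the response_format parameter; the docs mentioned above have the authoritative configuration.)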

@ShilinHe closed this as completed on Jan 2, 2024