[Feature]: DAY 1 SUPPORT Add Claude-3 #2314
Comments
We'll need to add the Messages endpoint for Claude 3 — it fails on their completion endpoint with: Error calling model: AnthropicException - {"type":"error","error":{"type":"invalid_request_error","message":"\"claude-3-opus-20240229\" is not supported on this API. Please use the Messages API instead."}}
I don't think the legacy text completion endpoint supports Claude 3. We should move to using the Anthropic 2.1 endpoint - #1209
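For reference, the payload shapes of the two endpoints differ. A minimal sketch of the difference — the model names and token limits are illustrative; the endpoint paths and field names follow Anthropic's documentation:

```python
# Sketch: legacy Text Completions vs. the newer Messages endpoint that
# Claude 3 requires. These helpers only build the request bodies.

def legacy_completion_payload(prompt: str) -> dict:
    # POST https://api.anthropic.com/v1/complete
    # Takes a single flat prompt string with Human:/Assistant: markers.
    return {
        "model": "claude-2.1",
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": 256,
    }

def messages_payload(prompt: str) -> dict:
    # POST https://api.anthropic.com/v1/messages
    # Takes a structured list of role/content messages instead.
    return {
        "model": "claude-3-opus-20240229",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": prompt}],
    }
```

So litellm has to translate its message list into the second shape rather than flattening it into a prompt string.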
Action Items:
- PR to add initial support: #2315
- Model pricing info here for reference: https://docs.anthropic.com/claude/docs/models-overview#meet-claude-3
- I'll take care of the image + tool calling cost tracking in my PR
Notes about the /messages endpoint
For the v0 PR we're not addressing no. 3
Sorry if this is a silly question, but does day 1 support mean you guys are hoping to get this done today? Just wondering if I should attempt to hack something myself.
@ashot it will be live in 1-2 hours. The PR for v0 support is here: https://github.com/BerriAI/litellm/pull/2315/files
Love it, thank you!
Wow, great! Waiting to test Claude-3 in my code interpreter.
Working v0 PR is live here: 14fc835
@ishaan-jaff is this ready for me to stack the tool calling + image calling PRs on this?
Yes @krrishdholakia
@ishaan-jaff looks like there's a separate param for system prompt |
V1 needs to add:
Oh, I missed this. Will your v1 PR cover it, @krrishdholakia? Otherwise I can make the fix.
Yeah, I'm fixing it for tool calling.
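For context on the separate system-prompt parameter: Anthropic's Messages API takes the system prompt as a top-level `system` field rather than as a `system`-role entry in the message list, so an OpenAI-style conversation has to be split. A hedged sketch of that translation — the helper name is made up for illustration and is not litellm's actual code:

```python
# Split OpenAI-style messages into (system_prompt, chat_messages) as the
# Anthropic Messages API expects. Illustrative helper, not litellm internals.

def split_system_prompt(messages: list) -> tuple:
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    chat = [m for m in messages if m["role"] != "system"]
    # Multiple system messages are joined; Anthropic takes one system string.
    return "\n".join(system_parts), chat

system, chat = split_system_prompt([
    {"role": "system", "content": "You are terse."},
    {"role": "user", "content": "Hi"},
])
# `system` goes in the top-level "system" field; `chat` in "messages".
```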
FYI Anthropic deprecated the following models on
Since we moved everything to /messages, this means litellm will not support
Yup. Let's put a notice on the latest release about this and see if users want us to maintain Claude-instant-1 and Claude-2.
@ishaan-jaff thanks! There seems to be a bug with the system prompt enabled + multiple conversations.
It looks like it has some bugs.
I got this error while testing this API. Python code:

```python
import os
from litellm import completion
from dotenv import load_dotenv

# load .env file
def init_env():
    load_dotenv()
    api_key = os.getenv("ANTHROPIC_API_KEY")
    # set env - [OPTIONAL] replace with your anthropic key
    os.environ["ANTHROPIC_API_KEY"] = api_key

if __name__ == "__main__":
    init_env()
    try:
        model = "claude-3-sonnet-20240229"
        messages = [{"role": "user", "content": "Hey! how's it going?"}]
        response = completion(model=model, messages=messages)
        print(response)
    except Exception as exception:
        import traceback as tb
        print(tb.format_exc())
        print(exception)
```
Python code:

```python
import os
from litellm import completion
from dotenv import load_dotenv

# load .env file
def init_env():
    load_dotenv()
    api_key = os.getenv("ANTHROPIC_API_KEY")
    if not api_key:
        raise Exception("Missing ANTHROPIC_API_KEY in .env file")

if __name__ == "__main__":
    init_env()
    try:
        model = "claude-3-sonnet-20240229",  # note: this trailing comma binds a tuple, not a string
        prompt = "Write code to find prime numbers in a given range."
        messages = [
            {
                "role": "user",
                "content": [
                    {
                        "type": "text",
                        "text": prompt
                    }
                ]
            }
        ]
        response = completion(model=model, messages=messages)
        print(response)
    except Exception as exception:
        print(f"Exception occurred: {exception}")
        import traceback
        print(traceback.format_exc())
```

Now I tested and got more errors from LiteLLM:

```
(heaven-env)% python claude_demo.py
/opt/homebrew/Caskroom/miniforge/base/envs/heaven-env/lib/python3.11/site-packages/pydantic/_internal/_fields.py:149: UserWarning: Field "model_max_budget" has conflict with protected namespace "model_".
You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True`.
Exception occurred: GetLLMProvider Exception - 'tuple' object has no attribute 'split'
original model: ('claude-3-sonnet-20240229',)
Traceback (most recent call last):
  File "/opt/homebrew/Caskroom/miniforge/base/envs/heaven-env/lib/python3.11/site-packages/litellm/utils.py", line 4981, in get_llm_provider
    model.split("/", 1)[0] in litellm.provider_list
    ^^^^^^^^^^^
AttributeError: 'tuple' object has no attribute 'split'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/homebrew/Caskroom/miniforge/base/envs/heaven-env/lib/python3.11/site-packages/litellm/main.py", line 601, in completion
    model, custom_llm_provider, dynamic_api_key, api_base = get_llm_provider(
                                                            ^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Caskroom/miniforge/base/envs/heaven-env/lib/python3.11/site-packages/litellm/utils.py", line 5148, in get_llm_provider
    raise litellm.exceptions.BadRequestError(  # type: ignore
litellm.exceptions.BadRequestError: GetLLMProvider Exception - 'tuple' object has no attribute 'split'
original model: ('claude-3-sonnet-20240229',)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/haseeb-mir/Documents/Code/Python/code-interpreter/claude_demo.py", line 31, in <module>
    response = completion(model=model, messages=messages)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Caskroom/miniforge/base/envs/heaven-env/lib/python3.11/site-packages/litellm/utils.py", line 2621, in wrapper
    raise e
  File "/opt/homebrew/Caskroom/miniforge/base/envs/heaven-env/lib/python3.11/site-packages/litellm/utils.py", line 2524, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Caskroom/miniforge/base/envs/heaven-env/lib/python3.11/site-packages/litellm/main.py", line 1908, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/opt/homebrew/Caskroom/miniforge/base/envs/heaven-env/lib/python3.11/site-packages/litellm/utils.py", line 7767, in exception_type
    raise e
  File "/opt/homebrew/Caskroom/miniforge/base/envs/heaven-env/lib/python3.11/site-packages/litellm/utils.py", line 7735, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: GetLLMProvider Exception - 'tuple' object has no attribute 'split'
original model: ('claude-3-sonnet-20240229',)
```
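The root cause is visible in the traceback's `original model: ('claude-3-sonnet-20240229',)`: the trailing comma in `model="claude-3-sonnet-20240229",` binds a one-element tuple, not a string, so litellm's `model.split("/", 1)` fails. A minimal reproduction:

```python
# A trailing comma at statement level creates a 1-tuple in Python.
model = "claude-3-sonnet-20240229",   # tuple -> .split() raises AttributeError
assert isinstance(model, tuple)

model = "claude-3-sonnet-20240229"    # string -> .split() works as litellm expects
assert model.split("/", 1)[0] == "claude-3-sonnet-20240229"
```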
@haseeb-heaven have you pip installed the latest version?
Yes. |
Code now working fine:

```python
import os
from litellm import completion
from dotenv import load_dotenv
import litellm

# load .env file
def init_env():
    load_dotenv()
    api_key = os.getenv("ANTHROPIC_API_KEY")
    if not api_key:
        raise Exception("Missing ANTHROPIC_API_KEY in .env file")

if __name__ == "__main__":
    init_env()
    litellm.set_verbose = True
    try:
        model = "claude-3-sonnet-20240229"
        prompt = "Write code to find prime numbers in a given range."
        messages = [
            {
                "role": "user",
                "content": [
                    {
                        "type": "text",
                        "text": prompt
                    }
                ]
            }
        ]
        response = completion(model=model, messages=messages)
        print(response)
    except Exception as exception:
        print(f"Exception occurred: {exception}")
```
If you try it with a system prompt and add a few messages, it might fail?
You guys are just awesome. I can't believe support for a model with vision and function calling, plus a different provider (AWS), was all done in 12 hours.
Does this already work on the proxy side, or is it just me that gets stuck? |
@SamArgillander Bedrock just got merged and should be out soon. The Anthropic API is already live in https://github.com/BerriAI/litellm/releases/tag/v1.29.1. What's the error?
Also getting errors on Bedrock. It seems LiteLLM needs to be updated to call the new messages API.
Try this code format for the Messages API:

```python
if __name__ == "__main__":
    try:
        model = "claude-3-sonnet-20240229"
        prompt = "Write code to find prime numbers in a given range."
        messages = [
            {
                "role": "user",
                "content": [
                    {
                        "type": "text",
                        "text": prompt
                    }
                ]
            }
        ]
        response = completion(model=model, messages=messages)
        print(response)
    except Exception as exception:
        print(f"Exception occurred: {exception}")
```
Not sure if this is the correct location to post, but when using this with code interpreter on the latest version of litellm, I'm receiving: {"type":"error","error":{"type":"invalid_request_error","message":"messages: all messages must have non-empty content except for ... The interpreter works for the first response, but once it executes the code I get the message about blank messages being sent.
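One possible client-side workaround for the non-empty-content error is to filter out empty messages before calling `completion`. This is a sketch under the assumption that the empty messages carry no information the model needs; the helper name is illustrative:

```python
# Drop messages whose content the Anthropic Messages API would reject as empty.
# Whether this is the right fix depends on why the client emits empty messages.

def drop_empty_messages(messages: list) -> list:
    return [m for m in messages if m.get("content") not in (None, "", [])]

cleaned = drop_empty_messages([
    {"role": "user", "content": "what time is it"},
    {"role": "assistant", "content": ""},  # would trigger invalid_request_error
])
```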
I am having a lot of issues as well.
The tool_calls from the OpenAI API do not work here.
I guess it should be handled using the function_results tag, as shown in the notebook https://github.com/anthropics/anthropic-cookbook/blob/main/function_calling/function_calling.ipynb:
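A hedged sketch of the idea from the cookbook linked above: at Claude-3 launch, Anthropic's function-calling recipe wrapped tool outputs in XML tags rather than using an OpenAI-style "tool" role message. The tag names follow the cookbook; the translation helper itself is illustrative, not litellm's actual code:

```python
# Wrap an OpenAI-style tool result in the XML shape Anthropic's
# function-calling cookbook expects. Helper name is made up.

def tool_result_to_function_results(tool_name: str, result: str) -> str:
    return (
        "<function_results>\n"
        "<result>\n"
        f"<tool_name>{tool_name}</tool_name>\n"
        f"<stdout>\n{result}\n</stdout>\n"
        "</result>\n"
        "</function_results>"
    )

xml = tool_result_to_function_results("get_current_weather", "72F, sunny")
# The XML string is then appended to the conversation as user content.
```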
@themrzmaster can you confirm which version of litellm you're on? This works for me on the latest one:

```python
def test_bedrock_claude_3_tool_calling():
    try:
        litellm.set_verbose = True
        tools = [
            {
                "type": "function",
                "function": {
                    "name": "get_current_weather",
                    "description": "Get the current weather in a given location",
                    "parameters": {
                        "type": "object",
                        "properties": {
                            "location": {
                                "type": "string",
                                "description": "The city and state, e.g. San Francisco, CA",
                            },
                            "unit": {
                                "type": "string",
                                "enum": ["celsius", "fahrenheit"],
                            },
                        },
                        "required": ["location"],
                    },
                },
            }
        ]
        messages = [
            {"role": "user", "content": "What's the weather like in Boston today?"}
        ]
        response: ModelResponse = completion(
            model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
            messages=messages,
            tools=tools,
            tool_choice="auto",
        )
        print(f"response: {response}")
        # Add any assertions here to check the response
        assert isinstance(response.choices[0].message.tool_calls[0].function.name, str)
        assert isinstance(
            response.choices[0].message.tool_calls[0].function.arguments, str
        )
    except RateLimitError:
        pass
    except Exception as e:
        pytest.fail(f"Error occurred: {e}")
```
hey @themrzmaster @quan2005 @shaggy2626 noting the issues:
I primarily use LiteLLM as a Proxy for LibreChat. I'm now getting responses from Bedrock for Claude 3, but it seems there's an error with the proxy.
Please ignore, it seems fine now. Might have been an Anthropic API issue. |
I'm not sure how to answer that, as I'm only using it in Open Interpreter. I included the output from the terminal for your review. My prompt was "what time is it". I also did see something about my API key being invalid; however, the key is valid, and I confirmed by looking at the usage in the Claude portal.
Closing, as the work for this ticket is complete.