Fixes #3544 based on the data-type of message #3554

Merged: 3 commits into BerriAI:main on May 13, 2024

Conversation

paneru-rajan (Contributor)

Fixes the error when ollama_chat is used with langfuse.

The value of response_obj["choices"][0]["message"] can be either a Message object or a dict.

Added a conditional to call .json() only if it is a Message object.
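
A minimal sketch of the guard initially proposed here (the callback internals are paraphrased; the helper name message_payload and the surrounding structure are assumptions, not the actual langfuse callback code):

import litellm

def message_payload(response_obj):
    # 'response_obj' mirrors the object the langfuse callback receives.
    message = response_obj["choices"][0]["message"]
    # A pydantic Message exposes .json(); a plain dict does not,
    # so only call .json() when we actually have a Message object.
    if isinstance(message, litellm.Message):
        return message.json()
    return message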

Relevant issues

Fixes #3544

Type

🐛 Bug Fix

Testing

The error replicates with:

import litellm
from litellm import completion

litellm.success_callback = ["langfuse"]
resp = completion(
    model="ollama_chat/llama3:instruct",
    messages=[
        {"role": "user", "content": "Hello"},
    ],
    base_url="http://localhost:11434",
)

print(resp["choices"][0]["message"])

After the fix, it works for both ollama_chat/llama3:instruct and ollama/llama3:instruct.


paneru-rajan changed the title from "Based on the data-type using json" to "Fixes #3544 based on the data-type of message" on May 10, 2024.
krrishdholakia (Contributor) commented on May 10, 2024:

We should assume the response will be of type ModelResponse and its nested pydantic objects, e.g. Message (since that's the type we enforce on .completion()).

The correct fix here is to fix the ollama file to return the correct pydantic object.

@paneru-rajan let me know if you want to close this and open another PR, or rework the existing PR to fix the actual issue.
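
To illustrate the type contract being described (a rough sketch; litellm.Message is the pydantic type named later in this thread, and the failure mode is assumed from the bug report):

from litellm import Message

msg = Message(role="assistant", content="Hello")
print(msg.json())  # a pydantic Message serializes fine

raw = {"role": "assistant", "content": "Hello"}
raw.json()  # AttributeError: 'dict' object has no attribute 'json'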

paneru-rajan (Contributor, Author) replied, quoting the comment above:

Thank you. I've updated this PR with the proper fix; that earlier commit was just a patch, so it makes sense not to go that route.

paneru-rajan (Contributor, Author) commented on May 10, 2024:

Updated: Preserving the Pydantic Message Object

model_response["choices"][0]["message"] = response_json["message"]

This statement replaced the pydantic Message object, reinitializing it with a plain dict. We need to make sure the message is always a litellm.Message object.

As a fix, based on the code of the ollama.py file, I've updated just the content instead of the entire object, for both the sync and async functions.

Relevant issues

Fixes #3544

Type

🐛 Bug Fix

Testing

  • Checked completion with ollama_chat/llama3:instruct
  • Checked acompletion with ollama_chat/llama3:instruct

Looks good to me, handing over to you @krrishdholakia for review.
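
For reference, the async check mentioned above might look roughly like this (a sketch mirroring the sync repro snippet earlier in the PR, not the exact test that was run):

import asyncio
import litellm
from litellm import acompletion

async def main():
    resp = await acompletion(
        model="ollama_chat/llama3:instruct",
        messages=[{"role": "user", "content": "Hello"}],
        base_url="http://localhost:11434",
    )
    # After the fix the message should still be a pydantic Message object.
    assert isinstance(resp["choices"][0]["message"], litellm.Message)

asyncio.run(main())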

Viktor2k commented:

@paneru-rajan, do you mind assigning @krrishdholakia as the reviewer? Would be so nice if we could get this through 🤞🏼

@@ -300,7 +300,7 @@ def get_ollama_response(
         model_response["choices"][0]["message"] = message
         model_response["choices"][0]["finish_reason"] = "tool_calls"
     else:
-        model_response["choices"][0]["message"] = response_json["message"]
+        model_response["choices"][0]["message"]["content"] = response_json["message"]["content"]
krrishdholakia (Contributor) commented on the diff:

This doesn't fix the problem of the message being a dict object instead of a pydantic one.

@paneru-rajan can you share a test you wrote to confirm this fixes the problem, and that it's passing?

paneru-rajan (Contributor, Author) replied:

I tested: model_response["choices"][0]["message"] is a pydantic Message object.

Before, we were replacing it with a dict, which was the issue. Since I am now assigning only the content, which is a string, the existing Message object is not overridden.

[screenshot: debugger output showing model_response["choices"][0]["message"] as a pydantic Message object]

paneru-rajan (Contributor, Author) added:

The following screenshot shows that the assert does not raise an exception.

[screenshot: assertion passing]
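
An equivalent check in code might look like this (a sketch based on the repro snippet above; the exact assertion in the screenshot is not reproduced here):

import litellm
from litellm import completion

litellm.success_callback = ["langfuse"]
resp = completion(
    model="ollama_chat/llama3:instruct",
    messages=[{"role": "user", "content": "Hello"}],
    base_url="http://localhost:11434",
)
# The fix preserves the pydantic object, so this should not raise.
assert isinstance(resp["choices"][0]["message"], litellm.Message)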

paneru-rajan (Contributor, Author) replied, quoting @Viktor2k's request above:

It seems GitHub does not allow me to assign a reviewer. I tried adding @krrishdholakia as a reviewer and, since that failed, left a comment instead.

krrishdholakia merged commit e92f433 into BerriAI:main on May 13, 2024 (2 checks passed).
krrishdholakia (Contributor) commented:

LGTM! Thanks for sharing the screenshot @paneru-rajan, that helped.

Successfully merging this pull request may close these issues:

[Bug]: Langfuse integration not supporting Ollama Chat API (#3544)