Support for Watsonx.ai model for Langchain extension #874
Conversation
Disclaimer: Experimental PR review
PR Summary
This pull request adds support for the Watsonx.ai model in the Langfuse Python SDK, specifically for the Langchain extension. The changes span multiple files to integrate Watsonx.ai functionality into the existing codebase.
Key changes include:
- Added Watsonx.ai model parameter handling in langfuse/callback/langchain.py
- Extended model name extraction for Watsonx.ai in langfuse/extract_model.py (see the sketch after this list)
- Modified usage metric conversion in langfuse/utils/__init__.py to support the Watsonx.ai format
- Updated sampling logic in langfuse/Sampler.py to handle Watsonx.ai event structures
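
For context, here is a minimal sketch of the kind of model-name extraction the langfuse/extract_model.py bullet refers to. This is not the actual Langfuse implementation: the helper name and the nested "kwargs" layout are assumptions for illustration; model_id is the attribute that langchain-ibm's WatsonxLLM and ChatWatsonx classes expose for the model.

```python
from typing import Optional


def extract_watsonx_model_name(serialized: dict) -> Optional[str]:
    """Hypothetical helper: pull the Watsonx.ai model id out of a serialized
    Langchain LLM. The exact serialized layout is an assumption here."""
    kwargs = serialized.get("kwargs", {})
    # langchain-ibm's WatsonxLLM / ChatWatsonx carry the model as `model_id`,
    # e.g. "ibm/granite-13b-instruct-v2".
    return kwargs.get("model_id")
```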
Important considerations:
- Ensure comprehensive testing for the Watsonx.ai integration, including edge cases (a minimal test sketch follows this list)
- Verify that changes in Sampler.py don't affect existing functionality for other models
- Consider adding explicit type hints and docstring updates for new Watsonx.ai-related code
- Evaluate if any additional error handling or logging is needed for Watsonx.ai-specific scenarios
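
On the testing point, here is a self-contained, pytest-style sketch of an edge-case test for the usage conversion mentioned in the key changes. The field names input_token_count and generated_token_count, the helper convert_watsonx_usage, and the output dict shape are assumptions for illustration, not confirmed Langfuse internals or Watsonx.ai response fields.

```python
from typing import Optional


def convert_watsonx_usage(usage: Optional[dict]) -> Optional[dict]:
    """Hypothetical mapping of assumed Watsonx.ai token-count fields onto a
    generic input/output/total usage dict."""
    if not usage:
        return None
    input_tokens = usage.get("input_token_count", 0)
    output_tokens = usage.get("generated_token_count", 0)
    return {
        "input": input_tokens,
        "output": output_tokens,
        "total": input_tokens + output_tokens,
    }


def test_convert_watsonx_usage_handles_partial_payloads():
    # Edge cases: missing or partial usage data should not raise.
    assert convert_watsonx_usage(None) is None
    assert convert_watsonx_usage({"generated_token_count": 7}) == {
        "input": 0,
        "output": 7,
        "total": 7,
    }
```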
4 file(s) reviewed, no comment(s)
Changes to support Watsonx.ai (https://python.langchain.com/v0.2/docs/integrations/llms/ibm_watsonx/)
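
For reviewers who want to exercise the integration end to end, here is a minimal sketch of wiring the Watsonx.ai LLM from the linked docs into the Langfuse Langchain callback handler. The model_id, params, and project_id values are placeholders modeled on the Langchain docs; CallbackHandler is assumed to read Langfuse credentials from the standard LANGFUSE_* environment variables, and langchain-ibm expects the Watsonx API key in WATSONX_APIKEY.

```python
from langchain_ibm import WatsonxLLM  # pip install langchain-ibm
from langfuse.callback import CallbackHandler

# Placeholder Watsonx.ai configuration; langchain-ibm reads the API key
# from the WATSONX_APIKEY environment variable.
llm = WatsonxLLM(
    model_id="ibm/granite-13b-instruct-v2",
    url="https://us-south.ml.cloud.ibm.com",
    project_id="<your-watsonx-project-id>",
    params={"decoding_method": "greedy", "max_new_tokens": 100},
)

# Langfuse credentials come from LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY / LANGFUSE_HOST.
langfuse_handler = CallbackHandler()

# The handler traces the call, including the model name and token usage
# that this PR teaches the SDK to extract for Watsonx.ai.
print(llm.invoke("What is watsonx.ai?", config={"callbacks": [langfuse_handler]}))
```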