Feature: Add LlamaCppChatCompletionClient and llama-cpp #5326
base: main
Conversation
Codecov Report
Attention: Patch coverage is
Additional details and impacted files:

```
@@           Coverage Diff            @@
##             main    #5326    +/-  ##
=======================================
  Coverage   75.59%   75.59%
=======================================
  Files         189      191      +2
  Lines       12672    12850    +178
=======================================
+ Hits         9579     9714    +135
- Misses       3093     3136     +43
```
Will be working on this today.
…on and error handling; add unit tests for functionality
@ekzhu I completed the tests; please have another look.
@microsoft-github-policy-service agree company="Microsoft"
In the interest of a smaller change set, let's focus on create and raise NotImplementedError in create_stream.
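A minimal sketch of that suggestion, with hypothetical simplified signatures (the real ChatCompletionClient interface in autogen-core takes additional arguments such as tools and a cancellation token):

```python
from typing import Any, AsyncGenerator


class LlamaCppChatCompletionClient:  # illustrative skeleton, not the PR's exact code
    async def create(self, messages: Any, **kwargs: Any) -> Any:
        """Non-streaming completion path -- the focus of this PR."""
        ...

    def create_stream(self, messages: Any, **kwargs: Any) -> AsyncGenerator[Any, None]:
        # Defer streaming to a follow-up change; fail loudly instead of
        # returning a partially working generator.
        raise NotImplementedError("create_stream is not yet supported by the llama-cpp client")
```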
… improve type hints in tests
Thanks! I think there is more work needed for this PR to be ready.
…pChatCompletionClient initialization and create methods
…agent context handling
…y; add llama_cpp model to documentation
…tor LlamaCppChatCompletionClient initialization
… test_llama_cpp_model_client
This pull request introduces the integration of the llama-cpp library into the autogen-ext package, with significant changes to the project dependencies and the implementation of a new chat completion client. The most important changes include updating the project dependencies, adding a new module for the LlamaCppChatCompletionClient, and implementing the client with various functionalities.

Project Dependencies:
python/packages/autogen-ext/pyproject.toml: Added llama-cpp-python as a new dependency under the llama-cpp section.

New Module:
python/packages/autogen-ext/src/autogen_ext/models/llama_cpp/__init__.py: Introduced the LlamaCppChatCompletionClient class and handled import errors with a descriptive message for missing dependencies, roughly as sketched below.
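A common shape for that optional-dependency guard, assuming the extra is named llama-cpp as described above (the exact error message wording is illustrative):

```python
# python/packages/autogen-ext/src/autogen_ext/models/llama_cpp/__init__.py (sketch)
try:
    from ._llama_cpp_completion_client import LlamaCppChatCompletionClient
except ImportError as e:
    # Re-raise with an actionable hint instead of a bare module-not-found error.
    raise ImportError(
        "Dependencies for llama-cpp are missing. "
        'Install them with: pip install "autogen-ext[llama-cpp]"'
    ) from e

__all__ = ["LlamaCppChatCompletionClient"]
```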
Implementation of LlamaCppChatCompletionClient:
python/packages/autogen-ext/src/autogen_ext/models/llama_cpp/_llama_cpp_completion_client.py: Implemented the LlamaCppChatCompletionClient class with methods to initialize the client, create chat completions, detect and execute tools, and handle streaming responses. A usage sketch appears after the checklist below.

Why are these changes needed?
Related issue number
Checks
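For reviewers, a minimal usage sketch of the new client. This is a hypothetical example: the constructor is assumed to forward keyword arguments such as model_path to llama-cpp-python's Llama, the model file path is a placeholder, and the message and result types follow autogen-core's model-client conventions.

```python
import asyncio

from autogen_core.models import UserMessage
from autogen_ext.models.llama_cpp import LlamaCppChatCompletionClient


async def main() -> None:
    # model_path is assumed to be passed through to the underlying Llama
    # constructor; replace the placeholder with a real GGUF model file.
    client = LlamaCppChatCompletionClient(model_path="path/to/model.gguf")
    result = await client.create([UserMessage(content="What is 2 + 2?", source="user")])
    print(result.content)


asyncio.run(main())
```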