Switch from OpenAI SDK to LiteLLM for Model Completions #3
In response to the need for a more flexible and inclusive approach to interacting with various language models, this PR introduces a significant shift from using the OpenAI SDK to employing LiteLLM. LiteLLM offers a common interface that supports a wide range of models, making it easier for users to interact with different language models without being restricted to a specific provider.
Key Changes:

- The completion logic in the `plugin/chatvim.py` file now uses LiteLLM's `completion` method to obtain model completions.
- Models must now be specified by their full name (e.g. `gpt-3.5-turbo`) instead of a shorthand version. This change, implemented in the `_get_chat_history` method of `plugin/chatvim.py`, allows for greater flexibility and supports the diverse naming schemes of various language models.
- The `Readme.md` and `doc/chatvim.txt` files have been updated to reflect the new way of specifying models and the switch to LiteLLM, giving users clear instructions for the updated system.
- The `requirements.txt` file has been modified to replace the OpenAI SDK dependency with LiteLLM, ensuring that the necessary packages are installed for the plugin to function correctly.

These changes were made to enhance the plugin's flexibility and usability, allowing users to easily switch between different language models and take advantage of the unique features each model offers. By adopting LiteLLM, we are also preparing the plugin for future expansions and integrations with an ever-growing list of language models.
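To illustrate the new call shape: the sketch below shows roughly how a chat history can be turned into messages and passed to LiteLLM. `litellm.completion` is LiteLLM's real entry point, but `build_messages` and `get_completion` are hypothetical helper names for illustration, not the plugin's actual code, and the `>`-prefix convention for user turns is an assumption.

```python
def build_messages(lines):
    """Turn buffer lines into OpenAI-style chat messages.
    Illustrative convention: lines starting with '>' are user turns,
    non-empty remaining lines are prior assistant output."""
    messages = []
    for line in lines:
        if line.startswith(">"):
            messages.append({"role": "user", "content": line.lstrip("> ")})
        elif line.strip():
            messages.append({"role": "assistant", "content": line})
    return messages


def get_completion(model, lines):
    """Send the chat history to any LiteLLM-supported model.

    LiteLLM exposes one call shape across providers; the model is
    given by its full name, e.g. "gpt-3.5-turbo".
    """
    from litellm import completion  # pip install litellm

    response = completion(model=model, messages=build_messages(lines))
    return response.choices[0].message.content
```

Because LiteLLM routes on the full model name, switching providers is just a matter of changing that one string; no other call-site changes are needed.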
Closes issue #2