A few bug fixes following OpenAI client update #178
Conversation
We had a call to `openai.Models` to check the OpenAI connection, but the client has changed (see the sketch below).
- Only verify the OpenAI connection if OpenAI is used in the configuration
- Verify the configuration before starting the server, to give a clear error message
- Don't error out on the main `canopy` command (always show the intro / help message)
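For context, the module-level `openai.Model.list()` call was removed in the v1 Python client; a connectivity check now has to go through a client instance. A minimal sketch of such a check (the helper name `_check_openai_connection` is hypothetical, not code from this PR):

```python
from openai import OpenAI

def _check_openai_connection() -> None:
    # Hypothetical helper. The v1 client replaces the removed
    # module-level `openai.Model.list()` with an instance method.
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    try:
        client.models.list()  # cheap authenticated call to verify connectivity
    except Exception as e:
        raise RuntimeError(f"Failed to connect to OpenAI: {e}") from e
```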
Did not update to use the new OpenAI client
It was missing a test, apparently
You look away for one sec and Co-Pilot is wreaking havoc on your error messages...
Meant to add it before and forgot
```python
def _load_kb_config(config_file: Optional[str]) -> Dict[str, Any]:
    config = _read_config_file(config_file)
    if not config:
```
Isn't it a duplicate?
Nope, `config` can return empty (`{}`) from `_read_config_file()`.
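In other words, the falsy check guards a real branch. A simplified paraphrase of the behavior (not the actual `_read_config_file` implementation):

```python
from typing import Any, Dict, Optional

import yaml

def _read_config_file(config_file: Optional[str]) -> Dict[str, Any]:
    # Simplified paraphrase: with no config file, an empty dict comes
    # back, which is falsy, so the `if not config:` check above is not
    # a duplicate of a None check.
    if config_file is None:
        return {}
    with open(config_file) as f:
        return yaml.safe_load(f) or {}
```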
```diff
@@ -132,6 +136,22 @@ def _load_kb_config(config_file: Optional[str]) -> Dict[str, Any]:
     return kb_config


 def _validate_chat_engine(config_file: Optional[str]):
     config = _read_config_file(config_file)
     Tokenizer.initialize()
```
We should have a `with tokenizer: ...` context manager here. A nice feature.
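Something along these lines, as a hypothetical sketch (no such context manager exists in the codebase yet; the `clear()` cleanup and the function name are illustrative):

```python
from contextlib import contextmanager

from canopy.tokenizer import Tokenizer

@contextmanager
def tokenizer():
    # Hypothetical: initialize the global Tokenizer on entry and
    # clear it on exit, so callers can't forget the cleanup.
    Tokenizer.initialize()
    try:
        yield Tokenizer()
    finally:
        Tokenizer.clear()

# Usage (hypothetical):
# with tokenizer():
#     ...  # code that needs the tokenizer initialized
```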
Yeah, thought about it.
It isn't worth the effort right now...
Just a task, I will add one
Lg
Problem

After the OpenAI client was updated (#171), a few problems arose:
- The `canopy` CLI command (without a subcommand) always errored out
- `OpenAILLM.available_models` was broken

Solution
There was a minor bug in the CLI that still used the old `openai` syntax. But there was a deeper issue: since we'll be adding support for more LLMs soon, it doesn't make sense for the CLI to verify the OpenAI connection at the top level. Changes:
- Only check the OpenAI connection if an `OpenAI` component is in the configuration.
- Verify the configuration in `canopy start`, before running the server itself. This allows the CLI to catch the actual error and present it to the user in a controlled manner (instead of the error being raised inside the `UVicorn` process itself, which we can't catch). A sketch of this flow follows the list.
- Fix a bug in `OpenAILLM.available_models` and add a UT.
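A minimal sketch of the intended start-up flow, assuming a Click-based CLI (the option name, error handling, and server module path are illustrative, not the PR's exact code):

```python
import click
import uvicorn

@click.command()
@click.option("--config", default=None, help="Path to a Canopy config file")
def start(config):
    # Validate the configuration up front, so failures surface here in
    # the CLI process rather than inside the uvicorn worker, where we
    # can't catch them.
    try:
        _validate_chat_engine(config)
    except Exception as e:
        raise click.ClickException(f"Invalid configuration: {e}") from e
    # Only after validation passes do we hand off to the server.
    uvicorn.run("canopy_server.app:app", host="0.0.0.0", port=8000)
```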
Type of Change
- Bug fix (non-breaking change which fixes an issue)
Test Plan
Added a UT for `OpenAILLM`
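A hedged sketch of what such a unit test might look like; the internals mocked here (the `_client` attribute and the shape of the page returned by `models.list()`) are assumptions about `OpenAILLM`, not confirmed details:

```python
from unittest.mock import MagicMock

from canopy.llm import OpenAILLM

def test_available_models(monkeypatch):
    # Avoid needing a real key just to construct the client.
    monkeypatch.setenv("OPENAI_API_KEY", "test-key")
    llm = OpenAILLM()

    # Assumption: OpenAILLM keeps its v1 client on `_client`, and
    # `available_models` reads model ids off the page returned by
    # `client.models.list()`.
    page = MagicMock()
    page.data = [MagicMock(id="gpt-3.5-turbo")]
    llm._client = MagicMock()
    llm._client.models.list.return_value = page

    assert "gpt-3.5-turbo" in llm.available_models
```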