A few bug fixes following OpenAI client update #178

Merged

Conversation

igiloh-pinecone
Collaborator

Problem

After OpenAI client was updated (#171), a few problems arose:

  • Running the `canopy` CLI command (without a subcommand) always errored out
  • `OpenAILLM.available_models` was broken

Solution

There was a minor bug in the CLI that still used the old `openai` syntax.
But there was a deeper issue: since we'll soon be adding support for more LLMs, it doesn't make sense for the CLI to verify the OpenAI connection at the top level. Changes:

  • Each CLI sub-command now verifies its own dependencies (e.g. the connection to Pinecone).
  • The connection to OpenAI is only checked if an OpenAI component is present in the configuration.
  • I added a dedicated config verification step to `canopy start`, before running the server itself. This allows the CLI to catch the actual error and present it to the user in a controlled manner (instead of the error being raised inside the Uvicorn process itself, where we can't catch it).
  • Fixed the bug in `OpenAILLM.available_models` and added a unit test.
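For illustration, the "only check OpenAI if it's configured" idea could be sketched roughly like this. This is a minimal sketch, not canopy's actual code; the function name and the config shape are assumptions:

```python
from typing import Any


def config_uses_openai(config: Any) -> bool:
    """Return True if any string value anywhere in the (nested) config
    mentions an OpenAI component, e.g. {"llm": {"type": "OpenAILLM"}}.

    Hypothetical helper: the real canopy config loading may differ.
    """
    if isinstance(config, dict):
        return any(config_uses_openai(v) for v in config.values())
    if isinstance(config, list):
        return any(config_uses_openai(v) for v in config)
    return isinstance(config, str) and "OpenAI" in config
```

A CLI entry point could then call the OpenAI connectivity check only when this predicate returns True, while always verifying the Pinecone connection.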

Type of Change

  • Bug fix (non-breaking change which fixes an issue)

Test Plan

  • Added the missing test case to `OpenAILLM`
  • We're still missing a proper CLI test, which we should add ASAP
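A test for `available_models` could be sketched along these lines. In the v1 OpenAI Python client, models are listed via `client.models.list()` and each item carries an `.id` attribute; everything else here (the standalone function, the mock shape) is a hypothetical stand-in for the real method and test:

```python
from types import SimpleNamespace
from unittest.mock import MagicMock


def available_models(client) -> list:
    """Sketch of the fixed logic: list model ids via the v1-style client."""
    return [m.id for m in client.models.list()]


def test_available_models():
    # Mock the v1 client so the test needs no API key or network access.
    client = MagicMock()
    client.models.list.return_value = [
        SimpleNamespace(id="gpt-3.5-turbo"),
        SimpleNamespace(id="gpt-4"),
    ]
    assert available_models(client) == ["gpt-3.5-turbo", "gpt-4"]


test_available_models()
```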

We had a call to `openai.Models` to check the OpenAI connection, but the client has changed
- Only verify OpenAI connection if OpenAI is used in configuration
- Verify configuration before starting the server, to give a clear error message
- Don't error out on the main `canopy` command (always show the intro \ help message)
Did not update to use the new OpenAI client
It was missing a test, apparently
You look away for one second and Copilot is wreaking havoc on your error messages...
Meant to add it before and forgot

```python
def _load_kb_config(config_file: Optional[str]) -> Dict[str, Any]:
    config = _read_config_file(config_file)
    if not config:
```
Contributor


Isn't it a duplicate?

Collaborator Author


Nope, `config` can come back empty (`{}`) from `_read_config_file()`.
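A minimal illustration of why the emptiness check is not redundant. The loader below is a hypothetical stand-in for `_read_config_file()`, which may differ; the point is just that an empty file can yield `{}`, which `if not config:` catches:

```python
import io


def read_config_stub(f) -> dict:
    # Hypothetical stand-in: an empty or whitespace-only config
    # file yields {} instead of raising an error.
    text = f.read().strip()
    return {"raw": text} if text else {}


config = read_config_stub(io.StringIO(""))
assert config == {}
assert not config  # {} is falsy, so the `if not config:` guard triggers
```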

```diff
@@ -132,6 +136,22 @@ def _load_kb_config(config_file: Optional[str]) -> Dict[str, Any]:
     return kb_config
```


```python
def _validate_chat_engine(config_file: Optional[str]):
    config = _read_config_file(config_file)
    Tokenizer.initialize()
```
Contributor


We should have a `with tokenizer...` context manager.
A nice feature
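The suggestion could look roughly like this. The `Tokenizer` class here is a toy stand-in with an assumed API (canopy's actual singleton tokenizer may differ); the sketch only shows the shape of the proposed context manager:

```python
from contextlib import contextmanager


class Tokenizer:
    """Toy stand-in for canopy's singleton Tokenizer (assumed API)."""
    _initialized = False

    @classmethod
    def initialize(cls):
        cls._initialized = True

    @classmethod
    def clear(cls):
        cls._initialized = False


@contextmanager
def tokenizer():
    # "with tokenizer():" as suggested: initialize on enter, clear on exit,
    # even if the body raises.
    Tokenizer.initialize()
    try:
        yield Tokenizer
    finally:
        Tokenizer.clear()


with tokenizer():
    assert Tokenizer._initialized
assert not Tokenizer._initialized
```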

Collaborator Author


Yeah, I thought about it.
Not worth the effort right now...

Contributor


Just a task, I will add it


@miararoy miararoy left a comment


LGTM

@igiloh-pinecone igiloh-pinecone added this pull request to the merge queue Nov 16, 2023
@github-merge-queue github-merge-queue bot removed this pull request from the merge queue due to failed status checks Nov 16, 2023
@igiloh-pinecone igiloh-pinecone added this pull request to the merge queue Nov 16, 2023
auto-merge was automatically disabled November 16, 2023 14:22

Merge queue setting changed

@igiloh-pinecone igiloh-pinecone added this pull request to the merge queue Nov 16, 2023
Merged via the queue into pinecone-io:main with commit 0698ff2 Nov 16, 2023
10 checks passed
@igiloh-pinecone igiloh-pinecone deleted the bugfix/cli_openai_error branch November 16, 2023 14:50