
Use SecretStorage #5

Merged · 10 commits · Jun 12, 2023

Conversation

@nvms (Owner) commented Jun 11, 2023

RE: #4

One thing to consider: when using LLaMA for generation, which goes through the OpenAI API, the user will be prompted for an OpenAI API key that they may not have. Still undecided on how best to handle this.

Do we need to expose a way via the UI to destroy these secrets?
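For reference, VS Code's SecretStorage does expose a delete method, so a UI hook for this would be a thin wrapper. A minimal sketch (the helper name and key argument are illustrative):

import * as vscode from "vscode";

// Removes a stored API key entirely rather than overwriting it.
async function deleteApiKey(context: vscode.ExtensionContext, provider: string) {
  await context.secrets.delete(provider);
}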

@nvms marked this pull request as draft on June 11, 2023 15:16
@capdevc (Contributor) commented Jun 11, 2023

I think there should be a command the user can trigger manually to set API tokens, maybe with a dropdown for provider selection. With that, I don't think deletion is strictly necessary: you could just overwrite the secret with an empty string or whatever you like.
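A minimal sketch of such a command, assuming a VS Code extension context (the command id, provider list, and secret key names are illustrative):

import * as vscode from "vscode";

export function registerSetApiKeyCommand(context: vscode.ExtensionContext) {
  context.subscriptions.push(
    vscode.commands.registerCommand("wingman.setApiKey", async () => {
      // Let the user pick which provider's key to set.
      const provider = await vscode.window.showQuickPick(["openai", "anthropic"], {
        placeHolder: "Select a provider",
      });
      if (!provider) return;

      // Masked input so the key isn't echoed to the screen.
      const key = await vscode.window.showInputBox({
        prompt: `API key for ${provider}`,
        password: true,
      });
      if (key === undefined) return;

      // Storing an empty string effectively clears the old key.
      await context.secrets.store(provider, key);
    })
  );
}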

Re: locally hosted, you could only use the API key when the apiUrl isn't the default one. For Anthropic, the URL is hard-coded in the SDK. I think the way you have it, the key is only prompted for at time of use, so that would solve that problem. Is there a scenario where you want an OpenAI API key to use with a non-OpenAI API endpoint?

You could also decouple the API from the provider:

"wingman.providers": [
  {  
    "name": "openai",
    "apiType": "openai",
    "apiKeySecret": "openai",
  },
  {  
    "name": "localLLaMa",
    "apiType": "openai",
    "apiUrl": "https://localhost/...",
  },
  {  
    "name": "anthropic",
    "apiType": "anthropic",
    "apiKeySecret": "anthropic",
  },
]

Here the provider is looked up by name (given in the command definition); the optional apiUrl overrides the default URL for that API; apiKeySecret is the name of the secret for that provider; and if no secret is named, no API key is needed.

This may be a little complicated, but defaults could be added for anthropic and openai, so you'd only need to interact with it to add a non-standard provider for a given API. Presumably, anyone who wants to do that could figure it out.
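A rough sketch of the lookup that config implies (the interface and helper names are illustrative, not part of the extension):

import * as vscode from "vscode";

interface ProviderConfig {
  name: string;
  apiType: "openai" | "anthropic";
  apiUrl?: string;       // optional override of the API's default URL
  apiKeySecret?: string; // name of the secret; omitted means no key needed
}

async function resolveProvider(
  name: string,
  secrets: vscode.SecretStorage
): Promise<{ config: ProviderConfig; apiKey?: string }> {
  const providers =
    vscode.workspace.getConfiguration("wingman").get<ProviderConfig[]>("providers") ?? [];
  const config = providers.find((p) => p.name === name);
  if (!config) throw new Error(`Unknown provider: ${name}`);

  // Only fetch a key if the config names a secret.
  const apiKey = config.apiKeySecret ? await secrets.get(config.apiKeySecret) : undefined;
  return { config, apiKey };
}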

@nvms (Owner, Author) commented Jun 12, 2023

I think there should be a command the user can trigger manually to set API tokens, maybe with a dropdown for provider selection. With that, I don't think deletion is strictly necessary: you could just overwrite the secret with an empty string or whatever you like.

I like this idea a lot and implemented it here: 7b6160e

Is there a scenario where you want an OpenAI API key to use with a non-OpenAI API endpoint?

No, I don't think so.

A thought: when you have an invalid OpenAI API key, you will see this warning:

[Screenshot: warning shown for an invalid OpenAI API key]

Since creating a ChatGPTAPI requires an API key to be provided (https://github.com/transitive-bullshit/chatgpt-api/blob/main/src/chatgpt-api.ts#L102-L104), I wonder if it'd be reasonable to just assign a default value of, e.g., "llama". This means the local LLaMA solution would "just work" without any extra configuration (other than changing wingman.openai.apiBaseUrl), and a person trying to use ChatGPT would see the above warning and be prompted to set or change their key.
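A sketch of that fallback, using the chatgpt package's apiKey and apiBaseUrl options (the helper name and the "llama" placeholder are assumptions):

import * as vscode from "vscode";
import { ChatGPTAPI } from "chatgpt";

async function createApi(context: vscode.ExtensionContext): Promise<ChatGPTAPI> {
  // The constructor throws if apiKey is empty, so fall back to a
  // placeholder; a local LLaMA server behind an OpenAI-compatible
  // endpoint will simply ignore it.
  const apiKey = (await context.secrets.get("openai")) || "llama";
  return new ChatGPTAPI({
    apiKey,
    apiBaseUrl: vscode.workspace
      .getConfiguration("wingman")
      .get<string>("openai.apiBaseUrl"),
  });
}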

But...

Decoupling the provider from the API style is the cleaner solution, though, and I'll likely take this PR in that direction. Another benefit of decoupling is that it allows for defining other third-party solutions that share the API style of either OpenAI or Anthropic.

@capdevc (Contributor) commented Jun 12, 2023

I'd suggest a FAQ entry or similar for "How do I delete my token?", since overwriting it may not be obvious to a user.

The Anthropic API also has error codes for bad tokens and the like (https://docs.anthropic.com/claude/reference/errors-and-rate-limits), so you could prompt on that specific response as well.
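For example, the 401 status Anthropic documents for an invalid or missing key could trigger the set-key flow. A sketch (the command id matches the hypothetical one above):

import * as vscode from "vscode";

// Anthropic documents HTTP 401 for an invalid or missing API key.
async function checkAnthropicAuth(res: Response): Promise<void> {
  if (res.status === 401) {
    const choice = await vscode.window.showWarningMessage(
      "Anthropic rejected the stored API key.",
      "Set API key"
    );
    if (choice === "Set API key") {
      await vscode.commands.executeCommand("wingman.setApiKey");
    }
  }
}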

I do think that decoupling providers from the API is the path forward, since it seems possible that a standard will evolve here.

@nvms (Owner, Author) commented Jun 12, 2023

I think the scope of this PR has been met. Probably best to save the provider decoupling for another PR, after giving a bit more thought to how best to implement it.

  • SecretStorage is being used to store API keys.
  • FAQ has been added to the README.

LLaMA support remains, but it's not a great user experience; it will be greatly improved by the decoupling mentioned above.

@nvms marked this pull request as ready for review on June 12, 2023 22:35