Conditionally loaded modules #40

Open
dltn opened this issue Jun 26, 2023 · 2 comments

Comments

@dltn (Contributor) commented Jun 26, 2023

Having access to all of the best APIs in one CLI is awesome! 🚀 I've been thinking about how we could overcome the downsides:

  • ~2 seconds from launch to first API call, just from importing every provider's module
  • If I just want a CLI for OpenAI, setting up llama.cpp is overkill
  • Each module introduces its own idiosyncrasies, like google-generativeai requiring Python 3.9+ (see "Recent commit broke the gpt usage (for me)" #29)

What if we conditionally loaded modules based on the presence of OPENAI_API_KEY or a config file?
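For example, something along these lines (a minimal sketch; the config keys, provider set, and module layout here are hypothetical, not gpt-cli's actual structure):

```python
import os


def load_providers(config: dict) -> dict:
    """Import only the provider modules the user has actually configured."""
    providers = {}

    # OpenAI: load only when an API key is present in the env or config.
    if os.environ.get("OPENAI_API_KEY") or config.get("openai_api_key"):
        import openai  # deferred import: no startup cost for unused providers

        providers["openai"] = openai

    # google-generativeai: imported only when configured, so its Python 3.9+
    # requirement never affects users who don't touch it.
    if os.environ.get("GOOGLE_API_KEY") or config.get("google_api_key"):
        import google.generativeai as genai

        providers["google"] = genai

    # llama.cpp: pulled in only when a local model path is configured.
    if config.get("llama_model_path"):
        from llama_cpp import Llama

        providers["llama"] = Llama

    return providers
```

With deferred imports like this, someone who only sets OPENAI_API_KEY never pays the llama.cpp or google-generativeai import cost at startup.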

@kharvd (Owner) commented Jun 27, 2023

That sounds very reasonable, wanna try making a PR?

@kharvd (Owner) commented Jul 9, 2023

Heads up: I made Llama optional here: #46
