Support Alpaca and Llama models #8

Open
samheutmaker opened this issue Mar 25, 2023 · 5 comments
Labels
enhancement (New feature or request), help wanted (Extra attention is needed), High Priority (A high priority issue)

Comments

@samheutmaker (Contributor)

Autodoc currently relies on OpenAI for access to cutting-edge language models. Going forward, we would like to support models running locally or at providers other than OpenAI, such as Llama or Alpaca. This gives developers more control over how their code is indexed and allows indexing of private code that cannot be shared with OpenAI.

This is a big undertaking that will be an ongoing process. A few thoughts for someone who wants to get started hacking on this:

  1. It would be nice to be able to configure Autodoc with a LangChain LLM via the Autodoc config file. This would allow complete control over how an LLM is configured (see the sketch after this list).
  2. It seems like a lot of people are using llama.cpp to run LLaMA locally. It may be worth using this as a starting point for supporting other models.
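
Regarding item 1, here is a rough sketch of what a LangChain-backed LLM entry in the Autodoc config could look like. The `llm` config shape and the `createLLM` factory are purely illustrative assumptions, not the existing Autodoc API; the LangChain.js import paths may also differ between versions:

```typescript
// Hypothetical autodoc.config.json excerpt (field names are illustrative):
// {
//   "llm": { "provider": "openai", "modelName": "gpt-4", "temperature": 0 }
// }

import { OpenAI } from "langchain/llms/openai";
import { BaseLLM } from "langchain/llms/base";

interface LLMConfig {
  provider: "openai" | "llamacpp"; // illustrative set of providers
  modelName: string;
  temperature?: number;
}

// Illustrative factory: maps the config entry onto a LangChain.js LLM.
// Returning any LangChain-compatible class here is what would let users
// point Autodoc at a locally hosted model instead of OpenAI.
function createLLM(config: LLMConfig): BaseLLM {
  switch (config.provider) {
    case "openai":
      return new OpenAI({
        modelName: config.modelName,
        temperature: config.temperature ?? 0,
      });
    default:
      throw new Error(`Unsupported provider: ${config.provider}`);
  }
}
```

Supporting another backend would then just mean adding a case that returns a different LangChain-compatible class.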

This issue is high priority. If you're interested in working on it, please reach out.

samheutmaker added the enhancement, help wanted, and High Priority labels on Mar 25, 2023
@dahifi commented Mar 25, 2023

Ideally the model configuration stuff should be abstracted out to something like LangChain; it's too bad there's no TS port of it yet.

@samheutmaker (Contributor, Author)

Yes there is; we already use it in Autodoc: https://github.com/hwchase17/langchainjs

@katopz commented Mar 26, 2023

Maybe consider using https://github.com/rustformers/llama-rs in that case?
And also https://github.com/sobelio/llm-chain as a LangChain alternative.

@quicoli commented Mar 27, 2023

Have you seen this project? https://github.com/microsoft/semantic-kernel
It might help with this task.

Semantic Kernel (SK) is a lightweight SDK enabling integration of AI Large Language Models (LLMs) with conventional programming languages. The SK extensible programming model combines natural language semantic functions, traditional code native functions, and embeddings-based memory unlocking new potential and adding value to applications with AI.

SK supports prompt templating, function chaining, vectorized memory, and intelligent planning capabilities out of the box.

@cognitivetech

Ollama supports the OpenAI API.
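
A minimal sketch of how that compatibility could be used here, assuming a local Ollama server on its default port and the `openai` npm client; the model name is illustrative and would be whatever has been pulled into Ollama:

```typescript
import OpenAI from "openai";

// Ollama exposes an OpenAI-compatible endpoint at /v1 on its default port,
// so the standard OpenAI client can be pointed at it with a dummy API key.
const client = new OpenAI({
  baseURL: "http://localhost:11434/v1",
  apiKey: "ollama", // required by the client, ignored by Ollama
});

const completion = await client.chat.completions.create({
  model: "llama2", // illustrative: any model pulled into Ollama
  messages: [{ role: "user", content: "Summarize this repository." }],
});

console.log(completion.choices[0].message.content);
```

That would let Autodoc keep its existing OpenAI code path while still running against a local model.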
