
Support other/local LLMs #74

Open
olmohake opened this issue May 13, 2024 · 2 comments

Comments

@olmohake

I am very excited by the idea of text to GQL and would love to implement it for my organization. The context to send along is pretty big, though, so I'd love the option to use a local Llama instance to reduce costs. Any chance of supporting other/local LLMs in the future?

@danstarns
Member

danstarns commented Aug 28, 2024

@olmohake I shipped the adapter architecture, so you can now extend the base adapter to plug in your own LLM. Please share more details about which LLMs and usage patterns you'd like supported.

  1. Extend the base adapter: https://github.com/rocket-connect/gqlpt/tree/main/packages/adapter-base
  2. Reference implementation, the OpenAI adapter: https://github.com/rocket-connect/gqlpt/tree/main/packages/adapter-openai
```diff
- import { AdapterOpenAI } from "@gqlpt/adapter-openai";
+ import { MyAdapter } from "my-adapter";
  import { GQLPTClient } from "gqlpt";

  const typeDefs = /* GraphQL */ `
    type User {
      id: ID!
      name: String!
    }

    type Query {
      user(id: ID!): User
    }
  `;

  const client = new GQLPTClient({
    typeDefs,
-   adapter: new AdapterOpenAI({
-     apiKey: process.env.OPENAI_API_KEY,
-   }),
+   adapter: new MyAdapter(),
  });
```
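For a local Llama instance, the custom adapter in that diff could be backed by Ollama. The sketch below is a rough illustration, not the actual `@gqlpt/adapter-base` API: it assumes the adapter contract reduces to "send a prompt, get the model's text back" (the `LLMAdapter` interface, `connect`, and `sendText` names here are hypothetical stand-ins). The HTTP endpoint and payload shape follow Ollama's `/api/generate` API.

```typescript
// Hypothetical adapter contract; the real Adapter base class in
// @gqlpt/adapter-base may expose different method names and signatures.
interface LLMAdapter {
  connect(): Promise<void>;
  sendText(text: string): Promise<string>;
}

class OllamaAdapter implements LLMAdapter {
  constructor(
    private baseUrl: string = "http://localhost:11434",
    private model: string = "llama3",
  ) {}

  async connect(): Promise<void> {
    // A local Ollama instance needs no auth; a health check could go here.
  }

  // Pure helper so the request shape is easy to unit-test in isolation.
  buildPayload(prompt: string): { model: string; prompt: string; stream: boolean } {
    return { model: this.model, prompt, stream: false };
  }

  async sendText(text: string): Promise<string> {
    // Ollama's /api/generate returns { response: "<generated text>", ... }
    // when stream is false.
    const res = await fetch(`${this.baseUrl}/api/generate`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(this.buildPayload(text)),
    });
    const data = (await res.json()) as { response: string };
    return data.response;
  }
}
```

The network call is kept separate from payload construction so the adapter can be tested without a running model server.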
