[SUGGESTION] Add an API function? #13

Open
SabinStargem opened this issue May 31, 2023 · 0 comments

Comments

@SabinStargem

This is Darth Gius's suggestion on the Reddit thread.

Sure, what I'd like is a completion endpoint for my Node.js app. Most chatbots handle this by creating a local server and printing a link (so the chatbot doesn't have to be restarted every time); the external app then sends the prompt and receives the output through that link, like here (I think that's the Tauri page for the API), or this (which uses the OpenAI API in Python to generate answers). I'm no expert on this, though. For now I can (and prefer to) send context+prompt directly to your app and start it every time I need an answer. I already have Node.js code that starts a Python chatbot, sends prompts to it, and receives the outputs back in Node.js; now I would have to understand how your code works and swap the Python chatbot for your app.
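A minimal sketch of what the Node.js side of that flow could look like, assuming the app exposed a hypothetical local HTTP completion endpoint (the URL, port, and JSON shape here are all assumptions for illustration, not the app's actual API):

```typescript
// Hypothetical Node.js client for a local completion endpoint.
// Endpoint path, port, and request/response fields are assumptions;
// the real API would depend on how the Tauri app exposes it.
async function complete(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:8080/completion", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  if (!res.ok) {
    throw new Error(`Completion request failed: ${res.status}`);
  }
  const data = (await res.json()) as { text: string };
  return data.text;
}

// Usage: send context + prompt once per answer, without restarting the app.
complete("Rust is a cool programming language because")
  .then((answer) => console.log(answer))
  .catch((err) => console.error(err));
```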

On the rustformers GitHub page I see that one of the commands to generate an answer is `llm llama infer -m ggml-gpt4all-j-v1.3-groovy.bin -p "Rust is a cool programming language because"`. My basic idea for now is to change the Tauri app so that it runs `-p prompt`, where the prompt is received from my code either through the link or through a shared variable (if I don't use the link and instead start your app each time).
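As a rough sketch of that second approach from Node.js, one could spawn the `llm` CLI with the prompt passed via `-p` and capture its stdout, using the same command quoted above; the surrounding wrapper code is only an assumption for illustration:

```typescript
// Hypothetical "start the app every time" approach: run the rustformers
// `llm` CLI once per prompt and return its output. Flags and model name
// mirror the command from the rustformers README quoted above.
import { execFile } from "node:child_process";

function inferOnce(prompt: string): Promise<string> {
  return new Promise((resolve, reject) => {
    execFile(
      "llm",
      ["llama", "infer", "-m", "ggml-gpt4all-j-v1.3-groovy.bin", "-p", prompt],
      (error, stdout, stderr) => {
        if (error) return reject(new Error(stderr || error.message));
        resolve(stdout.trim());
      }
    );
  });
}

inferOnce("Rust is a cool programming language because")
  .then((answer) => console.log(answer))
  .catch((err) => console.error(err));
```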
