
Local LLM support via Ollama #53

Closed
wants to merge 17 commits into from

Conversation


@snibo13 snibo13 commented Jan 2, 2024

I set up a system using Llama 2 via Ollama and modified the settings page to let you switch between Ollama and OpenAI and select the model you want to use when running locally. I'm running on a Windows PC, so I haven't been able to validate it on macOS, but I imagine everything should work.

One note: it isn't set up to work with LlamaIndex as it stands. There is an API endpoint for Ollama to generate embeddings, so it might be possible to use a different vector database system and explicitly compute the embeddings.
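For anyone curious, computing embeddings explicitly would look roughly like the sketch below, which calls Ollama's `/api/embeddings` REST endpoint directly over HTTP. This is not code from this PR; the function and variable names are my own, and it assumes a default Ollama server on `localhost:11434`.

```javascript
// Pure helper so the request shape is easy to inspect (names are hypothetical).
function buildEmbeddingRequest(model, prompt, baseUrl = "http://localhost:11434") {
  return {
    url: `${baseUrl}/api/embeddings`,
    body: JSON.stringify({ model, prompt }),
  };
}

// Fetches an embedding vector for `prompt` from a locally running Ollama server.
async function getEmbedding(prompt, model = "llama2") {
  const { url, body } = buildEmbeddingRequest(model, prompt);
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body,
  });
  const data = await res.json();
  return data.embedding; // array of floats
}
```

The returned vectors could then be stored in whatever vector store replaces the LlamaIndex path.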

[image attachment]

Signed-off-by: Sidney Nimako <snibo13@gmail.com>
@snibo13
Author

snibo13 commented Jan 2, 2024

#6

@UdaraJay
Owner

UdaraJay commented Jan 3, 2024

this looks great @snibo13! I'll give it a test and let you know if it's ready.

are you running ollama via wsl on windows?

@snibo13
Author

snibo13 commented Jan 3, 2024

Yeah running with WSL

@UdaraJay UdaraJay changed the title Ollama Local LLM support via Ollama Jan 4, 2024
@UdaraJay
Owner

UdaraJay commented Jan 4, 2024

made a couple changes, going to use this for a bit to see if there's anything else to address, but it's looking good!

  • persist provider, model and prompt in pile config vs keychain
  • separated providers and models
  • made prompt editable
  • updated UI
pile-ai.mov

@leodknuth

Cool thing. How do I set it up? Could you offer a tutorial for this? Thanks.

@0xJeu

0xJeu commented Jan 14, 2024

Any traction on this?

@0xJeu

0xJeu commented Jan 28, 2024

Ollama finally published libraries for working with their API from within apps; I hope this is useful. BTW, something about the dmg-license package is causing errors.

"Cannot find module 'dmg-license'" is the error message

https://github.com/ollama/ollama-js
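For reference, a minimal sketch of what wiring in the ollama-js client (after `npm install ollama`) could look like. The `chat()` call mirrors the library's README; `buildChatRequest` is a hypothetical helper of my own, and the library is loaded lazily so the module parses even without it installed.

```javascript
// Pure helper (hypothetical name) so message assembly can be checked
// without a running Ollama server.
function buildChatRequest(model, userText) {
  return { model, messages: [{ role: "user", content: userText }] };
}

// Sends one user message to a local model and returns the reply text.
async function ask(userText, model = "llama2") {
  const { default: ollama } = await import("ollama");
  const response = await ollama.chat(buildChatRequest(model, userText));
  return response.message.content;
}
```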

@0xdhrv

0xdhrv commented Jul 11, 2024

A gentle nudge: I know everyone might be busy with personal matters, but could you please check if there has been any progress on this?

@httpslinus httpslinus mentioned this pull request Jul 18, 2024
@UdaraJay
Owner

Local AI support with Ollama shipped in v0.9.8.

@UdaraJay UdaraJay closed this Jul 26, 2024