Is your feature request related to a problem? Please describe.
Hello. The drawback I find is that to run the models locally, you need another tool such as Termux: you install Ollama there, run it, and then connect to it from Swift-Chat.
Describe the solution you'd like
Would it be possible to download models from within the app and run them locally? That is, in the Ollama section, you could type the name of a model to download it directly.
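As a rough sketch of what such an in-app download could build on: Ollama's REST API documents a `POST /api/pull` endpoint that takes a JSON body naming the model and streams back newline-delimited JSON progress objects. The helper below parses one such progress line into a short status string for a UI. Treat the field names (`status`, `total`, `completed`) as an assumption based on Ollama's documented streaming format, not a verified contract, and note this is a minimal sketch, not the app's actual implementation.

```python
import json

def parse_pull_progress(line: str) -> str:
    """Turn one newline-delimited JSON progress object from Ollama's
    /api/pull stream into a short human-readable status string.

    Assumed event shapes (per Ollama's API docs):
      {"status": "pulling manifest"}
      {"status": "downloading <digest>", "total": N, "completed": M}
      {"status": "success"}
    """
    event = json.loads(line)
    status = event.get("status", "")
    total = event.get("total")
    completed = event.get("completed")
    if total and completed is not None:
        # Report download progress as a percentage of total bytes.
        pct = 100 * completed / total
        return f"{status}: {pct:.0f}%"
    return status

# Example progress lines as Ollama streams them:
print(parse_pull_progress('{"status": "pulling manifest"}'))
print(parse_pull_progress(
    '{"status": "downloading sha256:abc", "total": 200, "completed": 50}'))
print(parse_pull_progress('{"status": "success"}'))
```

A model name typed in the Ollama section of the app could then be sent to `POST /api/pull` on the local Ollama server (default `http://localhost:11434`), with each streamed line fed through a parser like this to drive a progress indicator.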
Describe alternatives you've considered
Something similar to what I'm referring to is done in the project at github.com/sunshine0523/OllamaServer, where Termux is not needed; you can do all of this from the application itself.
Additional context
No response