
Local Image Generation #5

Open
issacdowling opened this issue May 13, 2023 · 10 comments

Comments

@issacdowling

Is your feature request related to a problem? Please describe.
I wish I could generate images without needing an internet connection or a separate service.

Describe the solution you'd like
I'd like to be able to run local models (since Stable Diffusion is open source anyway) using this app.

Describe alternatives you've considered
There are Web UIs (like InvokeAI), but I much prefer this native desktop app to something that runs in a browser.

Additional context
It would be nice if I could use my GPU to locally generate images, instead of relying on external services.

@0xMRTT
Contributor

0xMRTT commented May 13, 2023

Yes, that's on the roadmap (I've started working on it), and the same feature is planned for Bavarder.

@issacdowling
Author

Cool, I was just about to open the same issue on Bavarder. Right now I'm relying on distrobox to give me a separate environment that's suited to these tasks, so having a Flatpak app that does the same would be really nice.

I have two more questions, then:

  1. How far along is this?
  2. Are there also plans for local models to let you change resolution, iterations, etc.?

@0xMRTT
Contributor

0xMRTT commented May 13, 2023

Since the app has been designed to be expandable and is based on a plugin system, I can add as many preferences for each provider as I want. The issue with local models is that I need to add a way to download them.
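Imaginer's actual plugin API isn't shown in this thread, so the following is a rough illustration only: a minimal sketch of how per-provider preferences (like the resolution and iteration settings asked about above) could be modeled. All class and field names here are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Preference:
    """One user-tunable setting exposed by a provider plugin."""
    key: str
    label: str
    default: object

@dataclass
class Provider:
    """A generation backend; each provider declares its own preferences."""
    name: str
    preferences: list = field(default_factory=list)

    def settings(self):
        # Resolved settings: defaults until the user overrides them.
        return {p.key: p.default for p in self.preferences}

# A hypothetical local Stable Diffusion provider exposing the
# settings discussed above (resolution and inference steps).
local_sd = Provider(
    name="local-stable-diffusion",
    preferences=[
        Preference("width", "Image width", 512),
        Preference("height", "Image height", 512),
        Preference("steps", "Inference steps", 20),
    ],
)

print(local_sd.settings())  # {'width': 512, 'height': 512, 'steps': 20}
```

Because each provider carries its own preference list, a plugin system like this can expose different knobs per backend without changing the core app.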

@0xMRTT
Contributor

0xMRTT commented May 21, 2023

@issacdowling Local models will be available in the next release of Bavarder and will come to Imaginer soon.

You can see the documentation here

@issacdowling
Author

Is there any chance that these projects will ever get the ability to actually run these models within the Flatpak? As in, bundling something like llama.cpp for Bavarder, and whatever the equivalent tool is for Stable Diffusion in Imaginer, rather than connecting to an API for something else running locally?

Fair enough if not, as I can see how it would add a lot of complexity, but it would mean that getting set up with real local models would be much cleaner.

@0xMRTT
Contributor

0xMRTT commented May 22, 2023

The issue is that if I bundle, for example, llama.cpp inside the Flatpak, the Flatpak will be much bigger for a feature that not everyone uses...
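For context on what "bundling" would mean here: a flatpak-builder manifest lists each component as a module that is built into the Flatpak at build time. The module below is a hypothetical sketch (not from either project's real manifest) of what adding llama.cpp could look like, and it is exactly this compiled payload that would grow the Flatpak for every user:

```yaml
# Hypothetical flatpak-builder module; names and options are illustrative.
modules:
  - name: llama.cpp
    buildsystem: cmake-ninja
    config-opts:
      - -DCMAKE_BUILD_TYPE=Release
    sources:
      - type: git
        url: https://github.com/ggerganov/llama.cpp.git
        # a release tag or commit would be pinned here
```

An alternative to bundling is shipping such a module as an optional Flatpak extension, so only users who want local inference pay the download cost.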

@tio-trom

Speaking of that, what does it connect to now? A company that mines data?

@0xMRTT
Contributor

0xMRTT commented Jul 13, 2023

Hugging Face

@tio-trom

Hugging Face

Thanks! I wonder if they collect data, and what they collect via this Imaginer app.

@0xMRTT
Contributor

0xMRTT commented Jul 13, 2023

Hugging Face

Thanks! I wonder if they collect data, and what they collect via this Imaginer app.

https://huggingface.co/privacy
