
oobabooga on another server - how to set up? #3

Closed
LightTemplar opened this issue Jul 25, 2023 · 7 comments
Comments

@LightTemplar

How do I set this up if I have oobabooga installed on another machine?

@if-ai
Owner

if-ai commented Jul 26, 2023

I will add a way to set it dynamically in the settings, but for now the only way is to manually change the IP in scripts/if_prompt_mkr.py.
(screenshot: msedge_R225pCQNjW)
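Roughly, the change looks like this. This is only a sketch: the variable names are hypothetical, not the exact contents of scripts/if_prompt_mkr.py, and it assumes the extension talks to oobabooga's legacy blocking API (default port 5000) over HTTP, so pointing it at another server means replacing the hard-coded localhost address with that machine's LAN IP.

```python
# Illustrative sketch only: names are hypothetical, not the actual contents
# of scripts/if_prompt_mkr.py. Assumption: the extension calls oobabooga's
# legacy blocking API over HTTP, so the fix is to swap the localhost address
# for the LAN IP of the machine running oobabooga.
import requests

OOBABOOGA_HOST = "192.168.1.50"  # was "127.0.0.1" for a same-machine install
OOBABOOGA_PORT = 5000            # default port of the blocking API

API_URL = f"http://{OOBABOOGA_HOST}:{OOBABOOGA_PORT}/api/v1/generate"

def generate_prompt(prompt: str, max_new_tokens: int = 200) -> str:
    """Send the prompt to the remote oobabooga instance and return its reply."""
    response = requests.post(
        API_URL,
        json={"prompt": prompt, "max_new_tokens": max_new_tokens},
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["results"][0]["text"]
```

Note that oobabooga has to be started with its API enabled (the --api flag) and be reachable from the other machine on the network for this to work.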

@if-ai
Owner

if-ai commented Jul 27, 2023

I added the feature to use another IP. Check the video; it is uploading now.

@LightTemplar
Author

From the logs it looks like the connection is working! Though I still didn't get those beautiful results from your images :)
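A quick way to confirm reachability from the AUTO1111 machine, as a rough sketch (the IP is an example, and it assumes the legacy blocking API on its default port 5000):

```python
# Rough connectivity check from the AUTO1111 machine; the IP is an example
# and the endpoint assumes oobabooga's legacy blocking API on port 5000.
import requests

try:
    r = requests.post(
        "http://192.168.1.50:5000/api/v1/generate",
        json={"prompt": "Hello", "max_new_tokens": 8},
        timeout=10,
    )
    r.raise_for_status()
    print("oobabooga API reachable, sample output:", r.json()["results"][0]["text"])
except requests.RequestException as exc:
    print("Could not reach the oobabooga API:", exc)
```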

@if-ai
Owner

if-ai commented Jul 28, 2023

What prompt did you get? Could you take a screenshot of the image + prompt?

@LightTemplar
Author

LightTemplar commented Aug 8, 2023

Thanks to your video, I got it working =)
I only had to use TheBloke/WizardLM-7B-V1-0-Uncensored-SuperHOT-8K-GGML with Transformers because I don't have enough VRAM for a 13B GPTQ model with ExLlama.
My result for the prompt "(CatGirl warrior:1.2), legendary flower" was:
(generated image: 00127-2811832964)

@if-ai
Owner

if-ai commented Aug 8, 2023 via email

@LightTemplar
Author

I understand! They are two VMs on two PCs on the local network.
For oobabooga I have 5 GB of VRAM (Quadro P2200), and for AUTO1111, 12 GB of VRAM (RTX 3060).
