
So Mac can not use this? #39

Open
iwoomi opened this issue Apr 5, 2023 · 42 comments

@iwoomi commented Apr 5, 2023

Macs don't use NVIDIA graphics cards, so Macs can't use this, right?

@ekiwi111 commented Apr 5, 2023

I guess you still can, but using hybrid mode only. https://github.com/microsoft/JARVIS#configuration

@xiebruce commented Apr 5, 2023

> I guess you still can, but using hybrid mode only. https://github.com/microsoft/JARVIS#configuration

But the server needs an Nvidia graphics card:

[screenshot]

@Fermain commented Apr 5, 2023

There are various ways to configure this package depending on your resource limitations. I am using it on a Mac right now:

[screenshot]

@ethanye77

> There are various ways to configure this package depending on your resource limitations. I am using it on a Mac right now:
>
> [screenshot]

[screenshot]

I am also a Mac user, and I encountered this issue while running this line of code. Could you please tell me what I should do?

@ethanye77

Here's my issue:

[screenshot]

@Fermain commented Apr 5, 2023

The answer to your issue is on line 3 of your screenshot. Install git-lfs and try the model download step again.
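
Something like this should do it (a sketch, assuming Homebrew; the same steps come up again later in this thread):

```sh
# Install git-lfs and register it with git (one-time setup).
brew install git-lfs
git lfs install

# Re-run the model download script from the models directory.
cd models
sh download.sh
```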

@ethanye77

> The answer to your issue is on line 3 of your screenshot. Install git-lfs and try the model download step again.

[screenshot]

Thank you, your solution is very helpful, but after downloading so many files the progress is still 0%. Is this normal?

@ErikDombi (Contributor)

Yes, the LFS objects are rather large. My models folder is 275 GB personally.

@davidjhwu

Are the LFS objects absolutely necessary?
Tryna run this on my MacBook Air lol (16 GB RAM, 500 GB SSD)

@Fermain commented Apr 6, 2023

> Are the LFS objects absolutely necessary? Tryna run this on my MacBook Air lol (16 GB RAM, 500 GB SSD)

No, you can run the lite.yaml configuration to use remote models only, although this is quite limited at the moment. I suggest using an external hard drive or SSD to manage these large models.
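
Concretely, that looks something like this (a sketch, assuming you are in JARVIS/server and have already added your API keys to lite.yaml):

```sh
# Start the model server with the lite config (remote inference only, no local models).
python models_server.py --config lite.yaml
```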

@iwoomi (Author) commented Apr 6, 2023

@Fermain So if we deploy JARVIS on macOS, we can only use lite.yaml (that is, inference_mode: huggingface), right? Because if we use inference_mode: local (or inference_mode: hybrid) we need an Nvidia graphics card, and Macs have no Nvidia graphics card. Is that right?

@sirlaurie commented Apr 6, 2023

> @Fermain So if we deploy JARVIS on macOS, we can only use lite.yaml (that is, inference_mode: huggingface), right? Because if we use inference_mode: local (or inference_mode: hybrid) we need an Nvidia graphics card, and Macs have no Nvidia graphics card. Is that right?

Comment out lines 298-300 (roughly, if you haven't reformatted the file) in models_server.py, i.e. the

"midas-control": {some model here}

entry, and you can run without an Nvidia device.

@van0303 commented Apr 7, 2023

I have just downloaded the models on my Mac; I don't have an Nvidia graphics card.
I started with models_server.py --config lite.yaml and got this error message:
AssertionError: Torch not compiled with CUDA enabled

After commenting out

"midas-control": {
    "model": MidasDetector(model_path=f"{local_fold}/lllyasviel/ControlNet/annotator/ckpts/dpt_hybrid-midas-501f0c75.pt")
}

the models_server started.

@sirlaurie

did you run git lfs install?

@xmagicwu commented Apr 7, 2023

Yes, git-lfs is installed.

@xmagicwu commented Apr 7, 2023

The version is 3.3.0.

@sirlaurie

I mean that after you installed git-lfs, you need to run git lfs install first.

If you did that already, run sh download.sh again.

@xmagicwu commented Apr 7, 2023

Thanks, I'll try it

@xiebruce commented Apr 7, 2023

> There are various ways to configure this package depending on your resource limitations. I am using it on a Mac right now:
>
> [screenshot]

@Fermain @ethanye77 Did you encounter this error: #67

Solving environment: failed with initial frozen solve. Retrying with flexible solve.

@liujie316316

> Thanks, I'll try it

Hello, have you resolved this issue? I got the same error.

[screenshot]

I executed the following commands but still got an error:

pip install git-lfs
cd models
sh download.sh

@Fermain commented Apr 8, 2023

> I executed the following commands but still got an error: pip install git-lfs; cd models; sh download.sh

git-lfs is not a pip package. You can use Homebrew to install it:

brew install git-lfs

The error message states that it is not installed.

@liujie316316

> git-lfs is not a pip package. You can use Homebrew to install it:
>
> brew install git-lfs
>
> The error message states that it is not installed.

OK, thank you!

@xmagicwu

[screenshot]

My device is a MacBook M1. How do I solve this problem?

@Fermain commented Apr 10, 2023

Without Nvidia hardware, there is no solution to this particular issue. This system is not designed to run on Apple hardware and can only be used in limited ways on this platform.

@xmagicwu

How can I use it in that limited way?

@Fermain commented Apr 10, 2023

The readme contains instructions for using the model with the lite.yaml config file instead of the full config.yaml file. Add your API keys to this lite file, and run this instead of config.

@sirlaurie commented Apr 10, 2023

> [screenshot] My device is a MacBook M1. How do I solve this problem?

Check out my first post in this issue:

#39 (comment)

You don't need to change config.yaml to lite.yaml.

@xmagicwu

> Check out my first post in this issue:
>
> #39 (comment)
>
> You don't need to change config.yaml to lite.yaml.

[screenshot]

Did it work successfully?

@sirlaurie

it did

@Fermain commented Apr 10, 2023

@sirlaurie I missed that comment, very helpful - thanks

@iwoomi (Author) commented Apr 10, 2023

> Without Nvidia hardware, there is no solution to this particular issue. This system is not designed to run on Apple hardware and can only be used in limited ways on this platform.

@sirlaurie @Fermain I notice that we can set the device to "cuda" or "cpu" here:

device: cuda:0 # cuda:id or cpu

Does that mean that if I set the device to "cpu", I can run the server with inference_mode: local on a Mac, whether it has an M1/M2 chip (newer Macs) or an Intel CPU (older Macs)?

@xmagicwu commented Apr 10, 2023

> Comment out lines 298-300 (roughly, if you haven't reformatted the file) in models_server.py, i.e. the
>
> "midas-control": {some model here}
>
> entry, and you can run without an Nvidia device.

Very helpful, thanks!

But I've hit another problem: my HuggingGPT doesn't work.

[screenshot]

@sirlaurie

> @sirlaurie @Fermain I notice that we can set the device to "cuda" or "cpu" here:
>
> device: cuda:0 # cuda:id or cpu
>
> Does that mean that if I set the device to "cpu", I can run the server with inference_mode: local on a Mac, whether it has an M1/M2 chip (newer Macs) or an Intel CPU (older Macs)?

It looks like a newly added option, but unfortunately, still no.

@sirlaurie

> But I've hit another problem: my HuggingGPT doesn't work.
>
> [screenshot]

Check your network or your API quota.

@xmagicwu

Thanks!

[screenshot]

How can the generated pictures be accessed?

@xiebruce

> [screenshot]
>
> How can the generated pictures be accessed?

This is a bug: you should create "images" and "audios" folders under /path/to/JARVIS/server/public/. In theory the program should create these two folders automatically, but it doesn't, so this is a bug.
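
A quick workaround (a sketch; substitute your actual JARVIS path):

```sh
# Create the output folders the server expects but does not create itself.
mkdir -p /path/to/JARVIS/server/public/images /path/to/JARVIS/server/public/audios
```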

@xmagicwu

> This is a bug: you should create "images" and "audios" folders under /path/to/JARVIS/server/public/. In theory the program should create these two folders automatically, but it doesn't, so this is a bug.

[screenshot]

The folders have been created.

@xmagicwu

[screenshot]

[screenshot]

Why does the path for the generated images keep changing?

@xmagicwu

[screenshot]

What's wrong?

@iwoomi (Author) commented Apr 11, 2023

> [screenshot]
>
> What's wrong?

Weird, it shouldn't be like this. Please back up your lite.yaml, force-update to the latest commit, and try again.

@sirlaurie

I think the latest commit has fixed this bug. Just pull again.

@hx9111 commented Apr 12, 2023

Run the following command, as recommended, to use MPS (M1, M2, Max):

conda install pytorch torchvision torchaudio -c pytorch-nightly
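
To check whether the nightly build actually sees the Apple GPU (a sketch; torch.backends.mps.is_available() is the standard PyTorch check):

```sh
# Prints True if PyTorch was built with MPS support and an Apple GPU is available.
python -c "import torch; print(torch.backends.mps.is_available())"
```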
