Feature request: Docker image for deepspeed-mii #83

Open
Thytu opened this issue Oct 29, 2022 · 3 comments

@Thytu (Contributor) commented Oct 29, 2022

Motivation:

As a developer, I want to be able to test deepspeed-mii easily.
However, when using conda (or another Python package manager, e.g. pipenv), I still run into errors (with protobuf, for example).

Solution:

Fastest one: provide a Dockerfile that the developer/user could build to use and test deepspeed-mii (see the sketch below).
What would be amazing: on each modification of deepspeed-mii, a CI job builds the Docker image and uploads/updates it on Docker Hub.

This might take some time to do but would be great to have 🙂
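For illustration, here is a minimal sketch of what such a Dockerfile could look like (the base image tag, versions, and extra packages are my assumptions, not an official recommendation):

```Dockerfile
# Hypothetical sketch only -- the base image tag and package choices are assumptions.
FROM nvcr.io/nvidia/pytorch:22.10-py3

# MII plus the extras needed for the Stable Diffusion path.
RUN pip install deepspeed-mii "deepspeed[sd]" diffusers transformers

WORKDIR /workspace
CMD ["/bin/bash"]
```

Trying it locally would then just be `docker build -t deepspeed-mii:dev .` followed by `docker run --gpus all -it deepspeed-mii:dev`.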

@mrwyattii (Contributor)

I think this is a great idea. We have something similar right now via the DeploymentType.AML pathway; however, the Docker image we build there is intended for AzureML. We may be able to adapt the work we have done there to create a Docker image for easier deployment.
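(For reference, the AML pathway is reached through the `mii.deploy` API, roughly as in the sketch below; the task, model, and deployment names are placeholders rather than a verified configuration.)

```python
import mii

# Sketch of an AML-type deployment; the task, model, and deployment name
# below are placeholders, not a verified configuration.
mii.deploy(
    task="text-generation",
    model="bigscience/bloom-560m",
    deployment_name="bloom-560m-aml",
    deployment_type=mii.DeploymentType.AML,
)
```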

In the meantime, could you please share what errors you are running into when installing deepspeed-mii? Is the error related to protobuf versioning? We may have a bug in our requirements or packaging that should be resolved.

@Thytu (Contributor, Author) commented Nov 2, 2022

> I think this is a great idea. We have something similar right now via the DeploymentType.AML pathway; however, the Docker image we build there is intended for AzureML. We may be able to adapt the work we have done there to create a Docker image for easier deployment.

Nice! If I can help in any way I'd be happy to! It would be useful for me to have a proper image to deploy 🙂

> In the meantime, could you please share what errors you are running into when installing deepspeed-mii? Is the error related to protobuf versioning? We may have a bug in our requirements or packaging that should be resolved.

Yes, here are the steps I followed with their respective outputs:

- conda create -n mii

- conda activate mii

- conda install pip

- pip install deepspeed-mii

- python main.py:
File "/home/vdmatos/miniconda3/envs/mii-2/lib/python3.10/site-packages/mii/models/providers/diffusers.py", line 6, in diffusers_provider
    from diffusers import DiffusionPipeline
ModuleNotFoundError: No module named 'diffusers'

- pip install diffusers

- python main.py:
File "/home/vdmatos/miniconda3/envs/mii-2/lib/python3.10/site-packages/deepspeed/ops/transformer/inference/attention.py", line 22, in load_triton_flash_attn
    raise ImportError("Please install triton 2.0+ or `pip install deepspeed[sd]`")

- pip install deepspeed[sd]

- python main.py:
File "/home/vdmatos/miniconda3/envs/mii-2/lib/python3.10/site-packages/deepspeed/inference/engine.py", line 238, in _validate_args
    raise ValueError(f"model must be a torch.nn.Module, got {type(self.module)}")
ValueError: model must be a torch.nn.Module, got <class 'diffusers.pipelines.stable_diffusion.pipeline_stable_diffusion.StableDiffusionPipeline'>

(still running this example)
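For context, main.py is essentially the text-to-image example, along the lines of the sketch below; this is a paraphrase from memory, and the exact model name and config in the example I am running may differ.

```python
import mii

# Rough paraphrase of the text-to-image example being run; the model name,
# deployment name, and mii_config values here are assumptions.
mii.deploy(
    task="text-to-image",
    model="CompVis/stable-diffusion-v1-4",
    deployment_name="sd-deploy",
    mii_config={"dtype": "fp16"},
)
```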

Feel free to let me know if:

  • I made any mistake in my installation steps
  • You want me to open a separate issue for this error (rather than tracking it in this feature request)

@mrwyattii (Contributor)

@Thytu there were some recent updates to MII and DeepSpeed regarding Stable Diffusion. Your installation steps look fine; could you try running again with the latest version of each?
