
Support passing model_kwargs to pipeline #85

Open
lukealexmiller opened this issue May 10, 2023 · 1 comment
Labels
enhancement New feature or request

Comments

@lukealexmiller

I'm trying to deploy BLIP-2 (specifically Salesforce/blip2-opt-2.7b) to a SageMaker (SM) endpoint, but I'm running into some problems.

We can deploy this model by tarring the model artifacts into a model.tar.gz and hosting it on S3, but creating a ~9 GB tar file is time-consuming and leads to slow deployment feedback loops.

Alternatively, the toolkit has experimental support for downloading models from the 🤗 Hub on start, which is more time- and space-efficient.
However, this functionality currently only supports passing HF_TASK and HF_MODEL_ID as environment variables. To run inference on this model on the GPUs available in SM (T4/A10), we need to pass additional model_kwargs:

pipe = pipeline(model="Salesforce/blip2-opt-2.7b", model_kwargs={"load_in_8bit": True})

A potential solution: on line 104 of handler_service.py, passing kwargs through has not been implemented, even though the get_pipeline function already accepts kwargs.
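One possible shape for this (hypothetical — the toolkit does not read such a variable today, and the name HF_MODEL_KWARGS is my assumption) would be to accept a JSON-encoded environment variable and forward the parsed dict to get_pipeline:

```python
import json
import os


def parse_model_kwargs(environ=os.environ):
    """Hypothetical helper: read a JSON-encoded HF_MODEL_KWARGS env var,
    falling back to an empty dict when it is unset."""
    return json.loads(environ.get("HF_MODEL_KWARGS", "{}"))


# With HF_MODEL_KWARGS='{"load_in_8bit": true}' set on the endpoint,
# the parsed dict could then be forwarded on, e.g.:
#   pipe = get_pipeline(task=task, device=device, model_dir=model_dir,
#                       model_kwargs=parse_model_kwargs())
```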

@philschmid
Collaborator

Hello @lukealexmiller,

Thank you for opening the request. Adding "HF_KWARGS" as a parameter is a good idea.
In the meantime, you can enable this by creating a custom inference.py. See here for an example: https://www.philschmid.de/custom-inference-huggingface-sagemaker
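For reference, a minimal sketch of that workaround (the model_fn/predict_fn names follow the toolkit's handler convention; the "image-to-text" task string and the 8-bit flag are assumptions for the BLIP-2 case above, not a tested configuration):

```python
# inference.py -- sketch only; assumes transformers (and bitsandbytes,
# for 8-bit loading) are installed in the endpoint container.


def model_fn(model_dir):
    # Called once at endpoint startup; building the pipeline here lets us
    # pass model_kwargs that the default handler does not expose.
    from transformers import pipeline  # imported lazily so the sketch is inspectable

    return pipeline(
        "image-to-text",
        model=model_dir,
        model_kwargs={"load_in_8bit": True},
    )


def predict_fn(data, pipe):
    # data is the deserialized request payload; pipe is model_fn's return value
    return pipe(data["inputs"])
```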
