Consider using huggingface_hub library for Inference #42

Closed
Wauplin opened this issue Apr 5, 2023 · 8 comments
Assignees: tricktreat
Labels: enhancement (New feature or request)

Comments

Wauplin commented Apr 5, 2023

In awesome_chat.py, you implement the inference method huggingface_model_inference (which uses the Hugging Face Hub under the hood) using requests and custom code. One thing you can do to deduplicate some of that code is to use the huggingface_hub library, and in particular its InferenceApi class. You can find a guide here on how to use it.

For example, you can replace:

if task == "text-to-image":
    text = data["text"]
    response = requests.post(task_url, headers=HUGGINGFACE_HEADERS, json={"inputs": text})
    img_data = response.content
    img = Image.open(BytesIO(img_data))

with:

if task == "text-to-image":
    img = InferenceApi(repo_id=model_id, task=task, token=token)(inputs=data["text"])

Some advantages of using huggingface_hub are:

  • it checks that the model and task are compatible
  • if the user is logged in via huggingface-cli login, the token is retrieved automatically (see the sketch after this list)
  • some (limited) parsing of the output when possible (e.g. image => PIL)
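
For illustration, here is a minimal sketch of the last two points (not from the original thread; the model id is only an example and it assumes you have already run huggingface-cli login):

from huggingface_hub.inference_api import InferenceApi

# No token argument: the token stored by `huggingface-cli login` is picked up automatically.
# No task argument: the task is inferred from the model's pipeline tag (text-to-image here).
inference = InferenceApi(repo_id="runwayml/stable-diffusion-v1-5")  # example model id
img = inference(inputs="an astronaut riding a horse")  # output is parsed into a PIL.Image
img.save("astronaut.png")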

Disclaimer: I am a maintainer of huggingface_hub. If you need any help with the integration, I'd be happy to help! 😃

@tricktreat tricktreat self-assigned this Apr 5, 2023
@tricktreat tricktreat pinned this issue Apr 5, 2023
tricktreat (Collaborator) commented

@Wauplin Thanks for your practical suggestion, we'll incorporate it into the plan!

@tricktreat tricktreat added the enhancement New feature or request label Apr 5, 2023
tricktreat (Collaborator) commented

Just to be sure: does huggingface_hub support all tasks? @Wauplin

Wauplin (Author) commented Apr 5, 2023

@tricktreat all tasks that the Inference API on the Hub supports are supported by this client (it's "just" a wrapper around requests). However, only a few tasks will have their output parsed correctly. Otherwise, you can get the raw response from the server (see the reference).
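
For example, getting the raw response would look something like this (a sketch only: the raw_response flag and the example model are illustrative, not taken from this thread):

from huggingface_hub.inference_api import InferenceApi

# Example model id: any model with a supported pipeline tag works here.
inference = InferenceApi(repo_id="distilbert-base-uncased-finetuned-sst-2-english")
# raw_response=True returns the underlying requests.Response instead of parsed output.
response = inference(inputs="HuggingGPT looks great!", raw_response=True)
print(response.status_code)
print(response.json())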

If you have feedback on the InferenceApi client itself, please let me know. I'd be glad to improve it if needed :)

tricktreat (Collaborator) commented Apr 5, 2023

@Wauplin Thanks. I simply tried the image-to-image task and got {'error': 'Task image-to-image is invalid'}. I think maybe not all tasks are supported.

import io
from diffusers.utils import load_image
from huggingface_hub.inference_api import InferenceApi

def image_to_bytes(img_url):
    # Download the image and serialize it to raw JPEG bytes for the API call.
    img_byte = io.BytesIO()
    load_image(img_url).save(img_byte, format="jpeg")
    img_data = img_byte.getvalue()
    return img_data

# Token redacted; the model's pipeline tag is image-to-image.
inference = InferenceApi("lambdalabs/sd-image-variations-diffusers", token="****")
result = inference(data=image_to_bytes("https://raw.githubusercontent.com/justinpinkney/stable-diffusion/main/assets/im-vars-thin.jpg"))
print(result)

I also checked the code, and here is what seems to be the list of currently supported tasks.

Wauplin (Author) commented Apr 5, 2023

Oh right, thanks for noticing! This list is out of date then. I created an issue on huggingface_hub (huggingface/huggingface_hub#1424)

⚠️ You copy-pasted your token into the snippet above. I advise you to revoke it right now at https://huggingface.co/settings/tokens. For future tests you can use huggingface-cli login to log in on your machine, so that you don't have to paste the token in plain text in your scripts.

tricktreat (Collaborator) commented

Thanks for the reminder. I've edited it.

Wauplin (Author) commented Apr 5, 2023

@tricktreat You really need to revoke it, i.e. create a new one. You can do that in your settings by clicking Manage > Invalidate and refresh. You'll need to update the token in the applications that use it, but it'll be safer this way.

[Screenshot: token settings page showing the Manage > Invalidate and refresh option]

On GitHub, even when you edit a comment, the previous version of the text can still be accessed (see the little arrow to the right of "edited" on your comment). So your token is still visible to everyone at the moment 😕

tricktreat (Collaborator) commented

@Wauplin Thanks again! I was negligent about this. I have now completed the token reset according to your instructions.

tricktreat added a commit that referenced this issue Apr 5, 2023
@tricktreat tricktreat unpinned this issue Apr 6, 2023