
Error when querying hugging face model #56

Closed
Shivak11 opened this issue Nov 9, 2023 · 1 comment
@Shivak11
Shivak11 commented Nov 9, 2023

I tried the Hugging Face bot script and am encountering a 404 error. It looks like Modal is expecting the model to be run locally. Is that correct?

@anmolsingh95 (Contributor) commented:

I just tried the Hugging Face script and it seems to be working for me. We are using the AsyncInferenceClient from the huggingface_hub library. According to the comments in the code:

The model to run inference with. Can be a model id hosted on the Hugging Face Hub, e.g. bigcode/starcoder or a URL to a deployed Inference Endpoint. Defaults to None, in which case a recommended model is automatically selected for the task.
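The fallback behavior that docstring describes can be sketched in plain Python. This is a hypothetical illustration only; resolve_model and RECOMMENDED_MODELS are made-up names for this sketch, not part of the huggingface_hub API:

```python
from typing import Optional

# Illustrative table of recommended defaults per task (not real library data).
RECOMMENDED_MODELS = {
    "text-generation": "bigcode/starcoder",  # placeholder default
}

def resolve_model(model: Optional[str], task: str) -> str:
    """Sketch of how the `model` argument is interpreted:
    a Hub model id, a URL to a deployed Inference Endpoint,
    or None (fall back to a recommended model for the task)."""
    if model is None:
        # No model given: pick the recommended model for the task.
        return RECOMMENDED_MODELS[task]
    if model.startswith(("http://", "https://")):
        # Already a URL to a deployed Inference Endpoint.
        return model
    # Otherwise treat it as a model id hosted on the Hugging Face Hub.
    return model
```

So a 404 usually means the resolved model id or endpoint URL does not exist (or is not reachable), rather than the client requiring a locally running model.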
