Description
Support Private Models
In general, the feature you want added should be supported by HuggingFace's transformers library:
- If requesting a model, it must be listed here.
- If requesting a pipeline, it must be listed here.
- If requesting a task, it must be listed here.
With the Python libraries, you can download models and tokenizers from the Hub either by providing an auth token directly or by being logged in via `huggingface-cli login --token=.....`
Ideally, transformers.js would offer the same support when a user is logged in locally and the auth token is available in the environment.
I just tried to access meta-llama/Llama-2-13b-hf, which requires a license agreement and can only be downloaded with a token.
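For context, the Hub authenticates gated/private downloads with an `Authorization: Bearer <token>` header on each request. A minimal sketch of how transformers.js could attach it is below; note that `env.authToken` is a hypothetical option proposed here, not an existing transformers.js API, while `HF_TOKEN` is the environment variable the Python tooling already honors:

```javascript
// Hypothetical sketch for this feature request: build the request headers
// transformers.js could send when fetching files from the Hugging Face Hub.
// `env.authToken` is an assumed (not yet existing) config option.
function buildHubHeaders(env) {
  const headers = {};
  // Prefer an explicitly configured token; fall back to the HF_TOKEN
  // environment variable so a locally logged-in user "just works".
  const token = env.authToken ?? process.env.HF_TOKEN;
  if (token) {
    headers["Authorization"] = `Bearer ${token}`;
  }
  return headers;
}
```

With something like this in the download path, private and gated repos would resolve exactly as they do for the Python libraries, with no token ever sent for public, unauthenticated use.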
Reason for request
Why is it important that we add this feature? What is your intended use case? Remember, we are more likely to add support for models/pipelines/tasks that are popular (e.g., many downloads), or contain functionality that does not exist (e.g., new input type).
This puts model access on par with the Python libraries, so that private and gated models can be accessed.
Additional context
Add any other context or screenshots about the feature request here.