Releases: huggingface/huggingface_hub
v0.2.1: Patch release
This is a patch release fixing an issue with the notebook login.
Fixed in commit `5e2da9b`.
v0.2.0: Access tokens, skip large files, local files only
Access tokens
Version v0.2.0 introduces access token compatibility with the Hub. Access tokens become the main login handler, while logging in with username/password remains possible by pressing [Ctrl/CMD]+C on the login prompt:
The notebook login is adapted to work with the access tokens.
Skipping large files
The `Repository` class now has an additional parameter, `skip_lfs_files`, which allows cloning a repository while skipping the download of large files.
Local files only for snapshot_download
The `snapshot_download` method can now take `local_files_only` as a parameter, enabling reuse of previously downloaded files.
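To illustrate the intent, here is a minimal sketch of the `local_files_only` semantics. The cache layout and the `resolve_snapshot` helper are hypothetical, for illustration only, not the library's actual implementation:

```python
import os

def resolve_snapshot(cache_dir, repo_id, local_files_only=False):
    """Hypothetical sketch: reuse a cached snapshot when available,
    and fail fast instead of downloading when local_files_only=True."""
    path = os.path.join(cache_dir, repo_id.replace("/", "--"))
    if os.path.isdir(path):
        return path  # previously downloaded files are reused
    if local_files_only:
        raise FileNotFoundError(
            f"{repo_id} is not cached and local_files_only=True forbids downloading"
        )
    # in the real library, this is where the download would happen
    return path
```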
v0.1.2: Patch release
What's Changed
- `clean_ok` should be `True` by default by @LysandreJik in #462
Full Changelog: v0.1.1...v0.1.2
v0.1.1: Patch release
What's Changed
- Fix typing-extensions minimum version by @lhoestq in #453
- Fix argument order in `create_repo` for `Repository.clone_from` by @sgugger in #459
Full Changelog: v0.1.0...v0.1.1
v0.1.0: Optional token, `HfApi` begone, git prune
What's Changed
Version v0.1.0 is the first minor release of the `huggingface_hub` package, which promises better stability for upcoming versions. This update comes with big quality-of-life improvements.
Make token optional in all HfApi methods by @sgugger in #379
Previously, most methods of the `HfApi` class required the token to be passed explicitly. As of this version, it defaults to the token stored in the cache. This results in a re-ordering of arguments, but backward compatibility is preserved in most cases. Where it is not, an explicit error is raised.
Root methods instead of HfApi by @LysandreJik in #388
The `HfApi` class now exposes its methods at the root of the `hf_api` module, reducing the friction to access these helpers. See the example below:
```python
# Previously
from huggingface_hub import HfApi
api = HfApi()
user = api.whoami()

# Now
from huggingface_hub.hf_api import whoami
user = whoami()
```
The `HfApi` class can still be imported and works as before for backward compatibility.
Add `list_repo_files` util by @sgugger in #395
Offers a `list_repo_files` helper to ... list the repo files! Supports both model repositories and dataset repositories.
Add helper to generate an eval result `model-index`, with proper typing by @julien-c in #382
Offers a `metadata_eval_result` helper to generate a YAML block to put in model cards according to evaluation results.
Add metrics to API by @mariosasko in #429
Adds a `list_metrics` method to `HfApi`!
Git prune by @LysandreJik in #450
Adds a `git_prune` method to the `Repository` class. It prunes local files that are no longer needed because they have already been pushed to a remote.
It also adds the argument `auto_lfs_prune` to `git_push` and to the `commit` context manager for simpler handling.
Bug fixes
- Fix `HfApi.create_repo` when `repo_type` is 'space' by @nateraw in #394
- Last fixes for `datasets`' `push_to_hub` method by @LysandreJik in #415
Full Changelog: v0.0.19...v0.1.0
v0.0.18: Repo metadata, git tags, Keras mixin
Repository metadata (@julien-c)
Version v0.0.18 of `huggingface_hub` includes tools to manage repository metadata. The following example reads metadata from a repository:
```python
from huggingface_hub import Repository

repo = Repository("xxx", clone_from="yyy")
data = repo.repocard_metadata_load()
```
The following example completes that metadata before writing it to the repository locally.
```python
data["license"] = "apache-2.0"
repo.repocard_metadata_save(data)
```
Git tags (@AngledLuffa)
Tag management is now available! Add, check, and delete tags locally or remotely directly from the `Repository` utility.
- Tags #323 (@AngledLuffa)
Revisited Keras support (@nateraw)
The Keras mixin has been revisited:
- It now saves models as `SavedModel` objects rather than `.h5` files.
- It now offers methods that can be leveraged as a functional API, instead of having to use the Mixin as an actual mixin.
Improvements and bug fixes
v0.0.17: Non-blocking git push, notebook login
Non-blocking git-push
The pushing methods now accept a `blocking` boolean parameter indicating whether the push should block (the default) or happen asynchronously.
To check whether an asynchronous push has finished, or to inspect its status code (to spot a failure), use the `command_queue` property on the `Repository` object.
For example:
```python
from huggingface_hub import Repository

repo = Repository("<local_folder>", clone_from="<user>/<model_name>")

with repo.commit("Commit message", blocking=False):
    # Save data
    ...

last_command = repo.command_queue[-1]

# Status of the push command; returns the status code:
#    -> -1 indicates the push is still ongoing
#    -> 0 indicates the push has completed successfully
#    -> a non-zero code is the error code if there was an error
last_command.status

# If there was an error, the stderr may be inspected
last_command.stderr

# Whether the command finished or if it is still ongoing
last_command.is_done

# Whether the command errored out
last_command.failed
```
When using `blocking=False`, the commands are tracked and your script will exit only when all pushes are done, even if other errors happen in your script (a failed push counts as done).
- Non blocking git push #315 (@LysandreJik)
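As a quick reference, the status codes above can be summarized with a small helper. This is a sketch for illustration only; `describe_push_status` is not part of the library:

```python
def describe_push_status(code: int) -> str:
    """Map a push command's status code, as documented above, to a label."""
    if code == -1:
        return "ongoing"   # the push is still running
    if code == 0:
        return "success"   # the push completed successfully
    return "failed"        # any other value is the error code
```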
Notebook login (@sgugger)
The `huggingface_hub` library now has a `notebook_login` method which can be used to log in from notebooks with no access to the shell. In a notebook, log in with the following:
```python
from huggingface_hub import notebook_login

notebook_login()
```
Improvements and bugfixes
- added option to create private repo #319 (@philschmid)
- display git push warnings #326 (@elishowk)
- Allow specifying data with the Inference API wrapper #271 (@osanseviero)
- Add auth to snapshot download #340 (@lewtun)
v0.0.16: Progress bars, git credentials
The `huggingface_hub` version v0.0.16 introduces several quality-of-life improvements.
Progress bars in Repository
Progress bars are now visible with many git operations, such as pulling, cloning and pushing:
```python
>>> from huggingface_hub import Repository
>>> repo = Repository("local_folder", clone_from="huggingface/CodeBERTa-small-v1")
Cloning https://huggingface.co/huggingface/CodeBERTa-small-v1 into local empty directory.
Download file pytorch_model.bin:  45%|████████████████████████████▋ | 144M/321M [00:13<00:12, 14.7MB/s]
Download file flax_model.msgpack: 42%|██████████████████████████▌ | 134M/319M [00:13<00:13, 14.4MB/s]
```
Branching support
There is now branching support in `Repository`. The snippet below clones the `xxx` repository and checks out the `new-branch` revision: if it is an existing branch on the remote, that branch is checked out; if it is another revision, such as a commit or a tag, that revision is checked out; if the revision does not exist, a branch is created from the latest commit on the `main` branch.
```python
>>> from huggingface_hub import Repository
>>> repo = Repository("local", clone_from="xxx", revision="new-branch")
```
Once the repository is instantiated, it is possible to manually check out revisions using the `git_checkout` method. If the revision already exists:
```python
>>> repo.git_checkout("main")
```
If a branch should be created from the current head when it does not exist:
```python
>>> repo.git_checkout("brand-new-branch", create_branch_ok=True)
Revision `brand-new-branch` does not exist. Created and checked out branch `brand-new-branch`
```
Finally, the `commit` context manager has a new `branch` parameter to specify to which branch the utility should push:
```python
>>> with repo.commit("New commit on branch brand-new-branch", branch="brand-new-branch"):
...     # Save any file or model here; it will be committed to that branch.
...     torch.save(model.state_dict(), "model_state.pt")
```
Git credentials
The login system has been redesigned to leverage `git-credential` instead of a token-based authentication system. It uses the `git-credential store` helper. If you're unfamiliar with it, you may see the following when logging in with `huggingface_hub`:
```
_|    _|  _|    _|    _|_|_|    _|_|_|  _|_|_|  _|      _|    _|_|_|      _|_|_|_|    _|_|      _|_|_|  _|_|_|_|
_|    _|  _|    _|  _|        _|          _|    _|_|    _|  _|            _|        _|    _|  _|        _|
_|_|_|_|  _|    _|  _|  _|_|  _|  _|_|    _|    _|  _|  _|  _|  _|_|      _|_|_|    _|_|_|_|  _|        _|_|_|
_|    _|  _|    _|  _|    _|  _|    _|    _|    _|    _|_|  _|    _|      _|        _|    _|  _|        _|
_|    _|    _|_|      _|_|_|    _|_|_|  _|_|_|  _|      _|    _|_|_|      _|        _|    _|    _|_|_|  _|_|_|_|

Username:
Password:
Login successful
Your token has been saved to /root/.huggingface/token
Authenticated through git-credential store but this isn't the helper defined on your machine.
You will have to re-authenticate when pushing to the Hugging Face Hub. Run the following command in your terminal to set it as the default

git config --global credential.helper store
```
Running the command `git config --global credential.helper store` will set this as the default way to handle credentials for git authentication. All repositories instantiated with the `Repository` utility will have this helper set by default, so no action is required on your part when leveraging it.
Improved logging
The logging system is now similar to the existing logging systems in `transformers` and `datasets`, based on a `logging` module that controls the entire library's logging level:
```python
>>> from huggingface_hub import logging
>>> logging.set_verbosity_error()
>>> logging.set_verbosity_info()
```
Bug fixes and improvements
- Add documentation to GitHub and the Hub docs about the Inference client wrapper #253 (@osanseviero)
- Have large files enabled by default when using `Repository` #219 (@LysandreJik)
- Clarify/specify/document model card metadata, `model-index`, and pipeline/task types #265 (@julien-c)
- [model_card][metadata] Actually, lets make dataset.name required #267 (@julien-c)
- Progress bars #261 (@LysandreJik)
- Add keras mixin #230 (@nateraw)
- Open source code related to the repo type (tag icon, display order, snippets) #273 (@osanseviero)
- Branch push to hub #276 (@LysandreJik)
- Git credentials #277 (@LysandreJik)
- Push to hub/commit with branches #282 (@LysandreJik)
- Better logging #262 (@LysandreJik)
- Remove custom language pack behavior #291 (@LysandreJik)
- Update Hub and huggingface_hub docs #293 (@osanseviero)
- Adding a handler #292 (@LysandreJik)
v0.0.15: Documentation, bug fixes and misc improvements
Improvements and bugfixes
- [Docs] Update link to Gradio documentation #206 (@abidlabs)
- Fix title typo (Cliet -> Client) #207 (@cakiki)
- add _from_pretrained hook #159 (@nateraw)
- Add `filename` option to `lfs_track` #212 (@LysandreJik)
- Repository fixes #213 (@LysandreJik)
- Repository documentation #214 (@LysandreJik)
- Add datasets filtering and sorting #194 (@lhoestq)
- doc: sync github to spaces #221 (@borisdayma)
- added batch transform documentation & model archive documentation #224 (@philschmid)
- Sync with hf internal #228 (@mishig25)
- Adding batching support for superb #215 (@Narsil)
- Adding SD for superb (speech-classification). #225 (@Narsil)
- Use Hugging Face fork for s3prl #229 (@lewtun)
- Mv `interfaces` -> `widgets/lib/interfaces` #227 (@mishig25)
- Tweak to prevent accidental sharing of token #226 (@julien-c)
- Fix CLI-based repo creation #234 (@osanseviero)
- Add proxify util function #235 (@mishig25)
v0.0.14: LFS Auto tracking, `dataset_info` and `list_datasets`, documentation
Datasets
Dataset repositories get better support, first by enabling full usage of the `Repository` class for dataset repositories:
```python
from huggingface_hub import Repository

repo = Repository("local_directory", clone_from="<user>/<model_id>", repo_type="dataset")
```
Datasets can now be retrieved from the Python runtime using the `list_datasets` method of the `HfApi` class:
```python
from huggingface_hub import HfApi

api = HfApi()
datasets = api.list_datasets()
len(datasets)
# 1048 publicly available dataset repositories at the time of writing
```
Information can be retrieved for specific datasets using the `dataset_info` method of the `HfApi` class:
```python
from huggingface_hub import HfApi

api = HfApi()
api.dataset_info("squad")
# DatasetInfo: {
#     id: squad
#     lastModified: 2021-07-07T13:18:53.595Z
#     tags: ['pretty_name:SQuAD', 'annotations_creators:crowdsourced', 'language_creators:crowdsourced', 'language_creators:found',
# [...]
```
- Add dataset_info and list_datasets #164 (@lhoestq)
- Enable dataset repositories #151 (@LysandreJik)
Inference API wrapper client
Version v0.0.14 introduces a wrapper client for the Inference API. No need to craft custom `requests` calls anymore. See below for an example.
```python
from huggingface_hub import InferenceApi

api = InferenceApi("bert-base-uncased")
api(inputs="The [MASK] is great")
# [
#     {'sequence': 'the music is great', 'score': 0.03599703311920166, 'token': 2189, 'token_str': 'music'},
#     {'sequence': 'the price is great', 'score': 0.02146693877875805, 'token': 3976, 'token_str': 'price'},
#     {'sequence': 'the money is great', 'score': 0.01866752654314041, 'token': 2769, 'token_str': 'money'},
#     {'sequence': 'the fun is great', 'score': 0.01654735580086708, 'token': 4569, 'token_str': 'fun'},
#     {'sequence': 'the effect is great', 'score': 0.015102624893188477, 'token': 3466, 'token_str': 'effect'}
# ]
```
- Inference API wrapper client #65 (@osanseviero)
Auto-track with LFS
Version v0.0.14 introduces an auto-tracking mechanism with git-lfs for large files. Files larger than 10MB can be automatically tracked using the `auto_track_large_files` method:
```python
from huggingface_hub import Repository

repo = Repository("local_directory", clone_from="<user>/<model_id>")

# save large files in `local_directory`
repo.git_add()
repo.auto_track_large_files()
repo.git_commit("Add large files")
repo.git_push()
# No push rejected error anymore!
```
It is automatically used when leveraging the `commit` context manager:
```python
from huggingface_hub import Repository

repo = Repository("local_directory", clone_from="<user>/<model_id>")

with repo.commit("Add large files"):
    # add large files
    ...
# No push rejected error anymore!
```
- Auto track with LFS #177 (@LysandreJik)
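The tracking rule above can be sketched as a simple size check. This is a rough illustration assuming a flat 10MB threshold; the `files_to_track` helper is hypothetical, and the real heuristics live in `auto_track_large_files`:

```python
import os

LFS_THRESHOLD = 10 * 1024 * 1024  # 10MB, as described above

def files_to_track(directory):
    """Return paths of files larger than the LFS threshold."""
    large = []
    for root, _, files in os.walk(directory):
        for name in files:
            path = os.path.join(root, name)
            if os.path.getsize(path) > LFS_THRESHOLD:
                large.append(path)
    return sorted(large)
```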
Documentation
- Update docs structure #145 (@Pierrci)
- Update links to docs #147 (@LysandreJik)
- Add new repo guide #153 (@osanseviero)
- Add documentation for endpoints #155 (@osanseviero)
- Document hf.co webhook publicly #156 (@julien-c)
- docs: ✏️ mention the Training metrics tab #193 (@severo)
- doc for Spaces #189 (@julien-c)
Breaking changes
Reminder: the `huggingface_hub` library follows semantic versioning and is under active development. Until the first major version (v1.0.0) is out, you should expect breaking changes, and we strongly recommend pinning the library to a specific version.
Two breaking changes are introduced with version v0.0.14.
The `whoami` return value changes from a tuple to a dictionary
- Allow obtaining Inference API tokens with whoami #157 (@osanseviero)
The `whoami` method changes its return value from a tuple of `(<user>, [<organisations>])` to a dictionary containing much more information.
In versions v0.0.13 and below, this was the behavior of the `whoami` method of the `HfApi` class:
```python
from huggingface_hub import HfFolder, HfApi

api = HfApi()
api.whoami(HfFolder.get_token())
# ('<user>', ['<org_0>', '<org_1>'])
```
In version v0.0.14, this is updated to the following:
```python
from huggingface_hub import HfFolder, HfApi

api = HfApi()
api.whoami(HfFolder.get_token())
# {
#     'type': str,
#     'name': str,
#     'fullname': str,
#     'email': str,
#     'emailVerified': bool,
#     'apiToken': str,
#     'plan': str,
#     'avatarUrl': str,
#     'orgs': List[str]
# }
```
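If existing code still expects the old tuple, a small compatibility shim can bridge the two shapes. `whoami_compat` is a hypothetical helper, not part of the library:

```python
def whoami_compat(info: dict) -> tuple:
    """Convert the new whoami dictionary to the old (user, orgs) tuple."""
    return (info["name"], info["orgs"])
```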
The `Repository` class's `use_auth_token` initialization parameter now defaults to `True`
The `use_auth_token` initialization parameter of the `Repository` class now defaults to `True`. The behavior is unchanged if users are not logged in, in which case `Repository` remains agnostic to `huggingface_hub`.
- Set use_auth_token to True by default #204 (@LysandreJik)
Improvements and bugfixes
- Add sklearn code snippet #133 (@osanseviero)
- Allow passing only model ID to clone when authenticated #150 (@LysandreJik)
- More robust endpoint with toggled staging endpoint #148 (@LysandreJik)
- Add config to list_models #152 (@osanseviero)
- Fix audio-to-audio widget and add icon #142 (@osanseviero)
- Upgrade spaCy to api 0.0.12 and remove allowlist #161 (@osanseviero)
- docs: fix webhook response format #162 (@severo)
- Update link in README.md #163 (@nateraw)
- Revert "docs: fix webhook response format (#162)" #165 (@severo)
- Add Keras docker image #117 (@osanseviero)
- Allow multiple models when testing a pipeline #124 (@osanseviero)
- scikit rebased #170 (@Narsil)
- Upgrading community frameworks to `audio-to-audio`. #94 (@Narsil)
- Add sagemaker docs #173 (@philschmid)
- Add Structured Data Classification as task #172 (@osanseviero)
- Fixing keras outputs (widgets was ignoring because of type mismatch, now testing for it) #176 (@Narsil)
- Updating spacy. #179 (@Narsil)
- Create initial superb docker image structure #181 (@osanseviero)
- Upgrading asteroid image. #175 (@Narsil)
- Removing tests on huggingface_hub for unrelated changes in api-inference-community #180 (@Narsil)
- Fixing audio-to-audio validation. #184 (@Narsil)
- rmdir `api-inference-community/src/sentence-transformers` #188 (@Pierrci)
- Allow generic inference for ASR for superb #185 (@osanseviero)
- Add timestamp to snapshot download tests #201 (@LysandreJik)
- No need for token to understand HF urls #203 (@LysandreJik)
- Remove `--no_renames` argument to list deleted files. #205 (@LysandreJik)