The latest version of transformers is extremely noisy: every `from_pretrained` call prints a tqdm progress bar and a multi-line LOAD REPORT, and a background auto-conversion thread dumps an unhandled traceback into the console. There seems to be no reason or benefit for any of it.
>>> from transformers import AutoModel
>>> bert_model = AutoModel.from_pretrained('hfl/chinese-electra-180g-large-discriminator')
Loading weights: 100%|████████████████████████████████████████████| 389/389 [00:00<00:00, 5460.73it/s, Materializing param=encoder.layer.23.output.dense.weight]
ElectraModel LOAD REPORT from: hfl/chinese-electra-180g-large-discriminator
Key                                               | Status
--------------------------------------------------+-----------
electra.embeddings.position_ids                   | UNEXPECTED
discriminator_predictions.dense_prediction.bias   | UNEXPECTED
discriminator_predictions.dense.weight            | UNEXPECTED
discriminator_predictions.dense_prediction.weight | UNEXPECTED
discriminator_predictions.dense.bias              | UNEXPECTED
Notes:
- UNEXPECTED: can be ignored when loading from a different task/architecture; not ok if you expect an identical arch.
>>>
Exception in Thread-auto_conversion:
Traceback (most recent call last):
File "/nlp/scr/horatio/miniconda3/lib/python3.13/site-packages/huggingface_hub/utils/_http.py", line 720, in hf_raise_for_status
response.raise_for_status()
~~~~~~~~~~~~~~~~~~~~~~~~~^^
File "/nlp/scr/horatio/miniconda3/lib/python3.13/site-packages/httpx/_models.py", line 829, in raise_for_status
raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '403 Forbidden' for url 'https://huggingface.co/api/models/hfl/chinese-electra-180g-large-discriminator/discussions?p=0'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/403
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/nlp/scr/horatio/miniconda3/lib/python3.13/threading.py", line 1043, in _bootstrap_inner
self.run()
~~~~~~~~^^
File "/nlp/scr/horatio/miniconda3/lib/python3.13/threading.py", line 994, in run
self._target(*self._args, **self._kwargs)
~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nlp/scr/horatio/miniconda3/lib/python3.13/site-packages/transformers/safetensors_conversion.py", line 117, in auto_conversion
raise e
File "/nlp/scr/horatio/miniconda3/lib/python3.13/site-packages/transformers/safetensors_conversion.py", line 96, in auto_conversion
sha = get_conversion_pr_reference(api, pretrained_model_name_or_path, **cached_file_kwargs)
File "/nlp/scr/horatio/miniconda3/lib/python3.13/site-packages/transformers/safetensors_conversion.py", line 69, in get_conversion_pr_reference
pr = previous_pr(api, model_id, pr_title, token=token)
File "/nlp/scr/horatio/miniconda3/lib/python3.13/site-packages/transformers/safetensors_conversion.py", line 14, in previous_pr
for discussion in get_repo_discussions(repo_id=model_id, token=token):
~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/nlp/scr/horatio/miniconda3/lib/python3.13/site-packages/huggingface_hub/hf_api.py", line 6455, in get_repo_discussions
discussions, has_next = _fetch_discussion_page(page_index=page_index)
~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^
File "/nlp/scr/horatio/miniconda3/lib/python3.13/site-packages/huggingface_hub/hf_api.py", line 6444, in _fetch_discussion_page
hf_raise_for_status(resp)
~~~~~~~~~~~~~~~~~~~^^^^^^
File "/nlp/scr/horatio/miniconda3/lib/python3.13/site-packages/huggingface_hub/utils/_http.py", line 802, in hf_raise_for_status
raise _format(HfHubHTTPError, message, response) from e
huggingface_hub.errors.HfHubHTTPError: (Request ID: Root=1-69a62017-4041fa5968ff8ee13d64fe3d;47725ac8-64d6-4947-ad16-4f88ef89391c)
403 Forbidden: Discussions are disabled for this repo.
Cannot access content at: https://huggingface.co/api/models/hfl/chinese-electra-180g-large-discriminator/discussions?p=0.
Make sure your token has the correct permissions.
>>>
KeyboardInterrupt
>>> from transformers import AutoModel
>>> bert_model = AutoModel.from_pretrained('hfl/chinese-electra-180g-large-discriminator')
>>>
System Info
python: 3.13.5
torch: 2.7.1+cu118
transformers: 5.2.0
tokenizers: 0.22.2
Who can help?
@ArthurZucker @Cyrilvallez
Reproduction
Load a checkpoint that is only published as PyTorch .bin weights:

>>> from transformers import AutoModel
>>> bert_model = AutoModel.from_pretrained('hfl/chinese-electra-180g-large-discriminator')

On transformers 5.2.0 this prints a progress bar and a full LOAD REPORT on every call, and on the first call the background safetensors auto-conversion thread crashes with the unhandled 403 traceback shown above, because discussions are disabled on that repo.
Expected behavior
Expected behavior would be nice and quiet! Loading a model should not print a progress bar or a LOAD REPORT by default (or at most behind an opt-in verbosity setting), and a failed background auto-conversion should never dump an unhandled traceback into the user's console.
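For anyone else hitting this, a partial workaround sketch (assuming transformers 5.x still honors the `TRANSFORMERS_VERBOSITY` and `HF_HUB_DISABLE_PROGRESS_BARS` environment variables; I have not verified them against this exact release) is to lower verbosity before importing the library:

```python
import os

# Set these before importing transformers so they take effect at import time.
# TRANSFORMERS_VERBOSITY controls the library's logger level;
# HF_HUB_DISABLE_PROGRESS_BARS suppresses huggingface_hub's tqdm bars.
os.environ["TRANSFORMERS_VERBOSITY"] = "error"
os.environ["HF_HUB_DISABLE_PROGRESS_BARS"] = "1"

# from transformers import AutoModel
# model = AutoModel.from_pretrained('hfl/chinese-electra-180g-large-discriminator')
```

But users should not need per-script boilerplate just to get a quiet load, which is the point of this issue.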