
Cannot use HuggingFace models #516

Open
derritter88 opened this issue Mar 31, 2024 · 3 comments

Comments

@derritter88
Expected Behavior

Download a model from Hugging Face and use it.

Current Behavior

I get the following error message in the browser: offload_weight() takes from 3 to 4 positional arguments but 5 were given.
Console output is below.

Steps to Reproduce

  1. Install hugging-face binding
  2. Restart LoLLMS
  3. Download a model like https://huggingface.co/NousResearch/Hermes-2-Pro-Mistral-7B
  4. Click on the model to enable it; the error appears.

Console log

Requested updating of setting model_name to Nous-Hermes-2-Mistral-7B-DPO
Changing model to: Nous-Hermes-2-Mistral-7B-DPO
Building model
Nous-Hermes-2-Mistral-7B-DPO
*-*-*-*-*-*-*-*
Cuda VRAM usage
*-*-*-*-*-*-*-*
{'nb_gpus': 1, 'gpu_0_total_vram': 12878610432, 'gpu_0_used_vram': 2595225600, 'gpu_0_model': 'NVIDIA GeForce RTX 4070 Ti'}
Cleared cache
*-*-*-*-*-*-*-*
Cuda VRAM usage
*-*-*-*-*-*-*-*
{'nb_gpus': 1, 'gpu_0_total_vram': 12878610432, 'gpu_0_used_vram': 2595225600, 'gpu_0_model': 'NVIDIA GeForce RTX 4070 Ti'}
Creating tokenizer C:\Users\mmuehlbacher\lollms\personal_data\models\transformers\Nous-Hermes-2-Mistral-7B-DPO
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
Recovering generation config C:\Users\mmuehlbacher\lollms\personal_data\models\transformers\Nous-Hermes-2-Mistral-7B-DPO
Creating model C:\Users\mmuehlbacher\lollms\personal_data\models\transformers\Nous-Hermes-2-Mistral-7B-DPO
Using device map: auto
Loading checkpoint shards:  33%|█████████████████████████████████████████████████████████▋                                                                                                                   | 1/3 [00:06<00:13,  6.74s/it]
Traceback (most recent call last):
  File "C:\Users\mmuehlbacher\lollms\lollms-webui\zoos\bindings_zoo\hugging_face\__init__.py", line 266, in build_model
    self.model:AutoModelForCausalLM = AutoModelForCausalLM.from_pretrained(str(model_path),
                                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\mmuehlbacher\lollms\installer_files\lollms_env\Lib\site-packages\transformers\models\auto\auto_factory.py", line 561, in from_pretrained
    return model_class.from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\mmuehlbacher\lollms\installer_files\lollms_env\Lib\site-packages\transformers\modeling_utils.py", line 3502, in from_pretrained
    ) = cls._load_pretrained_model(
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\mmuehlbacher\lollms\installer_files\lollms_env\Lib\site-packages\transformers\modeling_utils.py", line 3926, in _load_pretrained_model
    new_error_msgs, offload_index, state_dict_index = _load_state_dict_into_meta_model(
                                                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\mmuehlbacher\lollms\installer_files\lollms_env\Lib\site-packages\transformers\modeling_utils.py", line 798, in _load_state_dict_into_meta_model
    state_dict_index = offload_weight(param, param_name, model, state_dict_folder, state_dict_index)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: offload_weight() takes from 3 to 4 positional arguments but 5 were given


Couldn't load the model C:\Users\mmuehlbacher\lollms\personal_data\models\transformers\Nous-Hermes-2-Mistral-7B-DPO
Here is the error encountered during loading:
offload_weight() takes from 3 to 4 positional arguments but 5 were given
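This TypeError has the classic shape of an arity mismatch between caller and callee, which usually points to version skew between two packages, here plausibly transformers and accelerate (an assumption; the log does not show the installed versions). The names below are stand-ins, not the real library code; the sketch only reproduces the same error shape:

```python
# Hypothetical stand-in (NOT the actual transformers/accelerate code):
# a function with the signature the error message reports, i.e. 3 required
# positional parameters plus 1 optional one.
def offload_weight(weight, weight_name, offload_folder, index=None):
    return index

# A caller built against a different version passes 5 positional arguments,
# which raises exactly the TypeError seen in the console log above.
try:
    offload_weight("param", "param_name", "model", "folder", {})
except TypeError as e:
    print(e)
# -> offload_weight() takes from 3 to 4 positional arguments but 5 were given
```

If version skew is indeed the cause, upgrading both packages together (pip install --upgrade transformers accelerate) inside the lollms environment may resolve it, though that is a guess based on the traceback rather than a confirmed fix.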
@derritter88
Author

Additional information: I am using version 9.4.

@derritter88
Author

The issue occurs on Windows and within WSL.

@Rainmanqxy

Exactly the same issue here. I tried different models downloaded from Hugging Face (including GGUF); none of them worked.
