
ImportError: Using `load_in_8bit=True` requires Accelerate: `pip install accelerate` and the latest version of bitsandbytes `pip install -i https://test.pypi.org/simple/ bitsandbytes` or `pip install bitsandbytes` when in reality it's a torch issue #837

Open
huggingface/transformers #38528

Description

@dataf3l

bitsandbytes reports this error:

(venv) ➜  image-captioning-v2 python captionit3.py
True
False
Traceback (most recent call last):
  File "/Users/b/study/ml/image-captioning-v2/captionit3.py", line 14, in <module>
    model = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-opt-6.7b-coco", device_map='auto', quantization_config=nf4_config)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/b/study/ml/image-captioning-v2/venv/lib/python3.11/site-packages/transformers/modeling_utils.py", line 2616, in from_pretrained
    raise ImportError(
ImportError: Using `load_in_8bit=True` requires Accelerate: `pip install accelerate` and the latest version of bitsandbytes `pip install -i https://test.pypi.org/simple/ bitsandbytes` or pip install bitsandbytes`
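
Before from_pretrained raises, the two underlying conditions can be checked directly. A minimal diagnostic sketch (not part of the original captionit3.py) that tests the same things transformers' availability helper tests:

# Hypothetical diagnostic snippet: check whether torch imports at all and
# whether a CUDA device is visible, which is what is_bitsandbytes_available()
# ultimately requires.
try:
    import torch
    print("torch importable:", True)
    print("torch.cuda.is_available():", torch.cuda.is_available())
except ImportError as exc:
    print("torch importable:", False, "-", exc)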

However, the error is inaccurate. The actual issue is in this function:

def is_bitsandbytes_available():
    if not is_torch_available():
        return False

    # bitsandbytes throws an error if cuda is not available
    # let's avoid that by adding a simple check
    import torch

    return _bitsandbytes_available and torch.cuda.is_available()

and if somebody accidentally uninstalls torch, this happens.
So maybe one should improve the error message,
and maybe emitting a message here telling the user "unable to import torch" would be useful, who knows.
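
A rough sketch of what such a message could look like (illustrative only, not an actual transformers change; is_torch_available and _bitsandbytes_available are the existing internals from the snippet above):

import logging

logger = logging.getLogger(__name__)

def is_bitsandbytes_available():
    # Sketch only: say explicitly when torch is the real problem, instead of
    # letting the later "install accelerate/bitsandbytes" ImportError take the blame.
    if not is_torch_available():
        logger.warning(
            "Unable to import torch, so bitsandbytes 8-bit/4-bit loading is disabled. "
            "Reinstall PyTorch (`pip install torch`) if this is unexpected."
        )
        return False

    # bitsandbytes throws an error if cuda is not available
    import torch

    return _bitsandbytes_available and torch.cuda.is_available()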

tell your friends! :)

Activity

Its3rr0rsWRLD commented on Oct 29, 2023

Are you on macOS? Had the same issue on it, swapped to Windows (remote SSH) and now I'm searching for a different issue lol

oushu1zhangxiangxuan1 commented on Oct 30, 2023

I got the same ERROR

SoyGema commented on Oct 30, 2023

> Are you on macOS? Had the same issue on it, swapped to Windows (remote SSH) and now I'm searching for a different issue lol

Yes. Same issue on macOS.

effortprogrammer commented on Nov 20, 2023

Same issue... are there any updates?

pechaut78 commented on Nov 27, 2023

same issue

RamsesCamas commented on Dec 7, 2023

Same issue

pechaut78 commented on Dec 7, 2023

Well, as said above, the error is not that the lib is not properly installed: the error message is misleading.
The issue is that bitsandbytes is not implemented on Apple Silicon (MPS).
So bitsandbytes cannot be used and the code should be adapted! sigh

Please see:

#485
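
For illustration, one way to adapt the calling code from the original report when CUDA is unavailable (a sketch only: it assumes the nf4_config from the traceback was a BitsAndBytesConfig, and the fp16 fallback on MPS/CPU is just one possible choice):

import torch
from transformers import BitsAndBytesConfig, Blip2ForConditionalGeneration

model_id = "Salesforce/blip2-opt-6.7b-coco"

if torch.cuda.is_available():
    # bitsandbytes quantization needs a CUDA device.
    nf4_config = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_quant_type="nf4")
    model = Blip2ForConditionalGeneration.from_pretrained(
        model_id, device_map="auto", quantization_config=nf4_config
    )
else:
    # On Apple Silicon (or any machine without CUDA), skip quantization and
    # load in half precision on MPS or CPU instead.
    device = "mps" if torch.backends.mps.is_available() else "cpu"
    model = Blip2ForConditionalGeneration.from_pretrained(
        model_id, torch_dtype=torch.float16
    ).to(device)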

github-actions commented on Dec 31, 2023

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

TimDettmers (Collaborator) commented on Jan 8, 2024

This is a great catch. Can you please submit this to the transformers github repo? This is only indirectly a bitsandbytes issue.

Titus-von-Koeller (Collaborator) commented on Jan 26, 2024

To me it's not entirely clear where Mac comes into play and how we would best warn that Mac is not supported.

@pechaut78 how did you deduce that it must be Mac related? And why does the code get triggered that is throwing the traceback?

younesbelkada (Collaborator) commented on Jan 29, 2024

Hi - the core issue is that currently in transformers, is_bitsandbytes_available() silently returns False if you don't have a CUDA device, i.e. if torch.cuda.is_available() is False: https://github.com/huggingface/transformers/blob/cd2eb8cb2b40482ae432d97e65c5e2fa952a4f8f/src/transformers/utils/import_utils.py#L623
This is not ideal, as we should display a more informative warning instead. @Titus-von-Koeller would you be happy to open a quick PR on transformers to add a logger.info when torch.cuda.is_available() is False, to clearly state to users that is_bitsandbytes_available() will silently be set to False? Otherwise happy to do it as well.
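
A sketch of what that logger.info could look like (wording is illustrative; logger, is_torch_available, and _bitsandbytes_available stand for the existing module-level names in transformers/utils/import_utils.py):

def is_bitsandbytes_available():
    if not is_torch_available():
        return False

    import torch

    # Proposed change (sketch): explain the silent False instead of letting a
    # misleading "install accelerate/bitsandbytes" ImportError surface later.
    if not torch.cuda.is_available():
        logger.info(
            "bitsandbytes is installed, but no CUDA device was detected, so "
            "is_bitsandbytes_available() will return False."
        )
        return False

    return _bitsandbytes_available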

An issue type was set on Feb 28, 2025, and the "Low Risk" and "High Risk" labels (risk of bugs in transformers and other libraries) were added and removed on the same day.
ved1beta (Contributor) commented on Jun 2, 2025

@matthewdouglas @Titus-von-Koeller, can you please have a look here?


Metadata

Labels

Cross Platform · Huggingface Related (a bug that is likely due to the interaction between bnb and HF libs: transformers, accelerate, peft)

    Type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

      Development

Participants

@pechaut78 @dataf3l @TimDettmers @Titus-von-Koeller @SoyGema

Issue #837 · bitsandbytes-foundation/bitsandbytes