
'CLIPTextTransformer' object has no attribute '_build_causal_attention_mask' #12

Open
adhikjoshi opened this issue Jun 9, 2023 · 5 comments

adhikjoshi commented Jun 9, 2023

[Screenshot 2023-06-09 at 1.15.21 PM: error traceback]

While running the test.ipynb file, we ran into this error:

'CLIPTextTransformer' object has no attribute '_build_causal_attention_mask'

We followed the same process as in the installation instructions.
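
For context, the failing pattern looks roughly like this (a minimal reproduction sketch; the checkpoint name is illustrative, and any CLIP text model on a sufficiently new transformers release should raise the same AttributeError):

import torch
from transformers import CLIPTextModel

# CLIPTextModel.text_model is the CLIPTextTransformer the error message refers to.
text_model = CLIPTextModel.from_pretrained("openai/clip-vit-large-patch14").text_model

# Code written against older transformers (like this repo's test.ipynb) calls a
# private helper that newer transformers releases removed:
mask = text_model._build_causal_attention_mask(1, 77, torch.float32)
# -> AttributeError: 'CLIPTextTransformer' object has no attribute '_build_causal_attention_mask'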


owoshch commented Jun 9, 2023

Subscribing to this issue, I'm encountering the same problem


smko77 commented Jun 12, 2023

I'm also encountering the same problem


owoshch commented Jun 14, 2023

pip install --upgrade transformers==4.25.1 did the job for me.

By default, the line from the repo, pip install transformers >= 4.25.1, installs transformers 4.30.2. By pinning it to 4.25.1 I managed to run the cell below without the error.

# Cell from test.ipynb: sample with the pre-trained pipeline and show the result.
kwargs = sampling_kwargs(prompt=prompt,
                         step=50,
                         cfg=5.0,
                         fusion=False)
image = pipe(ref_image_latent=gt_latents, ref_image_embed=vision_hidden_states, **kwargs).images[0]
image_pretrained_model = get_concat_h(input_img_, image)
print("Results before fine-tuning")
image_pretrained_model.show()
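
To confirm the pin took effect, a quick sanity check (a minimal sketch; hasattr just probes for the private method the notebook relies on, and the checkpoint name is illustrative):

import transformers
from transformers import CLIPTextModel

print(transformers.__version__)  # should print 4.25.1 after the pinned install

text_model = CLIPTextModel.from_pretrained("openai/clip-vit-large-patch14").text_model
# On 4.25.x the private helper still exists; later releases removed it.
print(hasattr(text_model, "_build_causal_attention_mask"))  # True on 4.25.1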

jadechip commented

pip install --upgrade transformers==4.25.1

This worked for me as well.


syguan96 commented Nov 5, 2023

The Transformers API has changed. On newer versions, this should work:

from transformers.models.clip.modeling_clip import _make_causal_mask

causal_attention_mask = _make_causal_mask(
    (pseudo_hidden_states.shape[0], pseudo_hidden_states.shape[1]),
    pseudo_hidden_states.dtype,
    device=pseudo_hidden_states.device,
)
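
A self-contained version of that replacement (a sketch assuming transformers 4.30.x, where _make_causal_mask is still defined in modeling_clip; later releases moved mask construction again, into transformers.modeling_attn_mask_utils, so the import may need adjusting there; the tensor dimensions are stand-ins for the repo's pseudo_hidden_states):

import torch
from transformers.models.clip.modeling_clip import _make_causal_mask

# Stand-in for pseudo_hidden_states: (batch, sequence length, hidden size)
# with CLIP-typical dimensions.
batch_size, seq_len, hidden_dim = 2, 77, 768
pseudo_hidden_states = torch.randn(batch_size, seq_len, hidden_dim)

causal_attention_mask = _make_causal_mask(
    (pseudo_hidden_states.shape[0], pseudo_hidden_states.shape[1]),
    pseudo_hidden_states.dtype,
    device=pseudo_hidden_states.device,
)
print(causal_attention_mask.shape)  # torch.Size([2, 1, 77, 77])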
