
RuntimeError: Expected x1.dtype() == cos.dtype() to be true, but got false. (Could this error message be improved? If so, please report an enhancement request to PyTorch.) #472

Open
JerryDaHeLian opened this issue Nov 17, 2023 · 1 comment

Comments

@JerryDaHeLian

When I pre-train LLaMA, I get this error:
File "/usr/local/lib/python3.8/dist-packages/torch/autograd/function.py", line 551, in apply
return super().apply(*args, **kwargs) # type: ignore[misc]
File "/home/xxx/TinyLlama/lit_gpt/fused_rotary_embedding.py", line 39, in forward
rotary_emb.apply_rotary(
RuntimeError: Expected x1.dtype() == cos.dtype() to be true, but got false. (Could this error message be improved? If so, please report an enhancement request to PyTorch.)

Who can help me? Thank you!

@haiduo

haiduo commented Jan 13, 2024

I have solved it! The bug stems from this line:

dtype=idx.dtype,

I changed it to:

from transformers.utils import is_torch_bf16_gpu_available
dtype=torch.bfloat16 if is_torch_bf16_gpu_available() else torch.float16,
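To illustrate why this works: the fused kernel asserts that the query/key tensor and the cached cos/sin tables share a dtype, so the fix above forces the rotary cache to the half-precision dtype the model actually trains in. A minimal sketch of the same idea (the function names below are my own, not TinyLlama's; `pick_rope_dtype` approximates what `is_torch_bf16_gpu_available` checks, and `apply_rotary_ref` is a plain unfused rotary reference, not the `rotary_emb.apply_rotary` CUDA kernel):

```python
import torch


def pick_rope_dtype() -> torch.dtype:
    # Assumption: roughly what transformers.utils.is_torch_bf16_gpu_available
    # tests -- bf16 needs a CUDA device that reports bf16 support (Ampere+).
    if torch.cuda.is_available() and torch.cuda.is_bf16_supported():
        return torch.bfloat16
    return torch.float16


def apply_rotary_ref(x: torch.Tensor, cos: torch.Tensor, sin: torch.Tensor) -> torch.Tensor:
    # Cast the cached cos/sin tables to x's dtype up front, so a check like
    # `x1.dtype() == cos.dtype()` in a fused kernel could never fail.
    cos = cos.to(x.dtype)
    sin = sin.to(x.dtype)
    # Split the head dimension in half and rotate the two halves.
    x1, x2 = x.chunk(2, dim=-1)
    return torch.cat([x1 * cos - x2 * sin, x1 * sin + x2 * cos], dim=-1)
```

The alternative to patching the cache dtype is to cast cos/sin at the call site, as the reference above does; either way the invariant is that both operands reach the kernel in the same dtype.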
