
Error p.attn_bias_ptr is not correctly aligned when testing #1

Closed

poedator opened this issue Mar 2, 2024 · 1 comment
poedator commented Mar 2, 2024

I tried to test the code with the run_A100.sh script, but got this error:

$:~/sequoia/tests$ bash run_A100.sh
...
Traceback (most recent call last):
  File "/extra_disk_1/optimus/sequoia/tests/testbed.py", line 293, in <module>
    simulation_fast(target_model=target_model, draft_model=draft_model, dataloader=dataloader, T=args.T, top_p=args.P,
  File "/extra_disk_1/optimus/sequoia/tests/testbed.py", line 68, in simulation_fast
    spectree = SpecTree(prefix=input_ids.squeeze(0), device='cuda:0', temperature=T,
  File "/extra_disk_1/optimus/sequoia/tests/../Tree/SpecTree.py", line 68, in __init__
    draft_model_outputs = self.draft_model_engine.inference(input_ids=self.tokens[:self.num_nodes].unsqueeze(0), 
  File "/home/optimus/conda/envs/py9/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/extra_disk_1/optimus/sequoia/tests/../Engine/Engine.py", line 242, in inference
    return self.engine.model_run(input_ids=input_ids, storage_ids=storage_ids,
  File "/home/optimus/conda/envs/py9/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/extra_disk_1/optimus/sequoia/tests/../Engine/Engine.py", line 38, in model_run
    logits = self.model(input_ids=input_ids, 
  File "/home/optimus/conda/envs/py9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/optimus/conda/envs/py9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "/extra_disk_1/optimus/sequoia/tests/../Engine/Llama_model.py", line 201, in forward
    outputs = self.model(
  File "/home/optimus/conda/envs/py9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/optimus/conda/envs/py9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "/extra_disk_1/optimus/sequoia/tests/../Engine/Llama_model.py", line 59, in forward
    layer_outputs = decoder_layer(
  File "/home/optimus/conda/envs/py9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/optimus/conda/envs/py9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "/extra_disk_1/optimus/sequoia/tests/../Engine/Llama_modules.py", line 334, in forward
    hidden_states = self.self_attn(
  File "/home/optimus/conda/envs/py9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/optimus/conda/envs/py9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "/extra_disk_1/optimus/sequoia/tests/../Engine/Llama_modules.py", line 127, in forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(
RuntimeError: p.attn_bias_ptr is not correctly aligned
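
Note: this error is raised by the memory-efficient backend of torch.nn.functional.scaled_dot_product_attention when the attention bias / attn_mask tensor is not aligned in memory (its last dimension does not produce aligned rows). A commonly reported workaround, shown here only as a sketch with illustrative names that are not part of Sequoia, is to pad the mask's last dimension to a multiple of 8 and slice it back, so the underlying storage stays aligned:

import torch
import torch.nn.functional as F

def aligned_bias(bias: torch.Tensor, multiple: int = 8) -> torch.Tensor:
    # Assumes a floating-point (additive) bias. Pad the last dim up to a
    # multiple of `multiple`, then slice back to the original length; the
    # returned view keeps the padded, aligned storage underneath, and the
    # padded columns are never read by the kernel.
    pad = (-bias.shape[-1]) % multiple
    if pad == 0:
        return bias
    padded = F.pad(bias, (0, pad))
    return padded[..., : bias.shape[-1]]

q = torch.randn(1, 32, 127, 64, device="cuda", dtype=torch.float16)
k = torch.randn(1, 32, 127, 64, device="cuda", dtype=torch.float16)
v = torch.randn(1, 32, 127, 64, device="cuda", dtype=torch.float16)
mask = torch.zeros(1, 32, 127, 127, device="cuda", dtype=torch.float16)
out = F.scaled_dot_product_attention(q, k, v, attn_mask=aligned_bias(mask))

Whether this applies here depends on how Sequoia builds its tree attention mask; pinning library versions, as suggested below, may be the simpler fix.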

My library versions:

$:~/sequoia/tests$ pip list | grep -e transformers -e torch -e accelerate
accelerate                        0.26.1
torch                             2.1.0
torchaudio                        0.13.1
torchvision                       0.14.1
transformers                      4.37.2
poedator changed the title from "Multiple errors in testing: p.attn_bias_ptr is not correctly aligned, Invalid pattern: '**'" to "Error p.attn_bias_ptr is not correctly aligned when testing" on Mar 2, 2024
dreaming-panda (Contributor) commented:
Can you switch to torch 2.1.2 and transformers 4.36.2?
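
For reference, that version pin would look like this in a pip-managed environment (assuming no other dependency constraints):

pip install torch==2.1.2 transformers==4.36.2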
