
tt_lib.tensor.bmm and matmul - inconsistent documentation #8342

Closed
nemanjagrujic opened this issue May 10, 2024 · 1 comment

nemanjagrujic (Contributor) commented May 10, 2024

For tt_lib.tensor.bmm and tt_lib.tensor.matmul ops:

The documentation does not state that the operands need to be on the device. But when the operation is run with inputs in system memory, we get this error:

```
input_tensor_a.storage_type() == StorageType::DEVICE and input_tensor_b.storage_type() == StorageType::DEVICE
info:
Operands to matmul need to be on device!
```
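The failure comes from a storage-type precondition on the inputs, not from the matmul math itself. A rough, self-contained Python sketch of that kind of validation (the `StorageType`, `Tensor`, and `validate_matmul_inputs` names here are hypothetical stand-ins, not the actual tt_lib/ttnn implementation):

```python
from enum import Enum, auto

class StorageType(Enum):
    HOST = auto()    # tensor data lives in system memory
    DEVICE = auto()  # tensor data lives on the accelerator

class Tensor:
    def __init__(self, storage_type: StorageType):
        self.storage_type = storage_type

def validate_matmul_inputs(a: Tensor, b: Tensor) -> None:
    # Mirrors the check quoted in the error message above:
    # both operands must already reside on the device.
    if not (a.storage_type == StorageType.DEVICE
            and b.storage_type == StorageType.DEVICE):
        raise RuntimeError("Operands to matmul need to be on device!")

# A host-resident operand trips the check:
try:
    validate_matmul_inputs(Tensor(StorageType.HOST), Tensor(StorageType.DEVICE))
except RuntimeError as e:
    print(e)  # Operands to matmul need to be on device!
```

In practice this means moving both tensors to the device (e.g. with the library's host-to-device transfer API) before calling the op; the documentation fix requested here is simply to state that precondition.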
@nemanjagrujic nemanjagrujic added bug Something isn't working op_cat: mm labels May 10, 2024
@nemanjagrujic nemanjagrujic changed the title tt_lib.tensor.bmm - inconsistent documentation tt_lib.tensor.bmm and matmul - inconsistent documentation May 10, 2024
@bbradelTT bbradelTT self-assigned this Jul 10, 2024
bbradelTT (Contributor) commented
tt_lib bmm/matmul no longer exist; everything in Python goes through ttnn matmul/linear. I updated the documentation for the latter to indicate that the tensors need to be on the device.

2 participants