
Extend support to tensors #12

Open

mortendahl opened this issue Jul 2, 2019 · 2 comments

@mortendahl (Member)

Right now we only support matrices; using e.g. https://eigen.tuxfamily.org/dox-devel/unsupported/eigen_tensors.html we might be able to support more general tensors as well.

@justin1121 (Member)

Using Eigen::Tensor is definitely the way to go about supporting more general tensors. I wanted to understand whether this would be a big change or not, so I dug into it a bit. TL;DR: it's a pretty big/involved change.

The main reason it's a big change is that you can't dynamically specify the rank of a tensor, whereas you can dynamically specify the rows/cols of a matrix. For matrices you can do:

Eigen::Matrix<float, Eigen::Dynamic, Eigen::Dynamic> a(5, 6);

Eigen::Dynamic above means that the number of rows and columns is only fixed at runtime; here the object is created with 5 rows and 6 columns.

With tensors the rank cannot be set to dynamic:

Eigen::Tensor<float, 4> b(4, 5, 6, 7);

Here the template parameter 4 is the rank, which is fixed at compile time, and the constructor arguments then set the dimension sizes to 4, 5, 6 and 7. So when storing this Tensor object in a class, you would need a member for every rank of tensor you want to support, as in the sketch below.
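
For illustration, a minimal sketch of what that could look like; the class and member names are hypothetical, and a real implementation might prefer something like std::variant instead of one field per rank:

#include <unsupported/Eigen/CXX11/Tensor>

// Hypothetical wrapper: because the rank is a compile-time template parameter,
// the class needs a separate member (or variant alternative) per supported rank.
struct DenseTensor {
  int rank = 0;  // indicates which member below is the active one
  Eigen::Tensor<float, 1> t1;
  Eigen::Tensor<float, 2> t2;
  Eigen::Tensor<float, 3> t3;
  Eigen::Tensor<float, 4> t4;
};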

There's a lot to be learned from TensorFlow and how it supports tensors. It uses Eigen::Tensor under the hood, and the way it handles this is by using the Eigen::TensorMap object and storing all the values in a flat buffer attached to the tensorflow::Tensor object. Eigen::TensorMap takes an external buffer and presents it as a tensor. We'll probably have to do something similar, which is why it's a bit more work.
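
A rough sketch of that TensorMap pattern, assuming a flat std::vector<float> stands in for whatever buffer our own tensor class would own (buffer name and shape are made up):

#include <vector>
#include <unsupported/Eigen/CXX11/Tensor>

int main() {
  // Flat buffer owned outside of Eigen (here a std::vector, standing in for
  // the buffer our own tensor class would hold); it carries no rank information.
  std::vector<float> buffer(4 * 5 * 6 * 7, 0.0f);

  // View the buffer as a rank-4 tensor without copying. The rank is still a
  // compile-time template parameter, but only at the point where the view is made.
  Eigen::TensorMap<Eigen::Tensor<float, 4>> view(buffer.data(), 4, 5, 6, 7);

  view(0, 1, 2, 3) = 42.0f;  // writes straight through to the underlying buffer
  return 0;
}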

Another route I was thinking about was adding a dtype to TensorFlow for big tensors. This would be pretty cool, but it's not clear whether this is actually possible or how involved it would be to make it work.

@mortendahl (Member, Author) commented Jul 3, 2019

> There's a lot to be learned from TensorFlow and how it supports tensors. It uses Eigen::Tensor under the hood, and the way it handles this is by using the Eigen::TensorMap object and storing all the values in a flat buffer attached to the tensorflow::Tensor object. Eigen::TensorMap takes an external buffer and presents it as a tensor. We'll probably have to do something similar, which is why it's a bit more work.

Excellent, thanks for the write-up @justin1121! FWIW I agree that following TF makes a lot of sense.

> Another route I was thinking about was adding a dtype to TensorFlow for big tensors. This would be pretty cool, but it's not clear whether this is actually possible or how involved it would be to make it work.

If this works, it would be great!
