Ponder: Tensor type? #18

Closed

migueldeicaza opened this issue Oct 31, 2018 · 3 comments
@migueldeicaza
Contributor

Perhaps there should be a TorchSharp.Tensor type that proxies to the right storage tensor, with an abstract interface, so that people can write generic-ish code that deals with Tensors, rather than having different data types.

Bonus points: even better would be for the Tensor not to surface a Tensor<T>, as that would defeat the reusability at that point. The downside is that operations would have to check type compatibility dynamically.
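
For illustration only, a minimal sketch of that design: the names below (Tensor, FloatStorageTensor, AddSameType, ElementType) are hypothetical and are not TorchSharp's actual API. A non-generic Tensor base class proxies operations to a storage-specific implementation and validates element types at runtime, which is the dynamic compatibility check mentioned above.

```csharp
using System;

// Hypothetical sketch, not TorchSharp API: a non-generic Tensor facade that
// proxies to a typed storage implementation and checks element-type
// compatibility at runtime.
public abstract class Tensor
{
    public abstract Type ElementType { get; }
    public abstract long[] Shape { get; }

    // Operations validate operand types dynamically instead of via generics.
    public Tensor Add(Tensor other)
    {
        if (other.ElementType != ElementType)
            throw new InvalidOperationException(
                $"Element type mismatch: {ElementType} vs {other.ElementType}");
        return AddSameType(other);
    }

    protected abstract Tensor AddSameType(Tensor other);
}

// One concrete, storage-backed implementation per element type.
public sealed class FloatStorageTensor : Tensor
{
    private readonly float[] data;
    private readonly long[] shape;

    public FloatStorageTensor(float[] data, long[] shape)
    {
        this.data = data;
        this.shape = shape;
    }

    public override Type ElementType => typeof(float);
    public override long[] Shape => shape;

    protected override Tensor AddSameType(Tensor other)
    {
        var rhs = (FloatStorageTensor)other;
        var result = new float[data.Length];
        for (int i = 0; i < data.Length; i++)
            result[i] = data[i] + rhs.data[i];
        return new FloatStorageTensor(result, shape);
    }
}
```

The trade-off is visible in Add: mismatched element types surface as runtime exceptions rather than compile-time errors, which is exactly the downside noted above.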

migueldeicaza pushed a commit that referenced this issue Jun 24, 2019
Add the ability to specify loss functions as delegates
@dsyme
Contributor

dsyme commented May 13, 2020

We can close this, as there is a TorchTensor type?

@interesaaat
Contributor

I think this issue was about adding generics to the Tensor type. I tried and failed miserably. We can close this if we agree that generics are not necessary.
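
To illustrate the difficulty (a hypothetical sketch, not code from that attempt): before .NET 7's generic math (System.Numerics.INumber<T>), C# offered no generic constraint guaranteeing that T supports arithmetic operators, so a generic Tensor<T> could not simply write data[i] + other.data[i] and typically ended up dispatching on the element type at runtime anyway.

```csharp
using System;

// Hypothetical illustration, not TorchSharp code: why Tensor<T> was awkward
// before generic math (System.Numerics.INumber<T>) existed.
public sealed class Tensor<T> where T : struct
{
    private readonly T[] data;

    public Tensor(T[] data) => this.data = data;

    public Tensor<T> Add(Tensor<T> other)
    {
        // `data[i] + other.data[i]` does not compile: no constraint says
        // that T supports '+'. A common workaround is runtime dispatch on
        // the element type, which erases most of the benefit of generics.
        if (typeof(T) == typeof(float))
        {
            var a = (float[])(object)data;
            var b = (float[])(object)other.data;
            var r = new float[a.Length];
            for (int i = 0; i < a.Length; i++) r[i] = a[i] + b[i];
            return new Tensor<T>((T[])(object)r);
        }
        throw new NotSupportedException($"Unsupported element type {typeof(T)}.");
    }
}
```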

@migueldeicaza
Contributor Author

The work on @Partydonk could have helped.

shaltielshmid added a commit to shaltielshmid/TorchSharp that referenced this issue Mar 13, 2024
* Reviewed all modified modules and continued, made adjustments and continued with rehaul

* Added linear properties

* Fixed int/long in linear