Currently, we have a hard-coded dependency on TensorFlow, but this need not be the case. Many external researchers use JAX, PyTorch, or even raw NumPy, and might not feel comfortable switching over to TensorFlow just to experiment with tensor networks.
Our goal is to get more people using TensorNetwork for ML and physics research and to have these computations run on accelerated hardware. TensorFlow is one option for this, but maybe not the best one for some researchers.
The only things we use from TensorFlow are `tensordot`, `svd`, `reshape`, `transpose`, and array slicing, all of which exist in JAX/PyTorch/NumPy.
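For concreteness, a rough illustration (not taken from our codebase) of how directly those ops map onto the other libraries:

```python
import numpy as np
import jax.numpy as jnp
import torch

a, b = np.ones((2, 3)), np.ones((3, 4))

# tensordot has essentially the same signature everywhere:
np.tensordot(a, b, axes=1)
jnp.tensordot(jnp.asarray(a), jnp.asarray(b), axes=1)
torch.tensordot(torch.from_numpy(a), torch.from_numpy(b), dims=1)

# svd, reshape, transpose, and slicing are equally direct:
np.linalg.svd(a, full_matrices=False)
a.reshape(3, 2)
a.transpose()
a[:, 0]
```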
I propose we do something similar to what Keras does and allow users to hot-swap these backends without affecting the public API. (We've already designed the API to abstract away as much TensorFlow as possible.)
This can be done with a `Backend` class that can be subclassed to make `TensorflowBackend`, `JaxBackend`, etc. We would only need to support the functions we actually use, which is quite a small set.
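A minimal sketch of what I have in mind (class and method names here are placeholders, not a final API):

```python
import numpy as np


class Backend:
  """Abstract interface covering only the ops the library actually needs."""

  def tensordot(self, a, b, axes):
    raise NotImplementedError

  def reshape(self, tensor, shape):
    raise NotImplementedError

  def transpose(self, tensor, perm):
    raise NotImplementedError

  def svd(self, tensor):
    raise NotImplementedError


class NumpyBackend(Backend):
  """NumPy implementation; a TensorflowBackend/JaxBackend would mirror this."""

  def tensordot(self, a, b, axes):
    return np.tensordot(a, b, axes)

  def reshape(self, tensor, shape):
    return np.reshape(tensor, shape)

  def transpose(self, tensor, perm):
    return np.transpose(tensor, perm)

  def svd(self, tensor):
    return np.linalg.svd(tensor, full_matrices=False)
```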
A user could set the backend in one of two ways: either with `tensornetwork.set_backend("tensorflow")` for a global default, or with `tensornetwork.TensorNetwork(backend="tensorflow")` for network-specific backends (which would be very useful for benchmarking!).
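Usage would then look roughly like this (again, just a sketch of the proposed API; none of this exists yet):

```python
import tensornetwork

# Global default backend for all subsequently created networks:
tensornetwork.set_backend("tensorflow")

# Or a per-network backend, handy for apples-to-apples benchmarks:
net_tf = tensornetwork.TensorNetwork(backend="tensorflow")
net_jax = tensornetwork.TensorNetwork(backend="jax")
```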
Inside of `TensorNetwork`, we would just do `self.backend.tensordot` instead of `tf.tensordot`.
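Internally the change would be mostly mechanical; sketching it with a hypothetical `backend_factory` helper (building on the `NumpyBackend` sketch above, with only NumPy registered for brevity):

```python
def backend_factory(name):
  # Hypothetical registry; the real one would also map "tensorflow",
  # "jax", and "pytorch" to their Backend subclasses.
  return {"numpy": NumpyBackend()}[name]


class TensorNetwork:
  def __init__(self, backend="numpy"):
    self.backend = backend_factory(backend)

  def _contract(self, tensor1, tensor2, axes):
    # Was: tf.tensordot(tensor1, tensor2, axes)
    return self.backend.tensordot(tensor1, tensor2, axes)
```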
Overall, not a lot of work for possibly increasing our user base substantially.
I think this is a fine idea --- we're not necessarily wedded to TF.
How do we handle requirements/installation? If possible, I'd prefer having a default installation that's the same as what currently exists, but with options for more sophisticated users who want to play around with different backends. Total beginners shouldn't have to make any choices.
I agree; let's keep the default install as easy as possible for new users. It should just work out of the box. I think `pip install tensornetwork` should include the tensorflow and numpy dependencies, with a separate `pip install tensornetwork-api` that doesn't have any direct dependencies.
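One possible way to wire that up with setuptools (whether via extras on the main package or the separate `tensornetwork-api` package is an open choice; the names below are purely illustrative):

```python
# setup.py sketch
from setuptools import setup, find_packages

setup(
    name="tensornetwork",
    packages=find_packages(),
    # Beginner-friendly default that works out of the box:
    install_requires=["numpy", "tensorflow"],
    # Optional extras for users who want to experiment with other backends:
    extras_require={
        "jax": ["jax", "jaxlib"],
        "torch": ["torch"],
    },
)
```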