I/O operations similar to Numpy #354

Will Nx support I/O operations similar to those found in Numpy's npyio? Functions like np.save, np.load, and np.savez. If so, will these functions be packaged into their own module or called directly from Nx?

Comments
We can discuss it; I am interested in the use cases. :)
@josevalim It is common in the ML community to save numpy arrays to disk so they can be used at a later time. And from my research, the functionality fits several common use cases.
In essence, those use cases capture the main idea: saving tensors to a file so they can be loaded quickly and reused later. It would also be interesting to see how saved tensors could be used in a distributed system, perhaps saving the file to one or multiple nodes, or loading very large tensors saved in a compressed binary format on various nodes to be processed.
It is common to save to .npy files.
Right! For NNs, though, they have their own formats, so I am wondering if most tools will have their own formats on top, meaning ours won't be used much. :)
Numpy has its own .npy binary format. We could add support for reading .npy files. This might be interesting for Python developers who have workflows that already use .npy files. This would essentially be like adding a reader for any other external format. But yeah, most tools have their own specialized binary format to save large arrays.
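For context on what reading .npy would involve: the format is publicly specified as a 6-byte magic string, a version, a little-endian header length, and an ASCII header literal describing dtype, byte order, and shape, followed by the raw array data. A minimal parsing sketch, where the module and function names are hypothetical and not an Nx API:

```elixir
defmodule NpyHeader do
  # Matches the fixed preamble of a version-1.0 .npy file:
  # magic bytes <<0x93, "NUMPY">>, major/minor version, then a
  # 2-byte little-endian header length and the header itself.
  def parse(<<0x93, "NUMPY", 1, _minor, len::little-unsigned-size(16),
              header::binary-size(len), data::binary>>) do
    # `header` is a Python-literal dict such as
    # "{'descr': '<f8', 'fortran_order': False, 'shape': (2, 3), }"
    {:ok, header, data}
  end

  def parse(_other), do: {:error, :unsupported_format}
end
```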
Hello wizards. I created the original Npy module to take tensors processed by Python's TensorFlow and use them in the tensorflow-lite module on Nerves. The initial specification was to load/save my own %Npy{} into a .npy file. If you are interested, please visit my GitHub:
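From that description, the module's surface is a load/save pair around an %Npy{} struct. A hedged sketch of what calling it could look like; the function names are assumptions, not taken from the actual repository:

```elixir
# Hypothetical usage; the real API in shoz-f's Npy module may differ.
{:ok, %Npy{} = npy} = Npy.load("weights.npy")
:ok = Npy.save("weights_copy.npy", npy)
```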
@shoz-f Nice! I will take a look at your implementation. It would be nice for this to be part of Nx itself, in my opinion. Tell me, from your experience, how common is this feature in an ML researcher's or practitioner's workflow?
Are there any temporary workarounds for this? I don't intend to use the model elsewhere, just within Axon. But I'm having trouble thinking of an approach that allows me to reuse a trained model later on, or in a different environment/application instance. For example, training locally and using the model in a production environment, or simply applying the model in a different instance.
@imsekun if production and dev have the same endianness, then it is a matter of calling :erlang.term_to_binary to serialize the tensor and writing the result to a file. Then File.read! and :erlang.binary_to_term to read it back.
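A minimal sketch of that workaround; the file name is a placeholder, and it assumes the tensor's backend struct is safe to pass through term_to_binary:

```elixir
# Serialize the tensor with Erlang's External Term Format and write it out.
tensor = Nx.tensor([[1.0, 2.0], [3.0, 4.0]])
File.write!("trained_params.bin", :erlang.term_to_binary(tensor))

# Later, possibly in another environment with the same endianness,
# read the file and deserialize it back into a tensor.
restored =
  "trained_params.bin"
  |> File.read!()
  |> :erlang.binary_to_term()
```

Note that :erlang.binary_to_term should only be called on trusted data, since decoding arbitrary terms from untrusted sources is unsafe.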
@josevalim Oh wow TIL that existed. Awesome. Thanks so much! |
We currently have from_numpy and from_numpy_archive, which are Numpy-specific. I will open up a new discussion about file storage for Nx.
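For reference, a sketch of how those loaders would be used, assuming they take the raw file contents as a binary and that the archive variant returns name/tensor pairs; the exact signatures may differ across Nx versions:

```elixir
# Assumed usage of the Numpy-specific loaders ("data.npy" and
# "arrays.npz" are placeholder paths).
tensor = "data.npy" |> File.read!() |> Nx.from_numpy()
tensors = "arrays.npz" |> File.read!() |> Nx.from_numpy_archive()
```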