Bug: Memory issues using `map` with LazySignal #2045
Comments
Probably it'll be a good idea to add the option not to stack the
I've done some more tests and the behaviour is exactly the same. Regarding the computation time, there is no issue at all. The large size of the results causes trouble with all the following computations, i.e. transpose, save or plot.
You were right:

```python
>>> mean.original_metadata = hyperspy.misc.utils.DictionaryTreeBrowser()
>>> mean.save(dir_path + '_mean', overwrite=True)
>>> os.path.getsize(dir_path + '_mean.hspy') / 2**10  # in KB
27.4404296875
```

Just 27 KB, which is totally reasonable.
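A plain-Python sketch (not HyperSpy internals, and the dict contents are made up for illustration) of why replacing `original_metadata` with an empty container shrinks the saved result so dramatically: if every navigation position drags along its own copy of bulky metadata, the serialized size is dominated by the metadata rather than by the result values.

```python
# Hypothetical illustration: serialized size when each of the 2635 result
# values (the navigation size from this issue) carries its own metadata
# blob, versus the bare results alone.
import pickle

n_positions = 2635
results = [float(i) for i in range(n_positions)]

# a distinct ~9 KB metadata blob per position (stand-in for original_metadata)
per_position = [(v, {"log": f"{i:06d}" * 1500}) for i, v in enumerate(results)]

with_meta = len(pickle.dumps(per_position))
without_meta = len(pickle.dumps(results))

print(f"{with_meta / 2**20:.1f} MB with per-position metadata")
print(f"{without_meta / 2**10:.1f} KB without")
```

The bare results serialize to a few tens of kilobytes, while the metadata-laden version is three orders of magnitude larger, which matches the drop from gigabytes to ~27 KB seen above.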
Thanks for the feedback. We must definitely add the option of not storing
When adding that option, we should include a similar fix for
Fixed in #2691.
I have tried to use `map()`, applying a self-made function to a large dataset which was loaded lazily. In this example, for simplicity, I've replaced the custom function with `np.mean()`:

I was puzzled by the `ragged` argument, so I've tried both `ragged=True` and `ragged=False`; it doesn't play any role here... Next I tried plotting with `mean.plot()`, and it already took much longer than expected (though I haven't timed it properly). Next:

2 GB for a one-dimensional data set with 2635 `float32` values is certainly too much... Also the performance suggests that it really stores 2 GB for the `mean` in memory.

Any ideas?
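For scale, a back-of-the-envelope check (plain Python, no HyperSpy needed) of what 2635 `float32` values should actually occupy, compared with the ~2 GB observed:

```python
# Sanity check: expected memory for the result of map() in this report.
n_values = 2635          # navigation size reported above
bytes_per_float32 = 4    # a float32 is 4 bytes

expected = n_values * bytes_per_float32
observed = 2 * 1024**3   # the ~2 GB reported

print(f"expected: {expected / 1024:.1f} KB")   # → expected: 10.3 KB
print(f"blow-up: {observed / expected:,.0f}x")
```

So the result itself should be on the order of 10 KB; the 2 GB figure means something else (presumably duplicated metadata or the un-reduced source data) is being stored alongside it.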