Check speed of reading large number of events #75
I have a 40-min dataset that contains around 10,000 events, and I notice that MATLAB takes a long time to save and load this big structure. It does not seem to be due to the data size but to an issue with the MATLAB structure itself; I had that feeling when I tried to save the structure separately. Saving such a big structure takes a few minutes, so if you have 20 datasets, it takes more than an hour just to load the data.
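As a rough illustration (this code is not from the issue), the pattern can be reproduced with a synthetic event list: build a struct array with ~10,000 entries and time the save and load with tic/toc. The field names type/latency/duration just mimic a typical EEG event structure.

```matlab
% Minimal sketch: time save/load of a struct array with ~10,000 events.
% Field names and values are made up for illustration.
nEvents = 10000;
types   = arrayfun(@(k) sprintf('event%d', k), 1:nEvents, 'UniformOutput', false);
event   = struct('type',     types, ...
                 'latency',  num2cell(rand(1, nEvents) * 1e6), ...
                 'duration', num2cell(zeros(1, nEvents)));

tic; save('event_speed_test.mat', 'event'); fprintf('save took %.2f s\n', toc);
tic; load('event_speed_test.mat');          fprintf('load took %.2f s\n', toc);
```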
Here you can find some datasets with a few thousand events:
/data/projects/Greg/audioTraining/p05_backprojected/
And here you can find some datasets with a few tens of thousands of events:
/data/mobi/FunctionalVision/ARVO2015Paper/p07_Ozgurs3rdFunction/
Any one of the files will do. Please load and save it while measuring the time; it should take a few minutes.
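A minimal timing sketch along those lines, assuming the files in the folder above are ordinary MAT-files (the exact file names, extensions, and the variables they contain are not given in the issue, so the load is generic):

```matlab
% Pick any one file from the folder named above and time load/save.
srcDir = '/data/projects/Greg/audioTraining/p05_backprojected/';
files  = dir(fullfile(srcDir, '*.mat'));        % extension is an assumption
f      = fullfile(srcDir, files(1).name);       % any one of the files will do

tic; S = load(f);                                 tLoad = toc;  % time the load
tic; save('speed_test_copy.mat', '-struct', 'S'); tSave = toc;  % time the save
fprintf('load: %.1f s   save: %.1f s\n', tLoad, tSave);
```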
Neither of those file locations exists anymore. I can time this if you let me know where else we have similar data.

Ask Makoto.

The functional vision dataset has been moved to \data\common\mobi\Experiments\FunctionalVision\p10_Ozgurs3rdFunction