
Data #1

Closed
dd1github opened this issue Feb 20, 2022 · 1 comment

Comments

@dd1github

Hi,

Could you please provide a link to the sample data? I tried to generate the data with the stimuli ipynb but got the following error message.

tensor_train_ims = torch.Tensor(train_ims)/255 # transform to torch tensor

RuntimeError: [enforce fail at ..\c10\core\CPUAllocator.cpp:79] data. DefaultCPUAllocator: not enough memory: you tried to allocate 15552000000 bytes.

The other ipynb apparently does not run without the sample data.

Thanks!

@Hosseinadeli
Owner

Hi, that error is due to the dataset being too large for your system. Changing the flag self.trials_per_im to a smaller value should solve the issue. I changed the default value in the notebook (from 50 to 10 for the MultiMNIST dataset). Please let me know if this solves the issue. Thanks.
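The suggested fix can be sanity-checked with quick arithmetic (a sketch, not from the repository; it assumes the allocation size quoted in the traceback above and torch.Tensor's default float32 dtype of 4 bytes per element):

```python
# Back-of-envelope check of the failed allocation.
# 15,552,000,000 bytes is the figure from the RuntimeError above;
# torch.Tensor produces float32 by default, i.e. 4 bytes per element.
bytes_requested = 15_552_000_000
elements = bytes_requested // 4            # number of float32 values
print(f"elements: {elements:,}")           # elements: 3,888,000,000

# Cutting self.trials_per_im from 50 to 10 shrinks the dataset,
# and hence the allocation, by a factor of 5.
reduced_bytes = bytes_requested * 10 // 50
print(f"reduced: {reduced_bytes / 1e9:.2f} GB")  # reduced: 3.11 GB
```

So the smaller default should bring the tensor from roughly 15.5 GB down to about 3.1 GB, which fits on most machines.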
