
Mini-batch train #9

Closed
thgngu opened this issue Aug 2, 2018 · 3 comments
Labels
needs rework Issue partially fixed, and need a clean and definitive solution

Comments


thgngu commented Aug 2, 2018

Hi,

I would like to suggest implementing mini-batch training. Specifically, I tried to run FSGNN on my data and got the following error:

not enough memory: you tried to allocate 160465GB. Buy new RAM! at /opt/conda/conda-bld/pytorch_1524584710464/work/aten/src/TH/THGeneral.c:218

I looked into the code, but unfortunately I'm not familiar enough with GANs to modify it myself.

@diviyank
Collaborator

diviyank commented Aug 3, 2018 via email

@diviyank diviyank added the needs rework Issue partially fixed, and need a clean and definitive solution label Aug 5, 2018
@diviyank
Collaborator

I'm sorry that it took so long, but it should be done now. It turned out to be trickier than expected. Instead of feeding the raw data as a numpy array or Tensor, you can now feed torch.utils.data.Dataset types, with a custom element-loading function that you write according to your data.
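A minimal sketch of what such a Dataset might look like, assuming plain tabular data (the class name and the way it is consumed here are illustrative, not the toolbox's actual API; the point is that DataLoader yields mini-batches, so the full matrix never has to fit in one allocation):

```python
import torch
from torch.utils.data import Dataset, DataLoader


class RowDataset(Dataset):
    """Hypothetical element-wise dataset: serves one sample per index
    so training can proceed in mini-batches instead of on the full matrix."""

    def __init__(self, data):
        # Any per-element loading logic (e.g. reading rows from disk)
        # would go here; this sketch just wraps an in-memory tensor.
        self.data = torch.as_tensor(data, dtype=torch.float32)

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        return self.data[idx]


# Usage: iterate over mini-batches of 64 rows instead of the whole dataset.
dataset = RowDataset(torch.randn(1000, 5))
loader = DataLoader(dataset, batch_size=64, shuffle=True)
first_batch = next(iter(loader))  # tensor of shape (64, 5)
```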

@diviyank
Collaborator

diviyank commented Jul 8, 2019

I will be closing this issue, as it should be solved. Don't hesitate to reopen it if the bug persists in the latest version.
Best,
Diviyan

@diviyank diviyank closed this as completed Jul 8, 2019