Memory leak when using PyTorch DataLoader #746
As mentioned in the documentation, use `enumerate(tqdm(x))` instead of `tqdm(enumerate(x))`.
It works well when I use `tqdm(enumerate(x))`.
I feel puzzled about the last reply: "It works well when I use `tqdm(enumerate(x))`". According to the earlier reply ("use `enumerate(tqdm(x))` instead of `tqdm(enumerate(x))`"), I wonder whether techkang meant to say: "It works well when I use `enumerate(tqdm(x))`".
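For reference, a minimal sketch of the two orderings being discussed, assuming a plain sized iterable (the names `data` and `item` here are illustrative, not from the thread):

```python
from tqdm import tqdm

data = range(1000)  # any sized iterable

# Recommended ordering: tqdm wraps the iterable directly, so it can
# read len(data) and display a total, percentage, and ETA.
for i, item in enumerate(tqdm(data)):
    pass  # do work here

# Discouraged ordering: enumerate() hides the length from tqdm, so
# the bar has no total; it still iterates, just less informatively.
for i, item in tqdm(enumerate(data)):
    pass
```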
Also I should mention …
The memory leak seems to be related to the way tqdm wraps the data_loader. More info at: tqdm/tqdm#746
When I use tqdm with PyTorch, I found a memory leak. The code uses more and more memory until it breaks down. Commenting out the line `dummy = tqdm(total=100)` in the main function, or setting `num_workers=0`, makes it work well. I used PyTorch 1.0.1 and tqdm 4.31.1.
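The reporter's script is not reproduced above; a minimal sketch of the setup being described (a `DataLoader` with worker processes plus an extra bar created via `dummy = tqdm(total=100)` in the main function) might look like the following. The dataset, sample shape, and sizes are assumptions for illustration, not the original code:

```python
import torch
from torch.utils.data import Dataset, DataLoader
from tqdm import tqdm

class DummyDataset(Dataset):
    """Hypothetical stand-in for the reporter's dataset."""
    def __len__(self):
        return 100_000

    def __getitem__(self, idx):
        return torch.rand(3, 224, 224)  # assumed sample shape

def main():
    # num_workers > 0 spawns worker processes, which the report says
    # is one of the two conditions needed to trigger the leak.
    loader = DataLoader(DummyDataset(), batch_size=32, num_workers=4)
    # The extra bar mentioned in the report; commenting this out
    # reportedly makes the leak go away.
    dummy = tqdm(total=100)
    for epoch in range(1000):
        for batch in tqdm(loader):  # tqdm wrapping the DataLoader
            pass  # training step would go here

if __name__ == "__main__":
    main()
```

Setting `num_workers=0` in the `DataLoader` is the other workaround mentioned above.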