
Memory usage exceeds limit on Ubuntu 20.04 #4

Closed
dwh649821599 opened this issue Mar 26, 2023 · 2 comments
@dwh649821599

We found that the memory usage of the baseline differs across operating systems. It exceeds the limit (about 3400 MB) on Ubuntu 20.04, but not on Ubuntu 22.04. This seems to be caused by PyTorch.
Here is memory-profiler's line-by-line analysis of the model's forward function.

Line #    Mem usage    Increment  Occurrences   Line Contents
=============================================================
    71   2240.7 MiB   2240.7 MiB           1       @profile
    72                                             def forward(self, x):
    73   2240.7 MiB      0.0 MiB           1           bsz = x.size(0)
    74   3247.5 MiB   1006.9 MiB           1           out = relu(self.bn1(self.conv1(x.view(bsz, 3, 32, 32))))
    75   3248.5 MiB      1.0 MiB           1           out = self.layer1(out)
    76   3248.7 MiB      0.1 MiB           1           out = self.layer2(out)
    77   3248.7 MiB      0.0 MiB           1           out = self.layer3(out)
    78   3248.7 MiB      0.0 MiB           1           out = self.layer4(out)
    79   3248.7 MiB      0.0 MiB           1           out = avg_pool2d(out, 4)
    80   3248.7 MiB      0.0 MiB           1           out = out.view(out.size(0), -1)
    81   3249.2 MiB      0.6 MiB           1           out = self.linear(out)
    82   3249.2 MiB      0.0 MiB           1           return out
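
For reference, a minimal sketch of how a profile like this can be reproduced. The network below is a stand-in, not the actual baseline model; only the decorator usage and the flattened 3x32x32 input shape are taken from the trace above.

```python
# Sketch only: reproduces the kind of per-line RSS profile shown above.
# TinyNet and the batch size are assumptions; the real baseline is a
# ResNet-style model taking flattened 3x32x32 CIFAR inputs.
import torch
import torch.nn as nn
from torch.nn.functional import relu, avg_pool2d
from memory_profiler import profile


class TinyNet(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 64, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(64)
        self.linear = nn.Linear(64 * 8 * 8, num_classes)

    @profile  # memory-profiler prints per-line RSS usage and increments
    def forward(self, x):
        bsz = x.size(0)
        out = relu(self.bn1(self.conv1(x.view(bsz, 3, 32, 32))))
        out = avg_pool2d(out, 4)          # 32x32 feature map -> 8x8
        out = out.view(out.size(0), -1)
        return self.linear(out)


if __name__ == "__main__":
    model = TinyNet()
    x = torch.randn(256, 3 * 32 * 32)     # batch size is an assumption
    with torch.no_grad():
        model(x)
```

Running the same script on Ubuntu 20.04 and 22.04 with the same PyTorch wheel should show whether the roughly 1 GB jump at the first conv/bn/relu line really differs between the two OS versions.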
@HamedHemati
Collaborator

Hi @dwh649821599. Thanks for reporting the issue; this was unexpected behavior. The difference between environments should only be around 200-300 MB. The max RAM limit has now been increased to 4000 MB, which should be high enough across environments.
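
(For context, a minimal sketch of how such a RAM limit can be checked; this is an illustrative psutil-based check, not necessarily the challenge code's actual implementation, and the 4000 MB constant simply mirrors the value mentioned above.)

```python
# Sketch only: illustrative RAM-limit check, not the repository's real code.
import os
import psutil

MAX_RAM_MB = 4000  # assumed value, mirroring the new limit mentioned above

def check_ram_usage():
    # RSS of the current process in MB; memory-profiler's "Mem usage"
    # column is also RSS-based, so the numbers are directly comparable.
    rss_mb = psutil.Process(os.getpid()).memory_info().rss / (1024 ** 2)
    if rss_mb > MAX_RAM_MB:
        raise MemoryError(
            f"RAM usage {rss_mb:.1f} MB exceeds the {MAX_RAM_MB} MB limit"
        )
    return rss_mb
```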

@dwh649821599
Author

Thank you so much for your timely response.
