Python memory leak in torch.onnx.export of HF GPT2 #106976
Labels
module: memory usage
PyTorch is using more memory than it should, or it is leaking memory
module: onnx
Related to torch.onnx
triaged
This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
🐛 Describe the bug
The network's memory cannot be reclaimed by Python after `torch.onnx.export` of a HF transformers GPT2 model.

Expected behavior: `del model` and `gc.collect()` after `torch.onnx.export` should free all resources tied to the model (namely its weights). This is reproducible with larger 6.7B models as well, where `torch.onnx.export` leaks a more problematic 26 GB.

Versions
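A minimal sketch of the repro described above (not the exact script from the report; the `rss_mib` helper and the `reproduce` wrapper are illustrative, and the stock `"gpt2"` HF checkpoint is assumed):

```python
import gc
import resource  # stdlib; ru_maxrss units differ by OS (KiB on Linux)


def rss_mib() -> float:
    # Peak resident set size of this process in MiB (Linux convention: KiB).
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024


def reproduce(onnx_path: str = "gpt2.onnx") -> None:
    # Requires torch and transformers installed; run in a fresh interpreter.
    import torch
    from transformers import GPT2LMHeadModel

    model = GPT2LMHeadModel.from_pretrained("gpt2").eval()
    dummy_input = torch.randint(0, model.config.vocab_size, (1, 8))

    before = rss_mib()
    torch.onnx.export(model, (dummy_input,), onnx_path)
    del model, dummy_input
    gc.collect()
    after = rss_mib()

    # Expected: RSS returns near `before` once the model is deleted;
    # observed: the memory backing the weights is never released.
    print(f"RSS before export: {before:.0f} MiB, after del+gc: {after:.0f} MiB")
```

Comparing the two printed RSS figures in a fresh process makes the leak visible without a profiler; with the 6.7B-parameter variant the retained delta is the ~26 GB mentioned above.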