
How to release GPU memory after session.run #20517

Open
ZTurboX opened this issue Apr 30, 2024 · 2 comments


ZTurboX commented Apr 30, 2024

Describe the issue

How can I release GPU memory after calling session.run?

To reproduce

How can I release GPU memory after calling session.run?

Urgency

No response

Platform

Linux

OS Version

Ubuntu

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

1.17.1

ONNX Runtime API

Python

Architecture

X86

Execution Provider

CUDA

Execution Provider Library Version

CUDA 11.7

pranavsharma (Contributor) commented

Just let the session destruct if you don't intend to use it anymore; all GPU memory is released when the session's destructor is called. If you intend to use the session again, you can configure this run option:

static const char* const kOrtRunOptionsConfigEnableMemoryArenaShrinkage = "memory.enable_memory_arena_shrinkage";
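
For reference, a minimal sketch of the first approach (letting the session destruct) from Python, assuming a CUDA build of onnxruntime; the model path, input name, and input shape are placeholders:

```python
import gc

import numpy as np
import onnxruntime as ort

# Placeholder model path; assumes a CUDA-enabled onnxruntime build.
sess = ort.InferenceSession("model.onnx", providers=["CUDAExecutionProvider"])

# Placeholder input name and shape.
outputs = sess.run(None, {"input": np.zeros((1, 3, 224, 224), dtype=np.float32)})

# Dropping the last reference destroys the session; its destructor frees
# the GPU memory (including the CUDA memory arena) that the session owns.
del sess
gc.collect()
```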

ZTurboX (Author) commented May 10, 2024

How do I do this with the Python API?
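
A minimal sketch of the second approach, assuming the Python bindings' RunOptions.add_run_config_entry; the device string follows the format documented alongside the C constant above ("cpu:0", "gpu:0", or a semicolon-separated list), and the model path and input name are placeholders:

```python
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("model.onnx", providers=["CUDAExecutionProvider"])

# Request that the CUDA memory arena be shrunk after this particular run.
ro = ort.RunOptions()
ro.add_run_config_entry("memory.enable_memory_arena_shrinkage", "gpu:0")

outputs = sess.run(
    None,
    {"input": np.zeros((1, 3, 224, 224), dtype=np.float32)},
    run_options=ro,
)
```

As described in the run-options header, shrinkage de-allocates only the arena's non-initial chunks, so it is typically paired with the CUDA provider option arena_extend_strategy set to kSameAsRequested; treat this as a sketch to adapt rather than a definitive recipe.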
