Describe the issue
It seems that several years ago the behavior of OnnxSequence.java changed so that it no longer frees its elements' memory when it is closed. This can lead to a memory leak that will eventually eat up all of your memory and crash your container. The change appears to be intentional, made in PR #13012 by @Craigacp for performance reasons. Since OnnxValue implements AutoCloseable, it feels like the default behavior should be for OnnxSequence to close each OnnxTensor it creates. I believe this is the behavior of the other classes that implement OnnxValue as well.
I think a reasonable solution would be to have OnnxSequence close its OnnxTensors by default, while preserving the existing behavior through a new getValueUnmanaged() method; a rough caller-side sketch of that ownership split follows. I'm happy to work on a pull request, but thought I would gather some feedback first. Thanks!
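To make the proposed ownership split concrete, here is an illustrative caller-side sketch. ManagedSequence and its method names are hypothetical, not the actual OnnxSequence API or the proposed patch; it only assumes that OnnxSequence.getValue() returns the sequence elements as OnnxValue instances and that OnnxValue is AutoCloseable.

import ai.onnxruntime.OnnxSequence;
import ai.onnxruntime.OnnxValue;
import ai.onnxruntime.OrtException;

import java.util.ArrayList;
import java.util.List;

// Hypothetical wrapper illustrating the semantics described above.
public final class ManagedSequence implements AutoCloseable {
    private final OnnxSequence sequence;
    private final List<OnnxValue> managed = new ArrayList<>();

    public ManagedSequence(OnnxSequence sequence) {
        this.sequence = sequence;
    }

    // Proposed default: elements are tracked and freed when this wrapper closes.
    @SuppressWarnings("unchecked")
    public List<? extends OnnxValue> getValue() throws OrtException {
        // Cast covers the Object-typed getValue() declared on the OnnxValue interface.
        List<? extends OnnxValue> values = (List<? extends OnnxValue>) sequence.getValue();
        managed.addAll(values);
        return values;
    }

    // Proposed escape hatch, mirroring today's behavior: the caller owns the elements.
    @SuppressWarnings("unchecked")
    public List<? extends OnnxValue> getValueUnmanaged() throws OrtException {
        return (List<? extends OnnxValue>) sequence.getValue();
    }

    @Override
    public void close() {
        for (OnnxValue value : managed) {
            value.close(); // release native memory backing each tracked element
        }
        managed.clear();
        sequence.close(); // then release the sequence itself
    }
}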
To reproduce
// ortEnvironment, session, and inputName are fields initialized elsewhere.
public float[] predict(float[][] featureVectorArray) throws OrtException {
    try (OnnxTensor onnxTensor = OnnxTensor.createTensor(ortEnvironment, featureVectorArray);
         Result result = session.run(Collections.singletonMap(inputName, onnxTensor))) {
        OnnxValue onnxValue = result.get(1);
        List<OnnxMap> onnxProbabilities = (List<OnnxMap>) onnxValue.getValue();
        float[] probabilities = new float[onnxProbabilities.size()];
        for (int i = 0; i < onnxProbabilities.size(); i++) {
            // THESE VALUES ARE NOT CLOSED
            probabilities[i] = (float) onnxProbabilities.get(i).getValue().get(1L);
        }
        return probabilities;
    }
}
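Until the default behavior changes, one possible workaround (a sketch assuming the OnnxMap elements returned by getValue() own native memory under the current behavior) is to close each element explicitly once its contents have been copied out:

public float[] predict(float[][] featureVectorArray) throws OrtException {
    try (OnnxTensor onnxTensor = OnnxTensor.createTensor(ortEnvironment, featureVectorArray);
         Result result = session.run(Collections.singletonMap(inputName, onnxTensor))) {
        OnnxValue onnxValue = result.get(1);
        List<OnnxMap> onnxProbabilities = (List<OnnxMap>) onnxValue.getValue();
        float[] probabilities = new float[onnxProbabilities.size()];
        for (int i = 0; i < onnxProbabilities.size(); i++) {
            OnnxMap map = onnxProbabilities.get(i);
            probabilities[i] = (float) map.getValue().get(1L);
            map.close(); // release the native memory backing this map
        }
        return probabilities;
    }
}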
Urgency
No response
Platform
Linux
OS Version
Docker with azurelinux 2.0
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.20.0
ONNX Runtime API
Java
Architecture
X64
Execution Provider
Default CPU
Execution Provider Library Version
No response