Memory leak in NN ensemble backend #674

Closed
juhoinkinen opened this issue Feb 21, 2023 · 0 comments · Fixed by #677
juhoinkinen commented Feb 21, 2023

The Annif pod in the OpenShift environment has occasionally been killed, roughly once every two weeks. The apparent reason is that memory consumption reaches the pod's limit (30 GB).

I monitored the memory consumption (RssAnon from /proc/$PID/status) of a locally run Annif while sending suggest requests with curl to an NN ensemble project and its base projects (using fulltext documents from the JYU test set; memory consumption was recorded after every 10 documents). Only in the case of the NN ensemble was there a strong increase in memory consumption: see below for a run against the yso-fi model of Finto AI.

[Figure: memory consumption (RssAnon) during suggest requests to the yso-fi NN ensemble model, showing a strong increase over the processed documents]
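(For reference, a minimal sketch of how an RssAnon reading like the one above can be collected; this is not the actual measurement script, just the general idea.)

```python
def rss_anon_kb(pid: int) -> int:
    """Return the anonymous resident memory (RssAnon, in kB) of a process."""
    with open(f"/proc/{pid}/status") as status:
        for line in status:
            if line.startswith("RssAnon:"):
                # line looks like "RssAnon:     123456 kB"
                return int(line.split()[1])
    raise ValueError(f"RssAnon not found for pid {pid}")
```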

I confirmed that the issue could be fixed by following the advice from a relevant discussion, i.e. by using the model's __call__():

results = self._model(np.expand_dims(score_vector.transpose(), 0)).numpy()
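For context, a minimal sketch (with assumed model path and input shapes, not the actual Annif backend code) contrasting the per-request predict() pattern, which is reported to accumulate internal Keras state over many small calls, with the direct call used above:

```python
import numpy as np
import tensorflow as tf

# Assumed model file and score vector shape, for illustration only.
model = tf.keras.models.load_model("nn-ensemble.h5")
score_vector = np.random.rand(1000, 2).astype(np.float32)

# Leaky pattern: one predict() call per suggest request.
results = model.predict(np.expand_dims(score_vector.transpose(), 0))

# Fixed pattern: call the model directly and convert the output tensor to NumPy.
results = model(np.expand_dims(score_vector.transpose(), 0)).numpy()
```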

The other suggested fix, applying tf.convert_to_tensor(), did not stop the memory leak. Running gc.collect() after each prediction did stop it, but it made the predictions very slow (10 requests took ~110 s, versus only ~30 s without gc).

However, the NN ensemble could be modified to allow batch processing of documents, and for that use case the Keras documentation seems to recommend the predict() function, so I'm not sure the fix above is the best way to go. A rough sketch of that direction is below.
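(Hypothetical shapes and names, only to illustrate the batching idea: stacking the score vectors of many documents lets a single predict() call amortize its per-call overhead.)

```python
# Hypothetical batch of per-document score vectors.
score_vectors = [np.random.rand(1000, 2).astype(np.float32) for _ in range(100)]

# Stack into one array of shape (n_docs, ...) and score in a single call.
batch = np.stack([v.transpose() for v in score_vectors])
batch_results = model.predict(batch, batch_size=32)
```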
