[Bug]: Memory leak in flair models #3378
Comments
Thanks for reporting this! @helpmefindaname can you check?
Hi @ganga7445 can you please upgrade to the latest version of flair and check if the issue persists?
@helpmefindaname @alanakbik
Okay,
@helpmefindaname
Is this a leak... or a configuration issue? See https://stackoverflow.com/questions/55322434/how-to-clear-cuda-memory-in-pytorch
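For reference, the pattern that StackOverflow thread suggests boils down to dropping references and asking PyTorch to release cached blocks. A minimal sketch, assuming a standard pretrained tagger and a single sentence as stand-ins:

```python
import gc
import torch
from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load("ner")
sentence = Sentence("George Washington went to Washington.")

# Run inference without building an autograd graph, so activation
# tensors are not retained between calls.
with torch.no_grad():
    tagger.predict(sentence)

# Drop Python references first, collect garbage, then ask PyTorch to
# release cached (but unused) CUDA blocks back to the driver.
del sentence
gc.collect()
if torch.cuda.is_available():
    torch.cuda.empty_cache()
```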
@ganga7445

```python
from memory_profiler import profile
from flair.data import Sentence
from flair.models import SequenceTagger
import torch

tagger = SequenceTagger.load("ner")

@profile
def main():
    for text in ["Financial Chair, Chi Phi Fraternity I Tuscaloosa, AL Fall 2019",
                 "Fort Smith Housekeeping Opportunities - Evenings, Data Scientist"]:
        sentence = Sentence(text)
        tagger.predict(sentence)
        entities = sentence.get_spans('ner')
        for entity in entities:
            print(f"Entity: {entity.text}, Label: {entity.tag}")
        torch.cuda.empty_cache()

if __name__ == "__main__":
    main()
```
Running it, the numbers on lines 14 and 15 vary from run to run, but the increment stays negative.
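(For anyone reproducing: a script like the one above can be run either directly, since `profile` is imported explicitly, or via `python -m memory_profiler <script>.py` with a placeholder script name; memory_profiler then prints a per-line table, and its "Increment" column is presumably what the line 14 and 15 numbers above refer to.)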
@helpmefindaname
Okay, but as long as I cannot verify your claims and reproduce them, I cannot help you.
I can confirm the memory leak as well when I run the model on a pandas DataFrame (the `text` variable).
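For context, a DataFrame run like the one described presumably looks something like the sketch below; the `df` construction and the `text` column name are assumptions, and the batching plus cache clearing is included only as a point of comparison, not as the commenter's actual code:

```python
import gc
import pandas as pd
import torch
from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load("ner")
df = pd.DataFrame({"text": ["Berlin is in Germany.",
                            "George Washington went to Washington."]})

# Wrap each row in a Sentence and predict in mini-batches; predict()
# accepts a list of Sentences and a mini_batch_size argument.
sentences = [Sentence(t) for t in df["text"].tolist()]
with torch.no_grad():
    tagger.predict(sentences, mini_batch_size=32)

for sentence in sentences:
    for span in sentence.get_spans("ner"):
        print(span.text, span.tag)

# Drop references and empty the CUDA cache after the batch so the
# allocator does not hold on to blocks between runs.
del sentences
gc.collect()
if torch.cuda.is_available():
    torch.cuda.empty_cache()
```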
@danilyef
@helpmefindaname I am using `from flair.data import Sentence`, `tagger = SequenceTagger.load('ner')`, and a `@profile`-decorated `prdict_test(texts)` function, on flair 0.13.1. Does the issue depend on the flair version?
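The fragments in that comment seem to describe a profiled prediction function. A guess at what it might look like is sketched below; the `prdict_test` name and the `texts` argument come from the comment, while the body and imports are assumptions mirroring the loop shown earlier in this thread:

```python
from memory_profiler import profile
from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load('ner')

@profile
def prdict_test(texts):
    # Assumed body: tag each text and collect the recognised spans.
    results = []
    for text in texts:
        sentence = Sentence(text)
        tagger.predict(sentence)
        results.append(sentence.get_spans('ner'))
    return results

if __name__ == "__main__":
    prdict_test(["Berlin is the capital of Germany.", "Paris is in France."])
```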
@alanakbik can you please check once? |
Describe the bug
I have fine-tuned flair on my own NER dataset and am using the below code; I used XLM-R transformer embeddings.
The predict method is leaking memory. @helpmefindaname
Flair version: 0.12.2
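The reproduction snippet itself is not included in the report, but the setup described (a SequenceTagger fine-tuned with XLM-R transformer embeddings, with memory growing across repeated `predict` calls) presumably looks roughly like the sketch below; the checkpoint path and the input texts are placeholders, not the reporter's actual code:

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# Placeholder path to the fine-tuned model described in the report
# (a SequenceTagger trained with XLM-R transformer embeddings).
tagger = SequenceTagger.load("resources/taggers/my-xlmr-ner/final-model.pt")

texts = ["example input one", "example input two"]  # placeholder inputs

for text in texts:
    sentence = Sentence(text)
    # Repeated predict() calls are where the memory growth is reported.
    tagger.predict(sentence)
    for span in sentence.get_spans("ner"):
        print(span.text, span.tag)
```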
To Reproduce
Expected behavior
Ideally, there should be no memory leaks. During testing with lower versions of Flair, no memory leaks were observed.
Logs and Stack traces
No response
Screenshots
No response
Additional Context
No response
Environment
Versions:
Flair: 0.12.2
PyTorch: 2.0.1+cu117
Transformers: 4.30.2
GPU: True