gRPC Client leaks memory for messages and streams #124
Comments
Turns out I haven't written a LV memory leak test in a while, and I forgot to disconnect a Build Array on the While loop. When I remove it, the above test doesn't leak. Closing this issue due to user error. Sorry about that!
After further testing, the approach I was using in the VI to monitor memory wasn't catching the leak. If you look at Task Manager, you will see the LabVIEW process memory growing continuously, by more than 1 MB every 10 seconds, when sending unary messages. Using streaming data doesn't have this problem.
I did some more digging into the memory leak for grpc. When I changed the cluster to be two strings instead of a string and byte array, it no longer leaked in 32-bit LV (it still leaked in 64-bit LV, but I haven't tried the 64-bit leak fix yet). Our desired cluster is a string and byte array, which leaks in both 32- and 64-bit LV. I think the leak may be in ClusterDataCopier::CopyToCluster. Here's a simple example to test the leak. Run the server test VI first and then the client.
I tried with the latest 64-bit leak fix, and it still leaks on 64-bit LV with either a cluster of two strings or a cluster with a string and byte array.
The issue has been fixed in #132 and has been picked up by the 0.4.9.1 release. |
We have memory leaks with both unary messages and streaming data:
- The stream leak is proportional to data size: for small strings I saw 50 bytes/iteration, and with a string size of 1 KB the leak went up to 800 bytes/iteration.
- Unary messages were always leaking about 10 bytes/message, regardless of the message size.
I modified this example and called it from two different LV processes to verify that the leak only happens on the client side, for both messages and streams. I also tried with 32- and 64-bit LV 2021, and both have the problem.
I'm attaching a simple example to illustrate the leak: leak test.zip