High memory working set in azure function #1339
Original issue is here: Azure/azure-functions-durable-extension#886

Fact 1:
Fact 2:
Fact 3:

Last update from Azure support: When I load the memory dump, I just see a lot of Regex in the memory.
@andrew-vdb thanks for filing a separate issue for this. I would like to assist support with the investigation. Do you have a support case number you can share here?
@paulbatum Thank you! I hope we will find the cause of this memory spike. 119080522001676, 119072422000746. Private bytes is what I expect from my application, assuming that "m" means "MB"?
I have provided some assistance to support on this case, but in the interest of sharing information for anyone else who comes across this issue and is curious:
Chart from "log analytics" (24 hours)
Chart from "log analytics" (48 hours)
I already optimized the function that may have caused the spike (chunked upload to MS Graph). The next thing I will do is optimize the durable function replay.
@andrew-vdb Thanks for sharing those screenshots. 110 MB for RegexInterpreter definitely seems suspicious. What's even more suspicious is that your last screenshot does not show any of your code, or Functions runtime code, referencing the RegexInterpreter. In contrast, it should look more like this:
Here's something you could try: select the large RegexInterpreter and then go to the referenced objects view. See if the large amount of memory usage can be attributed to a particular type. Maybe there is a really large string? My screenshot above shows a 90K string. If you see a large string, you can inspect it. This might give you a clue about what is creating this memory usage.
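The thread never names the exact culprit, but RegexInterpreter growth like this is commonly caused by constructing a new regex per request (often with request data interpolated into the pattern) instead of reusing one compiled instance, so each distinct pattern leaves its own compiled state behind. A minimal sketch of the two styles, in Python rather than .NET and with hypothetical function names, purely for illustration:

```python
import re

# Anti-pattern: interpolating per-request data into a fresh pattern on every
# call. Every distinct pattern is compiled and cached separately, so compiled
# regex state accumulates with the number of distinct inputs.
def contains_word_slow(text: str, word: str) -> bool:
    return re.search(r"\b" + re.escape(word) + r"\b", text) is not None

# Preferred: compile one fixed pattern at module load and reuse it everywhere.
WORD_RE = re.compile(r"\b(\w+)\b")

def words(text: str) -> list[str]:
    return WORD_RE.findall(text)

print(contains_word_slow("high memory working set", "memory"))  # True
print(words("private bytes"))  # ['private', 'bytes']
```

In .NET the analogous fix is to hold a single static `Regex` instance rather than calling the static `Regex` methods with dynamically built patterns.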
I think I found the culprit. I will wait until my tester confirms it to be sure. Thanks for your help @paulbatum
@andrew-vdb glad to hear it!
@paulbatum you should train your support staff; they keep discussing and investigating "memory working set" instead of "private bytes".
I assume the cost calculation is based on "private bytes" rather than "memory working set"?
I ask because I have this strange graph:
![image](https://user-images.githubusercontent.com/12871379/65665540-0fd83d00-e03c-11e9-9ca0-db80c77306b8.png)
The memory working set goes high, and the first peak has a lighter workload than the second peak. It didn't add up until I checked the private bytes, which make more sense:
![image](https://user-images.githubusercontent.com/12871379/65665626-4746e980-e03c-11e9-9c34-b552d1618767.png)
Another problem is that auto-healing (when they restart our application) is based on "memory working set".
A bit off topic: how can I end up with a high memory working set in an Azure Function? What might cause this? Is it caused by App Insights or the WebJobs dashboard?
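The confusion in this thread comes down to the two counters measuring different things: private bytes is memory committed privately to the process, while working set is the portion currently resident in physical RAM (including touched shared pages, and subject to OS trimming), so the two can legitimately diverge. A minimal sketch of the distinction using an anonymous memory mapping (Python; the exact accounting is OS-specific, this only shows the mechanism):

```python
import mmap

# Reserve a 64 MiB anonymous mapping. On most OSes this immediately counts
# toward the process's committed/virtual size (the "private bytes" side),
# but the resident "working set" only grows as pages are actually touched.
SIZE = 64 * 1024 * 1024
buf = mmap.mmap(-1, SIZE)

# Touch only the first 1 MiB: roughly 1 MiB of pages become resident,
# while the full 64 MiB still counts toward committed memory.
ONE_MIB = 1024 * 1024
buf[:ONE_MIB] = b"x" * ONE_MIB

print(buf[0:1])   # b'x'
print(len(buf))   # 67108864
buf.close()
```

This is why a platform that auto-heals on working set can restart a process whose committed memory is modest, and vice versa.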