
Question: why is the MemorySize for DDBStreamsFunction 3072 ? #27

Closed
mostalive opened this issue Nov 22, 2021 · 2 comments

Comments

@mostalive

Hi, again thanks for putting this sample together, it is much appreciated.

I'm wondering why, unlike the other Lambda functions which default to 128MB, DDBStreamsFunction has MemorySize: 3072. I couldn't find an explanation in the commit history, and the function seems to use only around 24MB (which is great!) when running the tests.

@nmoutschen
Contributor

Hi @mostalive !

My original reasoning was that, since DDBStreamsFunction will process batches of up to 1000 events, I should allocate more memory to the function to also get more CPU and network resources and speed up the transformation. So this was not driven by memory usage, but by CPU/networking resources.

In reality, the function spends a lot of time waiting on the EventBridge PutEvents API anyway. I just ran a test at 128MB and at 3072MB under heavy load to compare: the function takes on average 700ms at 128MB versus 200ms at 3072MB. As this function is not latency-sensitive, it's best to optimize for cost. I'll make a change lowering the memory to 128MB.
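
For reference, the change in the SAM template would look roughly like the sketch below. Only the `DDBStreamsFunction` name, the 1000-record batch size, and the `MemorySize` values come from this thread; the handler, runtime, and table resource names are placeholders.

```yaml
# Sketch of the relevant SAM resource. Only MemorySize and BatchSize reflect
# this discussion; everything else is illustrative.
DDBStreamsFunction:
  Type: AWS::Serverless::Function
  Properties:
    Handler: app.handler              # hypothetical handler
    Runtime: python3.9                # assumption
    MemorySize: 128                   # lowered from 3072; not latency-sensitive, optimize for cost
    Timeout: 30
    Events:
      DDBStream:
        Type: DynamoDB
        Properties:
          Stream: !GetAtt Table.StreamArn   # hypothetical table resource
          StartingPosition: TRIM_HORIZON
          BatchSize: 1000                   # batches of up to 1000 records, as noted above
```

Since Lambda allocates CPU and network capacity proportionally to the configured memory, 128MB gives the least compute, which is why the average duration rises from ~200ms to ~700ms in the test above.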

@mostalive
Author

mostalive commented Nov 22, 2021

Hi @nmoutschen, thanks for the explanation and the change!
