Jordan.storms/add init duration metric and cold_start tags #274
Conversation
Can you add a test case for the new metric and tag in https://github.com/DataDog/datadog-serverless-functions/blob/master/aws/logs_monitoring/tests/test_enhanced_lambda_metrics.py?
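For reference, a minimal sketch of what such a test might look like. The parser entry point (parse_metrics_from_report_log), the metric point attributes (name, value, tags), and the cold_start:true tag format are assumptions for illustration, not confirmed from the repository:

    # Sketch only: function and attribute names below are assumptions.
    import unittest

    from enhanced_lambda_metrics import parse_metrics_from_report_log


    class TestInitDurationMetric(unittest.TestCase):
        # Example Lambda REPORT line that includes an Init Duration field
        standard_init_report = (
            "REPORT RequestId: 8edab1f8-7d34-4a8e-a965-15ccbbb78d4c\t"
            "Duration: 0.62 ms\tBilled Duration: 100 ms\t"
            "Memory Size: 128 MB\tMax Memory Used: 51 MB\t"
            "Init Duration: 234.21 ms\n"
        )

        def test_init_duration_metric_and_cold_start_tag(self):
            parsed_metrics = parse_metrics_from_report_log(self.standard_init_report)

            init_metrics = [
                m for m in parsed_metrics if m.name.endswith("init_duration")
            ]
            self.assertEqual(len(init_metrics), 1)
            # Init Duration is reported in ms; the parser converts it to seconds
            self.assertAlmostEqual(init_metrics[0].value, 0.23421)
            # A report with an Init Duration comes from a cold start, so every
            # metric parsed from it should carry the cold_start tag
            for metric in parsed_metrics:
                self.assertIn("cold_start:true", metric.tags)


    if __name__ == "__main__":
        unittest.main()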
👍 LGTM! One last thing: do you mind checking all the boxes that apply from the PR template?
    # if cold_start:
    if regex_match.group(INIT_DURATION_METRIC_NAME):
        metric_point_value = float(regex_match.group(INIT_DURATION_METRIC_NAME))
        # Multiply by 1/1000 to convert ms to seconds
        metric_point_value *= METRIC_ADJUSTMENT_FACTORS[INIT_DURATION_METRIC_NAME]

        initial_duration = DatadogMetricPoint(
            "{}.{}".format(
                ENHANCED_METRICS_NAMESPACE_PREFIX, INIT_DURATION_METRIC_NAME
            ),
            metric_point_value,
        )

        initial_duration.add_tags(tags)

        metrics.append(initial_duration)
@jcstorms1 my apologies, after the weekend I completely forgot another suggestion I made last week: merging lines 454-469 (after the change) into the for loop at line 471 (after the change), since their logic is 99% the same. I suggest adding INIT_DURATION_METRIC_NAME to METRICS_TO_PARSE_FROM_REPORT and adding a check inside the for loop, like this:
    for metric_name in METRICS_TO_PARSE_FROM_REPORT:
        if not regex_match.group(metric_name):
            continue
        metric_point_value = float(regex_match.group(metric_name))
        ....
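To illustrate the suggestion, here is a sketch of how the merged loop could look, assuming INIT_DURATION_METRIC_NAME has been added to METRICS_TO_PARSE_FROM_REPORT and that regex_match, tags, and metrics are the same locals used by the existing parsing function:

    # Sketch only: assumes INIT_DURATION_METRIC_NAME is now included in
    # METRICS_TO_PARSE_FROM_REPORT and that regex_match, tags, and metrics
    # come from the surrounding parsing function.
    for metric_name in METRICS_TO_PARSE_FROM_REPORT:
        # Init Duration only appears on cold starts, so its group may be empty
        if not regex_match.group(metric_name):
            continue
        metric_point_value = float(regex_match.group(metric_name))
        # Multiply by 1/1000 to convert ms to seconds for metrics that need it
        if metric_name in METRIC_ADJUSTMENT_FACTORS:
            metric_point_value *= METRIC_ADJUSTMENT_FACTORS[metric_name]
        dd_metric = DatadogMetricPoint(
            "{}.{}".format(ENHANCED_METRICS_NAMESPACE_PREFIX, metric_name),
            metric_point_value,
        )
        dd_metric.add_tags(tags)
        metrics.append(dd_metric)

This keeps the unit conversion and tagging in one place, and the init duration metric becomes just another entry in METRICS_TO_PARSE_FROM_REPORT.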
What does this PR do?
Adds an init duration enhanced metric for cold starts
Adds a cold_start tag for all metrics
Motivation
Testing Guidelines
Additional Notes
Types of changes
Check all that apply