Resource: Alerts #297

Merged
anovis merged 9 commits into main from resource-alerts on Jan 3, 2023

Conversation

@anovis (Collaborator) commented Dec 21, 2022

New Alert Resource (closes #278)

You can deploy alerts related to your application by using the alert method. Each alert takes a name and a list of conditions. Notification channels
can be added to the alerts.notification_channel key in config.json or explicitly in the alert. The base AlertCondition class allows you to
fully customize your alert based on the fields provided by the GCP Alert Resource (https://cloud.google.com/monitoring/api/ref_v3/rest/v3/projects.alertPolicies#condition).
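
For reference, a condition built with the base AlertCondition class maps directly onto the GCP alert policy condition schema. Below is a minimal sketch of that usage; the exact keyword arguments accepted by AlertCondition are assumptions here, while the nested conditionThreshold fields (filter, comparison, thresholdValue, duration, aggregations) come from the GCP Alert Resource itself.

from goblet import Goblet
from goblet.resources.alerts import AlertCondition

app = Goblet()

# Hypothetical sketch: the AlertCondition keyword arguments below are assumptions,
# but the nested fields follow the GCP alert policy condition schema
app.alert("custom-base", conditions=[AlertCondition(
    "too-many-executions",
    condition={
        "conditionThreshold": {
            "filter": 'resource.type="cloud_function" AND metric.type="cloudfunctions.googleapis.com/function/execution_count"',
            "comparison": "COMPARISON_GT",
            "thresholdValue": 10,
            "duration": "300s",
            "aggregations": [{"alignmentPeriod": "300s", "perSeriesAligner": "ALIGN_SUM"}],
        }
    },
)])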

If you do not need a fully customized alert, you can use the built-in classes MetricCondition, LogMatchCondition, and CustomMetricCondition. These come with
defaults for duration and aggregations, but those can be overridden as needed. The CustomMetricCondition creates a custom metric based on the filter provided and then
creates an alert using that metric.

from goblet import Goblet
from goblet.resources.alerts import MetricCondition, LogMatchCondition, CustomMetricCondition

app = Goblet()

# Example metric alert for the Cloud Function metric execution_count with a threshold of 10
app.alert("metric", conditions=[MetricCondition("test", metric="cloudfunctions.googleapis.com/function/execution_count", value=10)])

# Example log match alert that will trigger an incident on any ERROR (or higher severity) logs
app.alert("error", conditions=[LogMatchCondition("error", "severity>=ERROR")])

# Example custom metric alert that creates a custom metric for severe errors with an HTTP status code in the 500s and alerts with a threshold of 10
app.alert("custom", conditions=[CustomMetricCondition("custom", metric_filter='severity=(ERROR OR CRITICAL OR ALERT OR EMERGENCY) httpRequest.status=(500 OR 501 OR 502 OR 503 OR 504)', value=10)])

@codecov-commenter commented Jan 3, 2023

Codecov Report

Base: 88.59% // Head: 88.49% // Decreases project coverage by 0.09% ⚠️

Coverage data is based on head (78e8ecd) compared to base (c9d01a4).
Patch coverage: 85.83% of modified lines in pull request are covered.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #297      +/-   ##
==========================================
- Coverage   88.59%   88.49%   -0.10%     
==========================================
  Files          46       48       +2     
  Lines        3709     3939     +230     
==========================================
+ Hits         3286     3486     +200     
- Misses        423      453      +30     
Impacted Files | Coverage Δ
goblet/infrastructures/infrastructure.py | 72.72% <50.00%> (-8.53%) ⬇️
goblet/decorators.py | 84.95% <52.94%> (-1.29%) ⬇️
goblet/infrastructures/alerts.py | 84.02% <84.02%> (ø)
goblet/__version__.py | 100.00% <100.00%> (ø)
goblet/backends/backend.py | 84.68% <100.00%> (+0.28%) ⬆️
goblet/backends/cloudfunctionv1.py | 96.22% <100.00%> (+0.14%) ⬆️
goblet/backends/cloudfunctionv2.py | 91.07% <100.00%> (+0.33%) ⬆️
goblet/backends/cloudrun.py | 90.29% <100.00%> (+0.19%) ⬆️
goblet/client.py | 89.61% <100.00%> (+0.42%) ⬆️
goblet/tests/test_alerts.py | 100.00% <100.00%> (ø)


@anovis merged commit 9c7a614 into main on Jan 3, 2023
@anovis deleted the resource-alerts branch on July 12, 2023
quajones pushed a commit to quajones/goblet that referenced this pull request Oct 19, 2023
Development

Successfully merging this pull request may close these issues:

[Alerts] Infrastructure to create log metrics and monitoring alerts (#278)