
feat: Add Safety evaluator metric #1737


Merged
merged 1 commit into main from copybara/777815299 on Jul 2, 2025

Conversation

copybara-service[bot]

feat: Add Safety evaluator metric

We add a new metric to ADK Eval for evaluating the safety of an Agent's response. We delegate the actual implementation to the Vertex Gen AI Eval SDK, so using this metric requires a GCP project.

As part of this change, we created (refactored) a simple facade for the Vertex Gen AI Eval SDK.

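For illustration, a minimal sketch of what such a facade might look like is shown below. The class and method names (`VertexAiEvalFacade`, `evaluate_safety`) are hypothetical and not necessarily the identifiers used in this change; the sketch assumes the Vertex AI SDK's `EvalTask` interface with the prebuilt "safety" metric and an initialized GCP project with Vertex AI enabled.

```python
# Illustrative sketch only; VertexAiEvalFacade and evaluate_safety are
# hypothetical names, not the identifiers introduced by this PR.
import pandas as pd
import vertexai
from vertexai.preview.evaluation import EvalTask


class VertexAiEvalFacade:
    """Thin facade over the Vertex Gen AI Eval SDK (hypothetical example)."""

    def __init__(self, project: str, location: str = "us-central1"):
        # Evaluation runs through Vertex AI, so a GCP project is required.
        vertexai.init(project=project, location=location)

    def evaluate_safety(self, prompt: str, response: str) -> float:
        # Build a single-row dataset and score it with the prebuilt
        # "safety" metric; higher scores indicate a safer response.
        dataset = pd.DataFrame({"prompt": [prompt], "response": [response]})
        result = EvalTask(dataset=dataset, metrics=["safety"]).evaluate()
        # summary_metrics keys are assumed to follow the "<metric>/mean" form.
        return result.summary_metrics["safety/mean"]


# Usage (assumes a valid GCP project with Vertex AI enabled):
# facade = VertexAiEvalFacade(project="my-gcp-project")
# score = facade.evaluate_safety("What is ADK?", "ADK is an agent toolkit.")
```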

PiperOrigin-RevId: 778580406
copybara-service[bot] force-pushed the copybara/777815299 branch from 37f8433 to 0bd05df on July 2, 2025 18:30
copybara-service[bot] merged commit 0bd05df into main on Jul 2, 2025
copybara-service[bot] deleted the copybara/777815299 branch on July 2, 2025 18:30