feat(llm-detector): Add config for LLM Detected issues #103158
Conversation
Bug: AI Issues Misclassified as Performance
The eventOccurrenceTypeToIssueCategory function incorrectly categorizes AI detected issues (occurrence type 3501) as IssueCategory.PERFORMANCE instead of IssueCategory.AI_DETECTED. Because 3501 is >= 1000, it matches the performance condition first, so aiDetectedConfig is never applied and AI detected issues fall back to the performance configuration.
static/app/utils/issueTypeConfig/index.tsx, lines 124 to 129 at ae3104f:

```tsx
    hasTitle &&
    !!getErrorHelpResource({title: params.title!, project})
  );
}

const eventOccurrenceTypeToIssueCategory = (eventOccurrenceType: number) => {
```
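A minimal sketch of one possible fix, assuming AI detected occurrence types occupy a 3500–3999 band (only 3501 is confirmed above) and using a local stand-in for sentry's IssueCategory enum; the essential change is ordering the AI detected check before the broad >= 1000 performance check:

```tsx
// Stand-in for sentry's IssueCategory enum; values here are assumptions.
enum IssueCategory {
  ERROR = 'error',
  PERFORMANCE = 'performance',
  AI_DETECTED = 'ai_detected',
}

const eventOccurrenceTypeToIssueCategory = (eventOccurrenceType: number) => {
  // Assumed range for AI detected types; checked first so that 3501
  // (LLM_DETECTED_EXPERIMENTAL) no longer falls through to PERFORMANCE.
  if (eventOccurrenceType >= 3500 && eventOccurrenceType < 4000) {
    return IssueCategory.AI_DETECTED;
  }
  if (eventOccurrenceType >= 1000) {
    return IssueCategory.PERFORMANCE;
  }
  return IssueCategory.ERROR;
};
```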
```tsx
  [IssueType.PROFILE_FUNCTION_REGRESSION]: RegressionEvidence,
  [IssueType.QUERY_INJECTION_VULNERABILITY]: DBQueryInjectionVulnerabilityEvidence,
  [IssueType.WEB_VITALS]: WebVitalsEvidence,
  [IssueType.LLM_DETECTED_EXPERIMENTAL]: AIDetectedSpanEvidence,
```
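For context, a minimal sketch of how a map like this is typically consumed, with simplified stand-in types and a hypothetical getEvidenceComponent helper (the actual wiring in sentry differs):

```tsx
import type {ComponentType} from 'react';

// Stand-in types; sentry's real Event and evidence prop types are richer.
type EvidenceProps = {event: unknown};
type EvidenceMap = Record<string, ComponentType<EvidenceProps>>;

// Hypothetical helper: returns the registered evidence component for an
// issue type, or undefined so the caller can skip the evidence section.
function getEvidenceComponent(
  map: EvidenceMap,
  issueType: string
): ComponentType<EvidenceProps> | undefined {
  return map[issueType];
}
```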
Should we generally be consistent about "AI detected" vs "LLM detected"?
In my opinion, AI Detected is what we are calling the issue category; LLM Detected, so far, is what we are calling the type. So for now, if we stay consistent with that, I think we are fine (changing these later isn't too difficult).
This PR adds a new config for the AI Detected issue category, which allows more control over what we show on the issue details page.
Fixes https://linear.app/getsentry/issue/ID-1084/add-new-issue-type-config-for-new-issue-category
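To make the shape of the change concrete, here is a hypothetical sketch of a category-level config; the flag names and the resolver below are illustrative assumptions, not the actual fields added in this PR:

```tsx
// Stand-in for sentry's IssueCategory enum; values are assumptions.
enum IssueCategory {
  ERROR = 'error',
  PERFORMANCE = 'performance',
  AI_DETECTED = 'ai_detected',
}

// Hypothetical capability flags controlling issue details sections.
type IssueDetailsConfig = {
  showSpanEvidence: boolean;
  showStacktrace: boolean;
};

// AI detected issues register span evidence (see the diff above), so this
// sketch surfaces span evidence instead of a stack trace.
const aiDetectedConfig: IssueDetailsConfig = {
  showSpanEvidence: true,
  showStacktrace: false,
};

const defaultConfig: IssueDetailsConfig = {
  showSpanEvidence: false,
  showStacktrace: true,
};

function getConfigForCategory(category: IssueCategory): IssueDetailsConfig {
  return category === IssueCategory.AI_DETECTED ? aiDetectedConfig : defaultConfig;
}
```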