Replays Self-Serve Bulk Delete System #77
base: tenki/base-5
`src/sentry/integrations/source_code_management/commit_context.py`

```diff
@@ -139,6 +139,10 @@ class PullRequestFile:
     patch: str


+ISSUE_TITLE_MAX_LENGTH = 50
+MERGED_PR_SINGLE_ISSUE_TEMPLATE = "* ‼️ [**{title}**]({url}){environment}\n"
```
🔴 Double newline between issue list items due to trailing `\n` in `MERGED_PR_SINGLE_ISSUE_TEMPLATE` (bug)

`MERGED_PR_SINGLE_ISSUE_TEMPLATE = "* ‼️ [**{title}**]({url}){environment}\n"` ends with a newline, and the caller joins the formatted items with another newline:

```python
issue_list = "\n".join(
    [self.get_merged_pr_single_issue_template(...) for issue in issues]
)
```

This produces `item1\n\nitem2\n\nitem3`, i.e. a blank line between every pair of issues.

💡 Suggestion: Remove the trailing `\n` from the template.

Suggested change:

```diff
-MERGED_PR_SINGLE_ISSUE_TEMPLATE = "* ‼️ [**{title}**]({url}){environment}\n"
+MERGED_PR_SINGLE_ISSUE_TEMPLATE = "* ‼️ [**{title}**]({url}){environment}"
```

📋 Prompt for AI Agents: In src/sentry/integrations/source_code_management/commit_context.py at line 143, remove the trailing `\n` from `MERGED_PR_SINGLE_ISSUE_TEMPLATE`.
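To make the failure mode concrete, here is a minimal standalone sketch (made-up issue data, not the real Sentry types) showing how a template ending in `\n` joined with `"\n"` yields blank lines, and how stripping the trailing newline fixes it:

```python
# Minimal sketch of the bug with hypothetical (title, url, environment) tuples.
TEMPLATE = "* [**{title}**]({url}){environment}\n"  # trailing newline, as in the diff

items = [("First issue", "https://example.com/1", ""),
         ("Second issue", "https://example.com/2", " in `prod`")]

# Each formatted item already ends in "\n", so the join separator adds a second one.
buggy = "\n".join(TEMPLATE.format(title=t, url=u, environment=e) for t, u, e in items)

# The suggested fix: drop the trailing newline from the template.
fixed_template = TEMPLATE.rstrip("\n")
fixed = "\n".join(fixed_template.format(title=t, url=u, environment=e) for t, u, e in items)

print("\n\n" in buggy)   # True: blank line between items
print("\n\n" in fixed)   # False
```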
```diff
 class CommitContextIntegration(ABC):
     """
     Base class for integrations that include commit context features: suspect commits, suspect PR comments
@@ -570,6 +574,37 @@ def get_top_5_issues_by_count(
         )
         return raw_snql_query(request, referrer=self.referrer.value)["data"]
```
```diff
+    @staticmethod
+    def _truncate_title(title: str, max_length: int = ISSUE_TITLE_MAX_LENGTH) -> str:
+        """Truncate title if it's too long and add ellipsis."""
+        if len(title) <= max_length:
+            return title
+        return title[:max_length].rstrip() + "..."
+
+    def get_environment_info(self, issue: Group) -> str:
+        try:
+            recommended_event = issue.get_recommended_event()
+            if recommended_event:
+                environment = recommended_event.get_environment()
+                if environment and environment.name:
+                    return f" in `{environment.name}`"
+        except Exception as e:
+            # If anything goes wrong, just continue without environment info
+            logger.info(
+                "get_environment_info.no-environment",
+                extra={"issue_id": issue.id, "error": e},
+            )
+        return ""
```
Comment on lines +584 to +597
🟠 N+1 Snuba and DB queries: `get_environment_info` called per issue in loop (performance)

With up to 5 issues passed in (from …), each call to `get_environment_info` performs a Snuba query and a DB query. Relevant code in `get_environment_info`:

```python
def get_environment_info(self, issue: Group) -> str:
    try:
        recommended_event = issue.get_recommended_event()  # Snuba query
        if recommended_event:
            environment = recommended_event.get_environment()  # DB query
            ...
```

And the caller:

```python
issue_list = "\n".join([
    self.get_merged_pr_single_issue_template(
        ...,
        environment=self.get_environment_info(issue),  # called per-issue
    )
    for issue in issues
])
```

💡 Suggestion: Consider batching the environment lookups before the list comprehension. The recommended-event Snuba query could be replaced by reading environment data from the existing …

📋 Prompt for AI Agents: In src/sentry/integrations/source_code_management/commit_context.py, the …
```diff
+    @staticmethod
+    def get_merged_pr_single_issue_template(title: str, url: str, environment: str) -> str:
+        truncated_title = PRCommentWorkflow._truncate_title(title)
+        return MERGED_PR_SINGLE_ISSUE_TEMPLATE.format(
+            title=truncated_title,
+            url=url,
+            environment=environment,
+        )
```
Comment on lines +599 to +606
🟡 `@staticmethod get_merged_pr_single_issue_template` hardcodes `PRCommentWorkflow._truncate_title`, bypassing subclass overrides (bug)

In the new static method:

```python
@staticmethod
def get_merged_pr_single_issue_template(title: str, url: str, environment: str) -> str:
    truncated_title = PRCommentWorkflow._truncate_title(title)  # hardcoded class ref
    return MERGED_PR_SINGLE_ISSUE_TEMPLATE.format(...)
```

Since `_truncate_title` is looked up on `PRCommentWorkflow` directly, any subclass that overrides it is silently ignored.

💡 Suggestion: Convert the method to a `@classmethod` and call `cls._truncate_title(title)` so subclass overrides are honored.

📋 Prompt for AI Agents: In src/sentry/integrations/source_code_management/commit_context.py at lines 599-606, …
```diff
+
+
 class OpenPRCommentWorkflow(ABC):
     def __init__(self, integration: CommitContextIntegration):
```
`src/sentry/hybridcloud/tasks/deliver_webhooks.py`

🔴 `drain_mailbox_parallel` `processing_deadline_duration` (120s) is shorter than its internal loop timeout (180s) (bug)

`drain_mailbox_parallel` uses `BATCH_SCHEDULE_OFFSET = timedelta(minutes=BACKOFF_INTERVAL) = timedelta(minutes=3)` = 180 seconds as its internal `deadline`. The new `processing_deadline_duration=120` tells the taskworker to consider the task failed after only 120 seconds. Because 120 < 180, the taskworker will terminate the task while the task's own loop still believes it has 60 more seconds to deliver webhooks. This can cause partial mailbox drains to be incorrectly treated as failures, potentially causing duplicate retries or message-ordering problems.

By contrast, `drain_mailbox` (the non-parallel variant) correctly sets `processing_deadline_duration=300` > 180s, so only the parallel variant is broken.

💡 Suggestion: Set `processing_deadline_duration` for `drain_mailbox_parallel` to at least 180 seconds (matching BATCH_SCHEDULE_OFFSET) plus headroom. Using 300 (matching `drain_mailbox`) is appropriate. Alternatively, define a named constant derived from BACKOFF_INTERVAL to keep the two values in sync automatically.

📋 Prompt for AI Agents: In src/sentry/hybridcloud/tasks/deliver_webhooks.py at line 239, the TaskworkerConfig for drain_mailbox_parallel sets `processing_deadline_duration=120`, but the function's internal while-loop runs for up to BATCH_SCHEDULE_OFFSET (timedelta(minutes=3) = 180 seconds). Since 120 < 180, the taskworker kills the task before the internal deadline fires. Change `processing_deadline_duration=120` to `processing_deadline_duration=300` (matching the drain_mailbox task) to ensure the taskworker deadline is always longer than the internal loop timeout.