
future tasks should not cause anything to happen if the instance is suspended #952

Merged
burnettk merged 8 commits into main from feature/future-tasks-when-suspended on Feb 6, 2024

Conversation

@burnettk (Contributor) commented Feb 2, 2024

  • Add a column for archived future tasks, for use when instances are suspended.
  • Set archived back to false when the instance resumes.
  • Add some tests for related background processing service functionality.
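A minimal sketch of the new column, assuming plain SQLAlchemy; the real model lives in spiffworkflow-backend's future_task.py, and only the archived_for_process_instance_status name comes from this PR — the other columns are illustrative assumptions:

```python
from sqlalchemy import Boolean, Column, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class FutureTaskModel(Base):
    __tablename__ = "future_task"
    guid = Column(String(36), primary_key=True)          # assumed primary key
    run_at_in_seconds = Column(Integer, nullable=False)  # assumed scheduling column
    # New in this PR: True while the owning process instance is suspended,
    # reset to False when the instance resumes.
    archived_for_process_instance_status = Column(Boolean, default=False, nullable=False)
```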

Summary by CodeRabbit

  • New Features
    • Added a field to track archived status for future tasks in the FutureTaskModel class.
    • Enhanced task origin tracking by introducing a new identifier in the args_to_celery dictionary.
    • Updated the resume method in the ProcessInstanceProcessor to handle archived future tasks.
  • Tests
    • Introduced test cases for the BackgroundProcessingService class to cover future task processing scenarios.
  • Chores
    • Revised the auto-merge workflow to require PR approval before enabling auto-merge.

@coderabbitai bot (Contributor) commented Feb 2, 2024

Warning

Rate Limit Exceeded

@jasquat has exceeded the limit for the number of commits or files that can be reviewed per hour. Please wait 28 minutes and 7 seconds before requesting another review.

How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.

We recommend that you space out your commits to avoid hitting the rate limit.

How do rate limits work?

CodeRabbit enforces hourly rate limits for each developer per organization.
Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout.
Please see our FAQ for further information.

Commits: files that changed from the base of the PR, between af1eb6f and 5bf7a9e.

Walkthrough

The recent updates to the SpiffWorkflow backend refactor task-processing methods, introduce new task-handling capabilities, improve task origin tracking, and queue or archive future tasks based on process instance status, so that suspended instances no longer trigger background work.

Changes

| Files | Change Summary |
| --- | --- |
| .../background_processing/.../background_processing_service.py and .../models/process_instance.py | Refactored process_future_tasks into a do_process_future_tasks method; introduced an imminent_future_tasks method; enhanced task queuing and archiving logic; added future-task querying and run-permission methods; clarified the bpmn_process_definition_id declaration. |
| .../background_processing/.../celery_tasks/process_instance_task_producer.py | Added producer_identifier to args_to_celery for task origin tracking. |
| .../models/future_task.py | Added the archived_for_process_instance_status boolean field. |
| .../services/.../process_instance_processor.py | Imported FutureTaskModel; updated methods to manage archived future tasks. |
| .../services/process_instance_service.py | Imported TaskModel. |
| .../tests/.../test_background_processing_service.py | Added test cases for the BackgroundProcessingService class. |
| .github/workflows/auto-merge-dependabot-prs.yml | Changed the merge process to a two-step approval and auto-merge setup. |


@coderabbitai bot (Contributor) left a comment

Review Status

Actionable comments generated: 2

Configuration used: .coderabbit.yaml

Commits: files that changed from the base of the PR, between 05b50df and c291aea.
Files ignored due to path filters (1)
  • spiffworkflow-backend/migrations/versions/acf20342181e_.py is excluded by: !spiffworkflow-backend/migrations/**
Files selected for processing (7)
  • spiffworkflow-backend/src/spiffworkflow_backend/background_processing/background_processing_service.py (2 hunks)
  • spiffworkflow-backend/src/spiffworkflow_backend/background_processing/celery_tasks/process_instance_task_producer.py (1 hunks)
  • spiffworkflow-backend/src/spiffworkflow_backend/models/future_task.py (1 hunks)
  • spiffworkflow-backend/src/spiffworkflow_backend/models/process_instance.py (4 hunks)
  • spiffworkflow-backend/src/spiffworkflow_backend/services/process_instance_processor.py (3 hunks)
  • spiffworkflow-backend/src/spiffworkflow_backend/services/process_instance_service.py (1 hunks)
  • spiffworkflow-backend/tests/spiffworkflow_backend/unit/test_background_processing_service.py (1 hunks)
Additional comments: 7
spiffworkflow-backend/src/spiffworkflow_backend/background_processing/celery_tasks/process_instance_task_producer.py (1)
  • 19-24: The addition of "producer_identifier": "future_task" to args_to_celery is a good enhancement for tracking the origin of tasks in the queue. This change should improve the system's ability to trace task execution paths and diagnose issues related to task processing.
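A hypothetical sketch of the kwargs shape described above; only the "producer_identifier": "future_task" entry is confirmed by this PR, and the other keys are illustrative assumptions:

```python
def build_args_to_celery(process_instance_id: int, task_guid: str) -> dict:
    """Build the kwargs handed to the celery producer (shape assumed for illustration)."""
    return {
        "process_instance_id": process_instance_id,  # assumed key
        "task_guid": task_guid,                      # assumed key
        # New in this PR: records that the message came from the future-task producer.
        "producer_identifier": "future_task",
    }
```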
spiffworkflow-backend/src/spiffworkflow_backend/models/future_task.py (1)
  • 20-20: The addition of the archived_for_process_instance_status boolean column to the FutureTaskModel is a significant change that aligns with the PR's objective to manage future tasks in suspended process instances effectively. This field will enable the system to archive future tasks appropriately and ensure they are only processed when their corresponding process instances are active.
spiffworkflow-backend/tests/spiffworkflow_backend/unit/test_background_processing_service.py (1)
  • 13-66: The tests added for BackgroundProcessingService cover scenarios for processing future tasks with and without processable future tasks, which is crucial for validating the new archiving mechanism and the processing logic for future tasks. These tests ensure that the system behaves as expected when future tasks are to be processed or archived based on the status of their corresponding process instances. It's good to see comprehensive test coverage for these new functionalities.
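A hypothetical, self-contained sketch of such tests; the real suite exercises BackgroundProcessingService against the project's Flask and database fixtures, which are omitted here:

```python
from dataclasses import dataclass

@dataclass
class FakeFutureTask:
    guid: str
    archived_for_process_instance_status: bool = False

def process_or_archive(task: FakeFutureTask, instance_status: str) -> str:
    # assumed gating rule: only suspended instances archive their future tasks
    if instance_status == "suspended":
        task.archived_for_process_instance_status = True
        return "archived"
    return "processed"

def test_processes_task_when_instance_is_runnable() -> None:
    task = FakeFutureTask(guid="t1")
    assert process_or_archive(task, "waiting") == "processed"
    assert not task.archived_for_process_instance_status

def test_archives_task_when_instance_is_suspended() -> None:
    task = FakeFutureTask(guid="t2")
    assert process_or_archive(task, "suspended") == "archived"
    assert task.archived_for_process_instance_status
```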
spiffworkflow-backend/src/spiffworkflow_backend/background_processing/background_processing_service.py (1)
  • 61-103: Refactoring process_future_tasks into a class method do_process_future_tasks and adding imminent_future_tasks for handling future tasks more efficiently is a positive change. This refactoring improves the modularity and readability of the code. However, ensure that the logic for archiving future tasks (future_task.archived_for_process_instance_status = True) and the subsequent database commit is thoroughly tested, especially in high-concurrency environments, to avoid any race conditions or inconsistencies.
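A minimal sketch of the selection-and-archiving flow described above, with plain Python in place of the project's SQLAlchemy queries and celery producer; the look-ahead window and status set are assumptions, while imminent_future_tasks, allowed_to_run semantics, and the archived flag come from this PR:

```python
import time
from dataclasses import dataclass

RUNNABLE_STATUSES = {"not_started", "waiting", "user_input_required"}  # assumed set

@dataclass
class FutureTask:
    guid: str
    run_at_in_seconds: int
    archived_for_process_instance_status: bool = False

def imminent_future_tasks(tasks: list, lookahead_seconds: int = 300) -> list:
    """Select unarchived tasks due within the look-ahead window (window size assumed)."""
    cutoff = time.time() + lookahead_seconds
    return [
        t for t in tasks
        if not t.archived_for_process_instance_status and t.run_at_in_seconds <= cutoff
    ]

def do_process_future_tasks(tasks: list, instance_status: str) -> None:
    for task in imminent_future_tasks(tasks):
        if instance_status in RUNNABLE_STATUSES:
            print(f"queueing {task.guid}")  # the real service enqueues a celery task here
        else:
            task.archived_for_process_instance_status = True  # skip until the instance resumes
    # the real service commits the archived flags to the database afterwards
```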
spiffworkflow-backend/src/spiffworkflow_backend/models/process_instance.py (2)
  • 125-133: The future_tasks_query method is a valuable addition to the ProcessInstanceModel, providing a straightforward way to retrieve future tasks related to a specific process instance. This method enhances the model's capabilities and aligns with the PR's objectives to manage future tasks effectively.
  • 170-172: The allowed_to_run method is a simple yet effective way to determine if a process instance can proceed based on its status. This method's addition is crucial for the new logic introduced in the PR, ensuring that tasks are processed only when appropriate.
spiffworkflow-backend/src/spiffworkflow_backend/services/process_instance_service.py (1)
  • 45-45: The addition of the TaskModel import statement is necessary for the new functionalities introduced in the PR. It's good to see that dependencies are being managed correctly, ensuring that the new code has access to the required models.

Comment on lines +1815 to +1823

```python
def bring_archived_future_tasks_back_to_life(self) -> None:
    archived_future_tasks = (
        self.process_instance_model.future_tasks_query()
        .filter(FutureTaskModel.archived_for_process_instance_status == True)  # noqa: E712
        .all()
    )
    for archived_future_task in archived_future_tasks:
        archived_future_task.archived_for_process_instance_status = False
        db.session.add(archived_future_task)
```
@coderabbitai bot (Contributor) commented Feb 2, 2024

The method bring_archived_future_tasks_back_to_life correctly updates the archived_for_process_instance_status flag for archived future tasks. However, it's important to commit the session changes within the method to ensure data consistency and handle potential exceptions gracefully.

```diff
+ db.session.commit()
```

Committable suggestion (carefully review before committing):

```python
def bring_archived_future_tasks_back_to_life(self) -> None:
    archived_future_tasks = (
        self.process_instance_model.future_tasks_query()
        .filter(FutureTaskModel.archived_for_process_instance_status == True)  # noqa: E712
        .all()
    )
    for archived_future_task in archived_future_tasks:
        archived_future_task.archived_for_process_instance_status = False
        db.session.add(archived_future_task)
    db.session.commit()
```

@burnettk (Author) replied:
we don't want this logical transaction to partially succeed. the commit at the end of the resume method is when the logical transaction is complete and when we want to commit.

@coderabbitai bot (Contributor) replied:

@burnettk, understood. It makes sense to manage the transaction commit at the end of the resume method to ensure the entire logical transaction completes successfully as a whole. Thank you for clarifying your approach to transaction management.

```python
def resume(self) -> None:
    self.process_instance_model.status = ProcessInstanceStatus.waiting.value
    db.session.add(self.process_instance_model)
    self.bring_archived_future_tasks_back_to_life()
```
@coderabbitai bot (Contributor) commented Feb 2, 2024

The resume method correctly calls bring_archived_future_tasks_back_to_life to unarchive future tasks. However, it's recommended to handle potential exceptions and ensure the database session is committed only if all operations succeed to maintain data integrity.

Consider wrapping the operations in a try-except block and committing the session only if no exceptions occur.

@burnettk (Author) replied:

since it is committing at the end of the resume method, it should be logically equivalent to what you're suggesting. it will only commit if all operations succeed.

@coderabbitai bot (Contributor) replied:

@burnettk You're absolutely right. The commit at the end of the resume method does ensure that all operations within the method must succeed for the changes to be persisted, effectively maintaining data integrity as you've described. This approach indeed makes the explicit use of a try-except block for transaction management unnecessary in this context. Thank you for pointing this out and clarifying.
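A minimal, self-contained sketch of the single-commit pattern discussed in this thread, using an in-memory SQLite database via SQLAlchemy; the model and column names here are illustrative, not the project's:

```python
from sqlalchemy import Boolean, Column, Integer, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class FutureTask(Base):
    __tablename__ = "future_task"
    id = Column(Integer, primary_key=True)
    archived = Column(Boolean, default=False)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(FutureTask(archived=True))
    session.commit()

    task = session.query(FutureTask).first()
    task.archived = False  # staged in the session, not yet persisted
    # If anything raises between here and commit(), the whole logical
    # transaction rolls back and the archived flag stays True.
    session.commit()  # persisted only when every step above succeeded
```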

@coderabbitai bot (Contributor) left a comment

Review Status

Actionable comments generated: 0

Configuration used: .coderabbit.yaml

Commits: files that changed from the base of the PR, between c291aea and e037365.
Files selected for processing (1)
  • spiffworkflow-backend/src/spiffworkflow_backend/models/process_instance.py (5 hunks)
Files skipped from review as they are similar to previous changes (1)
  • spiffworkflow-backend/src/spiffworkflow_backend/models/process_instance.py

@coderabbitai bot (Contributor) left a comment

Review Status

Actionable comments generated: 2

Configuration used: .coderabbit.yaml

Commits: files that changed from the base of the PR, between e037365 and 347ef3e.
Files selected for processing (1)
  • .github/workflows/auto-merge-dependabot-prs.yml (1 hunks)

Comment on lines 76 to 84

```yaml
# the repo requires one approval. if a dependabot change passes tests, that is good enough.
- name: approve pr
  run: gh pr review --approve "$(cat pr_number)"
  env:
    GITHUB_TOKEN: ${{secrets.GITHUB_TOKEN}}
# if the merge --auto flag were added, and if the repo allowed it at https://github.com/sartography/spiff-arena/settings,
# it would set up the pr to auto merge when all requirements were met. but we just want to merge now.
- name: set up pr to auto merge when all requirements are met
  run: gh pr merge --squash "$(cat pr_number)"
```
@coderabbitai bot (Contributor) commented:

The command unzip pr_number.zip lacks a -d option to specify the extraction directory, which could lead to unexpected behavior or file placement. Consider specifying a target directory for clarity and to ensure files are extracted to a known location.

```diff
- run: unzip pr_number.zip
+ run: unzip pr_number.zip -d ${{ github.workspace }}
```

Committable suggestion (carefully review before committing):

```yaml
# the repo requires one approval. if a dependabot change passes tests, that is good enough.
- name: approve pr
  run: gh pr review --approve "$(cat pr_number)"
  env:
    GITHUB_TOKEN: ${{secrets.GITHUB_TOKEN}}
# if the merge --auto flag were added, and if the repo allowed it at https://github.com/sartography/spiff-arena/settings,
# it would set up the pr to auto merge when all requirements were met. but we just want to merge now.
- name: set up pr to auto merge when all requirements are met
  run: gh pr merge --squash "$(cat pr_number)"
```

Comment on lines 83 to 84

```yaml
- name: set up pr to auto merge when all requirements are met
  run: gh pr merge --squash "$(cat pr_number)"
```
@coderabbitai bot (Contributor) commented:

The gh pr merge --squash "$(cat pr_number)" command is intended to set up the PR for auto-merge but actually performs a squash and merge immediately. If the intention is to enable auto-merge, the correct command should use the --auto flag instead of performing an immediate merge.

```diff
- run: gh pr merge --squash "$(cat pr_number)"
+ run: gh pr merge --auto "$(cat pr_number)"
```

Committable suggestion (carefully review before committing):

```yaml
- name: set up pr to auto merge when all requirements are met
  run: gh pr merge --auto "$(cat pr_number)"
```

@coderabbitai bot (Contributor) left a comment

Review Status

Actionable comments generated: 0

Configuration used: .coderabbit.yaml

Commits: files that changed from the base of the PR, between 347ef3e and 5d081ec.
Files selected for processing (3)
  • spiffworkflow-backend/src/spiffworkflow_backend/models/process_instance.py (6 hunks)
  • spiffworkflow-backend/src/spiffworkflow_backend/services/process_instance_processor.py (2 hunks)
  • spiffworkflow-backend/tests/spiffworkflow_backend/unit/test_background_processing_service.py (1 hunks)
Files skipped from review as they are similar to previous changes (3)
  • spiffworkflow-backend/src/spiffworkflow_backend/models/process_instance.py
  • spiffworkflow-backend/src/spiffworkflow_backend/services/process_instance_processor.py
  • spiffworkflow-backend/tests/spiffworkflow_backend/unit/test_background_processing_service.py

@coderabbitai bot (Contributor) left a comment

Review Status

Actionable comments generated: 0

Configuration used: .coderabbit.yaml

Commits: files that changed from the base of the PR, between 5d081ec and af1eb6f.
Files ignored due to path filters (1)
  • spiffworkflow-backend/migrations/versions/29b261f5edf4_.py is excluded by: !spiffworkflow-backend/migrations/**
Files selected for processing (1)
  • spiffworkflow-backend/src/spiffworkflow_backend/models/future_task.py (2 hunks)
Files skipped from review as they are similar to previous changes (1)
  • spiffworkflow-backend/src/spiffworkflow_backend/models/future_task.py

@burnettk merged commit a61f2d6 into main on Feb 6, 2024 (22 checks passed).
@burnettk deleted the feature/future-tasks-when-suspended branch on February 6, 2024 at 19:43.