Compressed data table #1569
Conversation
Actionable comments posted: 1
Review Details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Files ignored due to path filters (1)
spiffworkflow-backend/migrations/versions/ada8eab55f1a_.py is excluded by `!spiffworkflow-backend/migrations/**`
Files selected for processing (11)
- spiffworkflow-backend/src/spiffworkflow_backend/load_database_models.py (1 hunks)
- spiffworkflow-backend/src/spiffworkflow_backend/models/compressed_data.py (1 hunks)
- spiffworkflow-backend/src/spiffworkflow_backend/models/process_instance_report.py (4 hunks)
- spiffworkflow-backend/src/spiffworkflow_backend/models/task.py (2 hunks)
- spiffworkflow-backend/src/spiffworkflow_backend/models/task_draft_data.py (2 hunks)
- spiffworkflow-backend/src/spiffworkflow_backend/routes/process_api_blueprint.py (2 hunks)
- spiffworkflow-backend/src/spiffworkflow_backend/routes/process_instances_controller.py (3 hunks)
- spiffworkflow-backend/src/spiffworkflow_backend/routes/tasks_controller.py (5 hunks)
- spiffworkflow-backend/src/spiffworkflow_backend/services/process_instance_processor.py (5 hunks)
- spiffworkflow-backend/src/spiffworkflow_backend/services/task_service.py (9 hunks)
- spiffworkflow-backend/tests/spiffworkflow_backend/unit/test_process_instance_processor.py (1 hunks)
Files skipped from review due to trivial changes (1)
- spiffworkflow-backend/tests/spiffworkflow_backend/unit/test_process_instance_processor.py
Additional comments not posted (29)
spiffworkflow-backend/src/spiffworkflow_backend/models/task_draft_data.py (2)
`14-14`: Import of `CompressedDataModel` correctly replaces `JsonDataModel` to align with the new data handling strategy.

`51-51`: Updated method `get_saved_form_data` to use `CompressedDataModel` for data retrieval, aligning with the new data handling strategy.

spiffworkflow-backend/src/spiffworkflow_backend/models/process_instance_report.py (3)
`8-8`: Correctly updated import to use `CompressedDataModel`, supporting the new data handling requirements.

`93-93`: Method `get_report_metadata` correctly updated to retrieve data using `CompressedDataModel`, aligning with the new compressed data strategy.

`117-117`: Method `create_report` appropriately updated to use `CompressedDataModel` for creating and inserting compressed data, aligning with the updated data handling strategy.

spiffworkflow-backend/src/spiffworkflow_backend/models/compressed_data.py (1)
`27-93`: The `CompressedDataModel` class is well implemented, with methods for handling compressed data efficiently using gzip and SHA256. The database interaction methods are correctly adapted for different database types, ensuring compatibility.

spiffworkflow-backend/src/spiffworkflow_backend/load_database_models.py (1)
`72-72`: Import of `CompressedDataModel` correctly added to ensure it is recognized by SQLAlchemy and available throughout the application.

spiffworkflow-backend/src/spiffworkflow_backend/models/task.py (3)
`16-16`: Import of `CompressedDataModel` correctly updated to support the new data handling strategy.

`90-90`: Method `python_env_data` correctly updated to retrieve data using `CompressedDataModel`, aligning with the new compressed data strategy.

`93-93`: Method `json_data` appropriately updated to use `CompressedDataModel` for data retrieval, aligning with the updated data handling strategy.

spiffworkflow-backend/src/spiffworkflow_backend/routes/process_instances_controller.py (3)
`28-28`: Import of `CompressedDataModel` correctly updated to support the new data handling strategy.

`185-185`: Method `process_instance_list` correctly updated to use `CompressedDataModel` for creating and inserting compressed data, aligning with the updated data handling strategy.

`205-205`: Method `process_instance_report_show` correctly updated to retrieve data using `CompressedDataModel`, aligning with the new compressed data strategy.

spiffworkflow-backend/src/spiffworkflow_backend/services/task_service.py (9)
`18-19`: Imports of `CompressedDataDict` and `CompressedDataModel` have been added to replace `JsonDataModel`.

`126-126`: Initialization of the `compressed_data_dicts` dictionary to store compressed data.

`136-136`: Usage of `CompressedDataModel.insert_or_update_compressed_data_records` to handle bulk updates of compressed data.

`208-212`: Method `update_task_data_on_bpmn_process` is used to update BPMN process data and store the resulting compressed data dictionary.

`262-264`: Repetition of the pattern seen in lines 208-212, confirming consistent use of `CompressedDataModel` for handling compressed data.

`301-310`: The method `update_json_data_on_db_model_and_return_dict_if_updated` is used to update the task model with compressed data, replacing the previous JSON data handling.

`429-433`: The method `update_task_data_on_bpmn_process` is used within `add_bpmn_process` to handle compressed data for new BPMN processes.

`539-549`: Implementation of `update_task_data_on_bpmn_process` to handle the conversion of data to compressed format and update the BPMN process hash accordingly.

`554-558`: Method `update_json_data_on_db_model_and_return_dict_if_updated` is adapted to handle compressed data, ensuring that the hash is updated only if it changes.

spiffworkflow-backend/src/spiffworkflow_backend/routes/tasks_controller.py (4)
`29-29`: Import of `CompressedDataModel` is correct as per the PR objectives.

`261-265`: The integration of `CompressedDataModel` in `task_data_update` is correctly implemented.

`593-599`: The use of `CompressedDataModel` in `_interstitial_stream` for fetching and decompressing data is correctly implemented.
Line range hint `728-747`: The integration of `CompressedDataModel` in `task_save_draft` is correctly implemented.

Verification successful

The previous script failed due to an incorrect file type specification for `rg` (ripgrep's built-in type name for Python is `py`, not `python`). Correcting this and re-running the script verifies the integration of `CompressedDataModel` methods with other components in the codebase.

The `CompressedDataModel` methods are indeed integrated with various components across the codebase, as evidenced by the search results. This confirms that the integration is widespread and consistent.
- Files and lines where `CompressedDataModel` is used:
spiffworkflow-backend/src/spiffworkflow_backend/models/task_draft_data.py
spiffworkflow-backend/src/spiffworkflow_backend/models/task.py
spiffworkflow-backend/src/spiffworkflow_backend/routes/process_instances_controller.py
spiffworkflow-backend/src/spiffworkflow_backend/routes/process_api_blueprint.py
spiffworkflow-backend/src/spiffworkflow_backend/services/task_service.py
spiffworkflow-backend/src/spiffworkflow_backend/services/process_instance_processor.py
spiffworkflow-backend/src/spiffworkflow_backend/load_database_models.py
spiffworkflow-backend/src/spiffworkflow_backend/models/process_instance_report.py
spiffworkflow-backend/src/spiffworkflow_backend/routes/tasks_controller.py
spiffworkflow-backend/src/spiffworkflow_backend/models/compressed_data.py
Given this extensive usage, the integration appears to be thorough and well-implemented.
Scripts Executed
The following scripts were executed for the analysis:
Script:

#!/bin/bash
# Description: Verify that `CompressedDataModel` methods are correctly integrated with other components.
# Test: Search for usage of `CompressedDataModel` methods in other parts of the codebase.
rg --type python $'CompressedDataModel'

Length of output: 69

Script:

#!/bin/bash
# Description: Verify that `CompressedDataModel` methods are correctly integrated with other components.
# Test: Search for usage of `CompressedDataModel` methods in other parts of the codebase.
rg 'CompressedDataModel' --glob '*.py'

Length of output: 5854
spiffworkflow-backend/src/spiffworkflow_backend/services/process_instance_processor.py (3)
`70-70`: Import of `CompressedDataModel` is correctly added to replace `JsonDataModel`.

`668-669`: Usage of `CompressedDataModel.find_data_dict_by_hash` correctly replaces the previous `JsonDataModel` method.

`1639-1639`: Usage of `CompressedDataModel.insert_or_update_compressed_data_records` correctly replaces the previous `JsonDataModel` method.
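Several of the review comments reference the pattern where `update_json_data_on_db_model_and_return_dict_if_updated` writes a new content hash only when the data actually changed. A minimal sketch of that idea, under assumed names (`TaskRow` and `update_hash_if_changed` are illustrative, not the PR's actual code):

```python
from __future__ import annotations

import gzip
import hashlib
import json
from dataclasses import dataclass
from typing import Any


@dataclass
class TaskRow:
    """Stand-in for a SQLAlchemy model row carrying a content-hash column."""

    json_data_hash: str = ""


def update_hash_if_changed(row: TaskRow, data: dict[str, Any]) -> dict[str, Any] | None:
    """Recompute the content hash of `data`; update the row and return a
    pending compressed-data record only when the hash actually changed."""
    serialized = json.dumps(data, sort_keys=True).encode("utf-8")
    new_hash = hashlib.sha256(serialized).hexdigest()
    if new_hash == row.json_data_hash:
        return None  # identical data: skip the write entirely
    row.json_data_hash = new_hash
    return {"hash": new_hash, "data": gzip.compress(serialized)}


row = TaskRow()
assert update_hash_if_changed(row, {"a": 1}) is not None  # first write
assert update_hash_if_changed(row, {"a": 1}) is None      # unchanged, no write
assert update_hash_if_changed(row, {"a": 2}) is not None  # changed data
```

Returning `None` for unchanged data is what lets the caller batch only the records that genuinely need inserting.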
@@ -156,7 +156,7 @@ def _get_data_object_from_bpmn_process(
     bpmn_process_guid: str | None,
     process_instance: ProcessInstanceModel,
 ) -> Any:
-    bpmn_process_data = JsonDataModel.find_data_dict_by_hash(bpmn_process.json_data_hash)
+    bpmn_process_data = CompressedDataModel.find_data_dict_by_hash(bpmn_process.json_data_hash)
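The diff swaps the lookup to `CompressedDataModel.find_data_dict_by_hash`, i.e. data is retrieved by the hash stored on the BPMN process row. As a rough, hypothetical sketch of that content-addressed pattern (an in-memory dict stands in for the `compressed_data` table; this is not the actual model code):

```python
import gzip
import hashlib
import json
from typing import Any

# In-memory stand-in for the compressed_data table: hash -> gzipped JSON blob.
_store: dict[str, bytes] = {}


def insert_data_dict(data: dict[str, Any]) -> str:
    """Store a dict under the SHA-256 of its canonical JSON; return the hash."""
    serialized = json.dumps(data, sort_keys=True).encode("utf-8")
    data_hash = hashlib.sha256(serialized).hexdigest()
    # Content-addressed: identical dicts map to the same key, so this is idempotent.
    _store.setdefault(data_hash, gzip.compress(serialized))
    return data_hash


def find_data_dict_by_hash(data_hash: str) -> dict[str, Any]:
    """Look the blob up by hash, gunzip it, and decode the JSON."""
    return json.loads(gzip.decompress(_store[data_hash]))


h = insert_data_dict({"form": {"approved": True}})
assert find_data_dict_by_hash(h) == {"form": {"approved": True}}
```

Hashing the uncompressed canonical JSON (rather than the gzip output) keeps the key stable regardless of compression settings.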
Tip
Codebase Verification
The search results indicate that there are still references to `JsonDataModel` in the codebase, specifically in the `load_database_models.py` and `json_data.py` files. These references need to be updated to `CompressedDataModel`:

spiffworkflow-backend/src/spiffworkflow_backend/load_database_models.py
spiffworkflow-backend/src/spiffworkflow_backend/models/json_data.py

Analysis chain

Change from `JsonDataModel` to `CompressedDataModel` correctly implemented.
Scripts Executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify that all references to `JsonDataModel` have been replaced with `CompressedDataModel` in the entire codebase.
# Test: Search for the old model usage. Expect: No occurrences of `JsonDataModel`.
rg --type python $'JsonDataModel'
Length of output: 63
Script:
#!/bin/bash
# Description: Verify that all references to `JsonDataModel` have been replaced with `CompressedDataModel` in the entire codebase.
# Test: Search for the old model usage. Expect: No occurrences of `JsonDataModel`.
rg --glob '*.py' 'JsonDataModel'
Length of output: 1179
Closing for now since we may not be going this route.
Supports #1567
Adds compression to task data so it takes up less space. Adds a new table `compressed_data` that will hold this data and moves usage away from `json_data` to this table.

Summary by CodeRabbit
New Features

- `CompressedDataModel` for efficient data storage and retrieval.

Enhancements

- Replaced `JsonDataModel` with `CompressedDataModel` across various modules, improving data handling performance.

Bug Fixes

- Fixes related to `CompressedDataModel`.

Code Refactor

- Removing `JsonDataModel` references and integrating `CompressedDataModel` methods.
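The `compressed_data` table and the bulk `insert_or_update_compressed_data_records` behavior described in this PR might look roughly like the following sketch, using `sqlite3` purely for illustration (the column names and SQL here are assumptions, not the PR's actual migration or SQLAlchemy model):

```python
import gzip
import hashlib
import json
import sqlite3
from typing import Any

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE compressed_data (hash TEXT PRIMARY KEY, data BLOB NOT NULL)"
)


def insert_or_update_compressed_data_records(
    data_dicts: list[dict[str, Any]],
) -> list[str]:
    """Bulk, idempotent insert keyed by content hash; returns the hashes."""
    rows = []
    hashes = []
    for d in data_dicts:
        serialized = json.dumps(d, sort_keys=True).encode("utf-8")
        data_hash = hashlib.sha256(serialized).hexdigest()
        hashes.append(data_hash)
        rows.append((data_hash, gzip.compress(serialized)))
    # A row whose hash already exists carries identical data, so conflicts
    # can simply be ignored rather than updated.
    conn.executemany(
        "INSERT INTO compressed_data (hash, data) VALUES (?, ?) "
        "ON CONFLICT(hash) DO NOTHING",
        rows,
    )
    return hashes


hashes = insert_or_update_compressed_data_records([{"a": 1}, {"a": 1}, {"b": 2}])
(count,) = conn.execute("SELECT COUNT(*) FROM compressed_data").fetchone()
assert count == 2  # the duplicate dict is deduplicated by hash
```

Because the primary key is the content hash, bulk writes from concurrent workers naturally deduplicate instead of conflicting on row identity.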