
Only show data stores from upsearch in the properties panel #1116

Merged: 5 commits into main on Feb 27, 2024
Conversation

jbirddog (Contributor) commented Feb 27, 2024

Work on #1041. Previously, all data stores in the system were displayed in the properties panel. This fix passes in the location of the parent process group for upsearching, so only data stores visible from that location are shown. Also added a couple of make targets for easier access to the frontend/backend logs when running via the dev docker containers.

Summary by CodeRabbit

  • New Features

    • Added log display commands for backend and frontend containers.
    • Enhanced data store retrieval to support multiple process group identifiers, improving data querying flexibility.
    • Improved the process model editing interface to dynamically include process group identifiers in API calls, enhancing user experience.
  • Refactor

    • Updated methods across various data stores to accept lists of process group identifiers, streamlining data filtering and access.
    • Simplified condition checks in data store retrieval methods for better code clarity.
  • Chores

    • Updated the Makefile to include new log display targets for better development and debugging experience.

coderabbitai bot (Contributor) commented Feb 27, 2024

Walkthrough

These updates enhance the logging capabilities for backend and frontend containers and expand the data store management system in the SpiffWorkflow backend. The changes now support handling multiple process group identifiers simultaneously, boosting flexibility and efficiency in data store operations. Moreover, the frontend now dynamically integrates the process group identifier into API calls, streamlining communication between frontend and backend components.

Changes

  • Makefile: Added be-logs and fe-logs targets for displaying container logs; updated the .PHONY target list.
  • .../data_stores/crud.py, .../data_stores/json.py, .../data_stores/kkv.py, .../data_stores/typeahead.py: Updated the existing_data_stores method to accept a list of process_group_identifiers for enhanced filtering.
  • .../routes/data_store_controller.py: Improved data store retrieval with the UpsearchService import and location-based upsearch.
  • .../src/routes/ProcessModelEditDiagram.tsx: Adjusted onDataStoresRequested to dynamically incorporate processGroupIdentifier in API calls.
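The upsearch these changes rely on expands a process group location into that group plus all of its ancestors, so data stores defined anywhere up the hierarchy remain visible. A minimal sketch of the expansion (the function name mirrors the UpsearchService behavior described in this PR, but the exact name and signature are assumptions):

```python
def upsearch_locations(location: str) -> list[str]:
    """Expand a slash-delimited process group location into itself plus
    every ancestor location, ending with "" for the root group.

    Illustrative sketch only; the real UpsearchService API may differ.
    """
    locations = []
    while location:
        locations.append(location)
        # Drop the last path segment to move up one process group.
        location = location.rpartition("/")[0]
    locations.append("")  # root-level data stores are always in scope
    return locations

print(upsearch_locations("misc/group-a/group-b"))
# → ['misc/group-a/group-b', 'misc/group-a', 'misc', '']
```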



coderabbitai bot left a comment


Review Status

Actionable comments generated: 0

Configuration used: .coderabbit.yaml

Commits reviewed: files that changed from the base of the PR, between b0dbdfb and 93f4f22.
Files selected for processing (7)
  • Makefile (3 hunks)
  • spiffworkflow-backend/src/spiffworkflow_backend/data_stores/crud.py (1 hunks)
  • spiffworkflow-backend/src/spiffworkflow_backend/data_stores/json.py (1 hunks)
  • spiffworkflow-backend/src/spiffworkflow_backend/data_stores/kkv.py (1 hunks)
  • spiffworkflow-backend/src/spiffworkflow_backend/data_stores/typeahead.py (1 hunks)
  • spiffworkflow-backend/src/spiffworkflow_backend/routes/data_store_controller.py (2 hunks)
  • spiffworkflow-frontend/src/routes/ProcessModelEditDiagram.tsx (1 hunks)
Additional comments: 10
Makefile (3)
  • 44-46: The addition of the be-logs target is correctly implemented for following backend container logs.
  • 68-70: The addition of the fe-logs target is correctly implemented for following frontend container logs.
  • 91-92: The update to the .PHONY target list to include be-logs and fe-logs is correctly done, ensuring these targets are treated as commands.
spiffworkflow-backend/src/spiffworkflow_backend/data_stores/crud.py (1)
  • 27-27: The change in the existing_data_stores method signature to accept a list of process group identifiers aligns with the PR's objectives and is correctly implemented.
spiffworkflow-backend/src/spiffworkflow_backend/data_stores/typeahead.py (1)
  • 17-20: The update to the existing_data_stores method to accept a list of process_group_identifiers and the simplified conditional logic are correctly implemented, preparing for future enhancements.
spiffworkflow-backend/src/spiffworkflow_backend/routes/data_store_controller.py (2)
  • 15-15: The import of UpsearchService is correctly added to support the upsearch mechanism for data store visibility.
  • 27-36: The modifications in the data_store_list function to include locations for upsearch when querying data stores are correctly implemented, aligning with the PR's objectives.
spiffworkflow-backend/src/spiffworkflow_backend/data_stores/json.py (1)
  • 33-38: The update to the existing_data_stores method to accept a list of process_group_identifiers and the updated filtering logic are correctly implemented, aligning with the PR's objectives.
spiffworkflow-backend/src/spiffworkflow_backend/data_stores/kkv.py (1)
  • 31-36: The update to the existing_data_stores method to accept a list of process_group_identifiers and the updated filtering logic are correctly implemented, aligning with the PR's objectives.
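The signature change described for json.py, kkv.py, and the crud.py interface can be sketched with a simplified in-memory stand-in. In the real code the filtering happens on a SQLAlchemy query (roughly an in_ clause on the location column); the record shape and field names below are assumptions for illustration only:

```python
# Simplified in-memory stand-in for the updated existing_data_stores
# methods. The real implementation filters a SQLAlchemy query, roughly
# .filter(Model.location.in_(process_group_identifiers)); the record
# shape and the "location" field name here are assumptions.

_DATA_STORES = [
    {"name": "ds_root", "location": ""},
    {"name": "ds_misc", "location": "misc"},
    {"name": "ds_other", "location": "other"},
]

def existing_data_stores(process_group_identifiers=None):
    # None means "no filtering": return every data store in the system.
    if process_group_identifiers is None:
        return list(_DATA_STORES)
    # A list restricts results to data stores defined at those locations.
    return [ds for ds in _DATA_STORES if ds["location"] in process_group_identifiers]
```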
spiffworkflow-frontend/src/routes/ProcessModelEditDiagram.tsx (1)
  • 383-386: Extracting the processGroupIdentifier and using it in the API call path within onDataStoresRequested is a significant improvement: the properties panel now fetches only the data stores relevant to the parent process group, in line with the PR's objective. However, consider handling the case where processModel or processModel.parent_groups might be null or undefined to avoid potential runtime errors; a defensive check here would improve robustness.

Additionally, ensure that the backend /data-stores endpoint handles the process_group_identifier query parameter efficiently, especially given a potentially large number of data stores or complex relationships between process groups and data stores.


coderabbitai bot left a comment


Review Status

Actionable comments generated: 0

Configuration used: .coderabbit.yaml

Commits reviewed: files that changed from the base of the PR, between 93f4f22 and 7741bf5.
Files selected for processing (3)
  • spiffworkflow-backend/src/spiffworkflow_backend/api.yml (1 hunks)
  • spiffworkflow-backend/src/spiffworkflow_backend/routes/data_store_controller.py (2 hunks)
  • spiffworkflow-frontend/src/routes/ProcessModelEditDiagram.tsx (1 hunks)
Files skipped from review as they are similar to previous changes (2)
  • spiffworkflow-backend/src/spiffworkflow_backend/routes/data_store_controller.py
  • spiffworkflow-frontend/src/routes/ProcessModelEditDiagram.tsx
Additional comments: 1
spiffworkflow-backend/src/spiffworkflow_backend/api.yml (1)
  • 2788-2793: The addition of the upsearch query parameter to the /data-stores endpoint is clear and follows the OpenAPI specification standards. It's optional and well-documented, indicating its purpose for performing an upsearch. This change should enhance the API's functionality by providing more flexible querying capabilities for data stores. Ensure that the backend implementation correctly handles this parameter to perform the intended upsearch operation.
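Based on the description above, the added parameter in api.yml likely takes a shape along these lines (a hypothetical fragment following OpenAPI 3 conventions; the actual wording and description in the file may differ):

```yaml
# Hypothetical sketch of the upsearch query parameter on /data-stores;
# not copied from api.yml.
- name: upsearch
  in: query
  required: false
  schema:
    type: boolean
  description: >
    When true, return data stores defined at the given process group and
    at every ancestor process group, instead of only at the exact group.
```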

jbirddog merged commit d66ea0e into main on Feb 27, 2024; 22 checks passed. The not_all_ds branch was deleted on February 27, 2024 at 23:17.