
Conversation

@adolfo-ab (Contributor) commented Nov 6, 2025

Short description:

Add the DataSciencePipelinesApplication CR to ocp_resources

More details:

This resource governs the lifecycle of AI pipelines in OpenShift AI

What this PR does / why we need it:

I need it for some test cases in RHOAI

Which issue(s) this PR fixes:

N/A

Special notes for reviewer:

N/A

Bug:

N/A

Summary by CodeRabbit

  • New Features
    • Added support for managing Data Science Pipelines Application resources with configurable components including API server, database, storage, and workflow settings.
    • Extended API group definitions to enable resource management for new infrastructure types.

@coderabbitai (bot) commented Nov 6, 2025

Walkthrough

Introduces a new DataSciencePipelinesApplication resource class extending NamespacedResource with comprehensive configuration parameters. Adds a corresponding API group constant to the Resource class. Includes minor formatting and annotation adjustments across multiple methods for readability.

Changes

New Data Science Pipelines Application Resource (ocp_resources/data_science_pipelines_application.py)
Introduces the DataSciencePipelinesApplication class with a parameterized __init__ accepting api_server, database, dsp_version, mlmd, mlpipeline_ui, object_storage, persistence_agent, pod_to_pod_tls, scheduled_workflow, and workflow_controller. Implements a to_dict() method that builds the spec, validates the required object_storage argument, and conditionally includes all non-None configuration fields (a hedged sketch of the class follows below).

API Group Extension & Formatting Adjustments (ocp_resources/resource.py)
Adds the DATASCIENCEPIPELINESAPPLICATIONS_OPENDATAHUB_IO constant to ApiGroup. Reformats the _exchange_code_for_token, _apply_patches, and get() methods across the Resource and NamespacedResource classes with multi-line argument layouts and type: ignore annotations; no behavioral changes.
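
A minimal sketch of what the generated class likely looks like, assuming the repository's usual NamespacedResource pattern; the parameter list is abbreviated here and the authoritative version is in the PR diff:

    # Hypothetical, abbreviated sketch of ocp_resources/data_science_pipelines_application.py.
    from typing import Any

    from ocp_resources.resource import NamespacedResource


    class DataSciencePipelinesApplication(NamespacedResource):
        """DataSciencePipelinesApplication resource (datasciencepipelinesapplications.opendatahub.io)."""

        api_group: str = NamespacedResource.ApiGroup.DATASCIENCEPIPELINESAPPLICATIONS_OPENDATAHUB_IO

        def __init__(
            self,
            api_server: dict[str, Any] | None = None,
            database: dict[str, Any] | None = None,
            object_storage: dict[str, Any] | None = None,
            # dsp_version, mlmd, mlpipeline_ui, persistence_agent, pod_to_pod_tls,
            # scheduled_workflow and workflow_controller are omitted here for brevity.
            **kwargs: Any,
        ) -> None:
            super().__init__(**kwargs)
            self.api_server = api_server
            self.database = database
            self.object_storage = object_storage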

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~12 minutes

  • Focus on the to_dict() logic in DataSciencePipelinesApplication to verify correct spec dictionary construction and the object_storage validation flow
  • Confirm the new API group constant matches the expected format and namespace conventions
  • Verify that the formatting adjustments and added type: ignore annotations preserve the original method behavior
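
As additional orientation for reviewers, a hedged usage sketch; the namespace, bucket, and credential values are placeholders, and the object_storage layout assumes the DSPA CRD's externalStorage shape:

    # Hypothetical usage; all names and storage values are illustrative only.
    from ocp_resources.data_science_pipelines_application import DataSciencePipelinesApplication

    dspa = DataSciencePipelinesApplication(
        name="sample-dspa",
        namespace="rhoai-tests",
        object_storage={
            "externalStorage": {
                "host": "s3.example.com",
                "bucket": "dspa-artifacts",
                "scheme": "https",
                "s3CredentialsSecret": {
                    "secretName": "dspa-s3-credentials",
                    "accessKey": "AWS_ACCESS_KEY_ID",
                    "secretKey": "AWS_SECRET_ACCESS_KEY",
                },
            },
        },
    )

    # Assumes the wrapper's usual context-manager flow: deploy on enter, clean up on exit.
    with dspa:
        dspa.wait_for_condition(condition="Ready", status="True", timeout=600)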

Pre-merge checks and finishing touches

❌ Failed checks (1 warning)

  • Docstring Coverage (⚠️ Warning): Docstring coverage is 37.50%, which is below the required threshold of 80.00%. Resolution: run @coderabbitai generate docstrings to improve docstring coverage.

✅ Passed checks (2 passed)

  • Title check (✅ Passed): The title clearly and specifically identifies the main change: adding a new DataSciencePipelinesApplication custom resource class.
  • Description check (✅ Passed): The description follows the template structure with all required sections completed, providing context about the resource's purpose and the rationale for the PR.
✨ Finishing touches
  • 📝 Generate docstrings
  • 🧪 Generate unit tests (beta)
    • Create PR with unit tests
    • Post copyable unit tests in a comment

Comment @coderabbitai help to get the list of available commands and usage tips.

@rh-bot-1 commented Nov 6, 2025

Report bugs in Issues

Welcome!

This pull request will be automatically processed with the following features:

Automatic Actions

  • Reviewer Assignment: Reviewers are automatically assigned based on the OWNERS file in the repository root
  • Size Labeling: PR size labels (XS, S, M, L, XL, XXL) are automatically applied based on changes
  • Issue Creation: A tracking issue is created for this PR and will be closed when the PR is merged or closed
  • Pre-commit Checks: pre-commit runs automatically if .pre-commit-config.yaml exists
  • Branch Labeling: Branch-specific labels are applied to track the target branch
  • Auto-verification: Auto-verified users have their PRs automatically marked as verified

Available Commands

PR Status Management

  • /wip - Mark PR as work in progress (adds WIP: prefix to title)
  • /wip cancel - Remove work in progress status
  • /hold - Block PR merging (approvers only)
  • /hold cancel - Unblock PR merging
  • /verified - Mark PR as verified
  • /verified cancel - Remove verification status

Review & Approval

  • /lgtm - Approve changes (looks good to me)
  • /approve - Approve PR (approvers only)
  • /automerge - Enable automatic merging when all requirements are met (maintainers and approvers only)
  • /assign-reviewers - Assign reviewers based on OWNERS file
  • /assign-reviewer @username - Assign specific reviewer
  • /check-can-merge - Check if PR meets merge requirements

Testing & Validation

  • /retest tox - Run Python test suite with tox
  • /retest python-module-install - Test Python package installation
  • /retest conventional-title - Validate commit message format
  • /retest all - Run all available tests

Container Operations

  • /build-and-push-container - Build and push container image (tagged with PR number)
    • Supports additional build arguments: /build-and-push-container --build-arg KEY=value

Cherry-pick Operations

  • /cherry-pick <branch> - Schedule cherry-pick to target branch when PR is merged
    • Multiple branches: /cherry-pick branch1 branch2 branch3

Label Management

  • /<label-name> - Add a label to the PR
  • /<label-name> cancel - Remove a label from the PR

Merge Requirements

This PR will be automatically approved when the following conditions are met:

  1. Approval: /approve from at least one approver
  2. LGTM Count: Minimum 0 /lgtm from reviewers
  3. Status Checks: All required status checks must pass
  4. No Blockers: No WIP, hold, or conflict labels
  5. Verified: PR must be marked as verified (if verification is enabled)

Review Process

Approvers and Reviewers

Approvers:

  • myakove
  • rnetser

Reviewers:

  • dbasunag
  • myakove
  • rnetser

Available Labels

  • hold
  • verified
  • wip
  • lgtm
  • approve
  • automerge

Tips

  • WIP Status: Use /wip when your PR is not ready for review
  • Verification: The verified label is automatically removed on each new commit
  • Cherry-picking: Cherry-pick labels are processed when the PR is merged
  • Container Builds: Container images are automatically tagged with the PR number
  • Permission Levels: Some commands require approver permissions
  • Auto-verified Users: Certain users have automatic verification and merge privileges

For more information, please refer to the project documentation or contact the maintainers.

@adolfo-ab (Contributor, Author) commented:

/verified

@coderabbitai (bot) left a comment

Actionable comments posted: 0

🧹 Nitpick comments (2)
ocp_resources/data_science_pipelines_application.py (2)

16-76: LGTM! Well-documented initialization method.

The __init__ method follows the standard resource pattern, properly delegates to the parent class, and includes comprehensive parameter documentation.

The raw string literal r""" on line 30 is unnecessary since the docstring contains no escape sequences. A regular """ would suffice:

-    r"""
+    """

78-116: Well-structured spec construction with correct field mappings.

The method properly delegates to the parent, validates the required object_storage field, and correctly transforms all snake_case parameter names to camelCase spec keys.

The error message at line 83 includes the "self." prefix:

raise MissingRequiredArgumentError(argument="self.object_storage")

This produces: "...pass self.object_storage" in the error message, which is slightly awkward for user-facing output. Consider using just the parameter name:

-            raise MissingRequiredArgumentError(argument="self.object_storage")
+            raise MissingRequiredArgumentError(argument="object_storage")
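
For context, a hedged, abbreviated sketch of how the to_dict() flow described above might read once the suggested error-message change is applied; the generated method handles every optional field the same way:

    # Hypothetical, abbreviated sketch; most optional fields are elided.
    def to_dict(self) -> None:
        super().to_dict()

        if not self.kind_dict and not self.yaml_file:
            if self.object_storage is None:
                raise MissingRequiredArgumentError(argument="object_storage")

            self.res["spec"] = {}
            _spec = self.res["spec"]
            _spec["objectStorage"] = self.object_storage

            if self.api_server is not None:
                _spec["apiServer"] = self.api_server

            if self.dsp_version is not None:
                _spec["dspVersion"] = self.dsp_version

            # ... remaining optional fields map snake_case parameters to camelCase spec keys.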
📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between eae5e54 and 212589c.

⛔ Files ignored due to path filters (2)
  • class_generator/schema/__resources-mappings.json.gz is excluded by !**/*.gz, !class_generator/schema/**
  • class_generator/schema/_definitions.json is excluded by !class_generator/schema/**
📒 Files selected for processing (2)
  • ocp_resources/data_science_pipelines_application.py (1 hunks)
  • ocp_resources/resource.py (5 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
ocp_resources/data_science_pipelines_application.py (2)
ocp_resources/resource.py (4)
  • NamespacedResource (1543-1659)
  • ApiGroup (467-579)
  • to_dict (740-744)
  • to_dict (1657-1659)
ocp_resources/exceptions.py (1)
  • MissingRequiredArgumentError (5-10)
🪛 Ruff (0.14.3)
ocp_resources/resource.py

1169-1169: Star-arg unpacking after a keyword argument is strongly discouraged

(B026)


1608-1608: Star-arg unpacking after a keyword argument is strongly discouraged

(B026)

🔇 Additional comments (7)
ocp_resources/resource.py (5)

489-489: LGTM! New API group constant follows naming conventions.

The constant is correctly formatted and aligns with the new DataSciencePipelinesApplication resource.


159-160: LGTM! Comment improves clarity.

Adding the comment explaining the base64-encoded value enhances code maintainability.


862-863: LGTM! Comment adds helpful context.

The comment clarifies the purpose of the update_replace operation.


1165-1171: LGTM! Reformatting improves readability.

The multi-line format makes the method call easier to read. The static analysis hint about star-arg unpacking after keyword arguments is a style preference (B026) rather than a functional issue—the call correctly matches the method signature.


1604-1610: LGTM! Consistent formatting with the Resource class.

The reformatting mirrors the changes in the Resource.get() method and improves consistency. The static analysis hint is a style preference, as noted in the previous comment.
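
For readers unfamiliar with the rule, a small hypothetical illustration of the pattern Ruff's B026 flags; the names below are made up and unrelated to the PR:

    # Hypothetical example of the pattern Ruff B026 warns about.
    def list_resources(*label_selectors: str, kind: str = "Pod") -> None:
        print(kind, label_selectors)


    selectors = ("app=web", "tier=frontend")

    # Flagged by B026: star-arg unpacking appears after a keyword argument.
    list_resources(kind="Service", *selectors)

    # Equivalent call, reordered so the linter stays quiet.
    list_resources(*selectors, kind="Service")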

ocp_resources/data_science_pipelines_application.py (2)

1-7: LGTM! Clean imports and proper attribution.

The auto-generated comment provides helpful context, and imports are appropriate for the resource implementation.


9-14: LGTM! Class structure follows the framework pattern.

The class correctly extends NamespacedResource and uses the newly added API group constant.

@rnetser (Collaborator) commented Nov 6, 2025

/approve
/lgtm

@myakove (Collaborator) commented Nov 7, 2025

/approve

@myakove merged commit 240685d into RedHatQE:main Nov 7, 2025
7 checks passed
SamAlber pushed a commit to SamAlber/openshift-python-wrapper that referenced this pull request Nov 10, 2025
