LCORE-791: konflux hermetic build #68
Walkthrough

Adds hermetic prefetch and optional source-image build parameters to Tekton pipelines; introduces pinned/hash-based Python/wheel manifests, Ruby/Bundler support, RPM repo/lock artifacts, a konflux requirements generator script, and a parameterized Containerfile/build-args for multi-arch builds.
Sequence Diagram

```mermaid
sequenceDiagram
    participant Dev as Developer
    participant Git as Git/PR
    participant Script as KonfluxScript
    participant Tekton as Tekton Pipeline
    participant Prefetch as Prefetch Task
    participant Builder as Multi-arch Builder
    participant Registry as Image Registry
    Dev->>Git: push changes (Containerfile, scripts, manifests, Tekton YAMLs)
    Git->>Script: run scripts/konflux_requirements.sh (generate pinned/hash manifests)
    Script-->>Git: commit generated manifests & patched PipelineRuns
    Git->>Tekton: trigger PipelineRun (includes prefetch-input, build-source-image, hermetic)
    Tekton->>Prefetch: execute hermetic prefetch (rpm, pip, bundler)
    Prefetch-->>Tekton: return artifacts, lockfiles, hashes
    Tekton->>Builder: run build matrix (x86_64, aarch64, optional source image)
    Builder->>Registry: push images
    Registry-->>Builder: return digests/status
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
🚥 Pre-merge checks: ✅ 3 passed
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In `@rpms.in.yaml`:
- Around line 1-11: The packages list in rpms.in.yaml is missing rubygem-bundler
even though the Containerfile installs it and rpms.lock.yaml contains it; update
the packages array to include rubygem-bundler alongside rubygems (i.e., add the
symbol "rubygem-bundler" to the packages list) so the input manifest explicitly
declares the bundler RPM for hermetic builds and matches the lockfile and
Containerfile.
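For illustration only, a hedged sketch of how the updated manifest might look; the package names besides `rubygem-bundler` echo the review's description of the existing list and are not a full reproduction of the repo file:

```yaml
# rpms.in.yaml (sketch): declare every RPM the Containerfile installs,
# including rubygem-bundler, so the hermetic input matches the lockfile
packages:
  - rubygems
  - rubygem-bundler   # added: installed by the Containerfile
  - python3.12
```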
🧹 Nitpick comments (4)
pyproject.toml (1)
90-91: Align pip pin with the hermetic toolchain (or document the split). Dev deps pin `pip==24.3.1` while the hermetic toolchain pins `pip==25.3`. Divergent resolver behavior can yield different lock outputs. Consider aligning these or adding a short note explaining the intentional split.
Containerfile (1)

13-13: Consider pinning `uv` to an exact version for hermetic builds. For a hermetic build aiming at reproducibility, `uv>=0.7.20` allows any newer version to be installed. Pin to an exact version (e.g., `uv==0.7.20`) to ensure consistent builds.

Suggested fix

```diff
-RUN pip3.12 install uv>=0.7.20
+RUN pip3.12 install uv==0.7.20
```

requirements.overrides.txt (1)

1-11: Add trailing newline for POSIX compliance. The file is missing a trailing newline. Many tools expect text files to end with a newline character.

Suggested fix

```diff
 faiss-cpu==1.12.0
+
```

Makefile (1)

104-105: Add `.PHONY` declaration for consistency. Other targets in this Makefile use `.PHONY` declarations. For consistency and to prevent issues if a file named `konflux-requirements` is ever created, add the phony declaration.

Suggested fix

```diff
+.PHONY: konflux-requirements
 konflux-requirements: ## generate hermetic requirements.*.txt file for konflux build
 	./scripts/konflux_requirements.sh
```
Actionable comments posted: 2
🤖 Fix all issues with AI agents
In `@Containerfile`:
- Line 13: The RUN pip install line uses an unquoted package spec containing ">"
so the shell treats ">" as redirection; update the Dockerfile RUN command that
currently says RUN pip3.12 install uv>=0.7.20 to quote or escape the version
spec (e.g., use single or double quotes around uv>=0.7.20 or escape the >) so
the version constraint is preserved and no file is created by shell redirection.
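The redirection pitfall is easy to reproduce outside of Docker. This minimal sketch uses plain `echo` standing in for `pip3.12 install`, just to show what the shell does with the unquoted spec:

```shell
#!/bin/sh
# Demonstration only: 'echo' stands in for 'pip3.12 install'.
set -e
tmp=$(mktemp -d) && cd "$tmp"

# Unquoted: the shell splits 'uv>=0.7.20' into the word 'uv' plus a
# redirection '>=0.7.20', so the command never sees the constraint
# and a stray file named '=0.7.20' appears in the working directory.
echo uv>=0.7.20
ls

# Quoted: the whole spec is passed through as a single argument.
echo "uv>=0.7.20"
```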
In `@requirements-build.txt`:
- Around line 65-70: The requirements file contains conflicting pins for
hatchling (hatchling==1.26.3 and hatchling==1.28.0); resolve the duplicate
requirement by choosing a single hatchling version and updating the pins
accordingly (either bump dependencies that require 1.26.3 or downgrade those
requiring 1.28.0) so only one hatchling spec remains (remove or replace the
other occurrence of hatchling==1.26.3 or hatchling==1.28.0); ensure any direct
consumers (e.g., hatch-fancy-pypi-readme, openai, banks, bracex, cibuildwheel,
filelock, fsspec) are compatible with the chosen version before committing.
🧹 Nitpick comments (1)
Makefile (1)

104-105: Mark `konflux-requirements` as phony to avoid file collisions. If a file named `konflux-requirements` appears, this target won't run. Consider adding a `.PHONY` entry.

♻️ Proposed change

```diff
+.PHONY: konflux-requirements
 konflux-requirements: ## generate hermetic requirements.*.txt file for konflux build
 	./scripts/konflux_requirements.sh
```
```text
hatchling==1.26.3
    # via
    #   hatch-fancy-pypi-readme
    #   openai
hatchling==1.28.0
    # via
```
🧩 Analysis chain
🏁 Script executed:
```shell
# First, check the actual content of requirements-build.txt around lines 65-70
sed -n '60,75p' requirements-build.txt
```

Repository: lightspeed-core/rag-content
Length of output: 355
🏁 Script executed:
```shell
# Then run the duplicate detection script mentioned in the review
python - <<'PY'
import re
from collections import Counter

pkgs = []
with open("requirements-build.txt", "r", encoding="utf-8") as f:
    for line in f:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if " @ " in line:
            name = line.split(" @ ", 1)[0]
        else:
            name = re.split(r"[<=>]", line, 1)[0]
        pkgs.append(name.lower())

dups = [p for p, c in Counter(pkgs).items() if c > 1]
print("Duplicates found:", ", ".join(dups) if dups else "none")
print("\nFull package counts:")
for pkg, count in sorted(Counter(pkgs).items()):
    if count > 1:
        print(f"  {pkg}: {count} occurrences")
PY
```

Repository: lightspeed-core/rag-content
Length of output: 147
🏁 Script executed:
```shell
# Also check if there are any other versions of hatchling in the file
grep -n "hatchling" requirements-build.txt
```

Repository: lightspeed-core/rag-content
Length of output: 204
Resolve duplicate hatchling pins to prevent pip failure.
hatchling is pinned to two different versions: 1.26.3 at line 65 (required by hatch-fancy-pypi-readme, openai) and 1.28.0 at line 69 (required by banks, bracex, cibuildwheel, filelock, fsspec). Pip will reject this with a "double requirement given" error. Consolidate to a single version by either upgrading/downgrading the conflicting dependencies or manually resolving the pins to a common version.
🤖 Prompt for AI Agents
In `@requirements-build.txt` around lines 65 - 70, The requirements file contains
conflicting pins for hatchling (hatchling==1.26.3 and hatchling==1.28.0);
resolve the duplicate requirement by choosing a single hatchling version and
updating the pins accordingly (either bump dependencies that require 1.26.3 or
downgrade those requiring 1.28.0) so only one hatchling spec remains (remove or
replace the other occurrence of hatchling==1.26.3 or hatchling==1.28.0); ensure
any direct consumers (e.g., hatch-fancy-pypi-readme, openai, banks, bracex,
cibuildwheel, filelock, fsspec) are compatible with the chosen version before
committing.
Actionable comments posted: 2
🤖 Fix all issues with AI agents
In `@scripts/konflux_requirements.sh`:
- Around line 89-90: The sed substitution that pins maturin only matches
digits/dots and will break for suffixes like .post1 or rc; update the regex in
the sed command that touches BUILD_FILE (the line using sed and the token
"maturin==") to match the entire version token (e.g., use a non-whitespace token
matcher such as [^[:space:]]* or \S* instead of [0-9.]*) so the replacement
always replaces the full version token with maturin==1.10.2.
- Around line 41-86: The script runs `uv pip compile` on $WHEEL_FILE_PYPI even
when it contains only comments, causing failure; add a guard before the `uv pip
compile "$WHEEL_FILE_PYPI" ... > "$WHEEL_HASH_FILE_PYPI"` call that checks for
any non-comment/non-empty package lines (e.g. grep -v '^[[:space:]]*#'
"$WHEEL_FILE_PYPI" | grep -q '\S') and if none are found, skip the compile and
create/write an empty or placeholder $WHEEL_HASH_FILE_PYPI (or touch it) so
downstream steps don’t fail; update the block around the WHEEL_FILE_PYPI compile
invocation to use this conditional and keep the other compile commands
unchanged.
♻️ Duplicate comments (3)
rpms.in.yaml (1)
1-11: Add `rubygem-bundler` to the packages list. Bundler is installed in the Containerfile; include the RPM here so the hermetic input manifest matches the build dependencies.

🐛 Proposed fix

```diff
   rubygems,
+  rubygem-bundler,
   python3.12,
```

Containerfile (1)
12-13: Quote the version spec to avoid shell redirection. Unquoted `>` is parsed by the shell, so the constraint is dropped and a file is created.

🐛 Proposed fix

```diff
-RUN pip3.12 install uv>=0.7.20
+RUN pip3.12 install "uv>=0.7.20"
```

requirements-build.txt (1)
65-76: Resolve duplicate `hatchling` pins to avoid pip conflicts. Two `hatchling` versions are pinned; pip will reject this. Consolidate to a single compatible version and adjust dependents as needed.

🐛 Example consolidation (remove the older pin)

```diff
-hatchling==1.26.3
-    # via
-    #   hatch-fancy-pypi-readme
-    #   openai
```
🧹 Nitpick comments (2)
Makefile (1)
104-105: Declare `konflux-requirements` as phony. Avoids conflicts if a file named `konflux-requirements` appears and keeps parity with other utility targets.

♻️ Proposed change

```diff
+.PHONY: konflux-requirements
 konflux-requirements: ## generate hermetic requirements.*.txt file for konflux build
 	./scripts/konflux_requirements.sh
```

.tekton/rag-tool-push.yaml (1)
27-59: Align prefetch binary architectures with build platforms. `build-platforms` is x86_64 only, but prefetch `binary.arch` includes aarch64. If multi-arch isn't intended here, this fetches extra artifacts; if it is, consider adding aarch64 to `build-platforms`.

♻️ One possible alignment (x86_64-only)

```diff
-        "arch": "x86_64,aarch64",
+        "arch": "x86_64",
```
```shell
echo "# Packages from pypi.org to be fetched as wheels" > "$WHEEL_FILE_PYPI"
echo "# This file was autogenerated by konflux_requirements.sh" >> "$WHEEL_FILE_PYPI"

current_package=""

while IFS= read -r line || [[ -n "$line" ]]; do
    # Check if this is a package line (starts with a letter/digit, not whitespace or #)
    if [[ "$line" =~ ^[a-zA-Z0-9] ]]; then
        current_package="$line"
    # Check if this is a "# from" annotation line
    elif [[ "$line" =~ ^[[:space:]]*#[[:space:]]*from[[:space:]]+(.*) ]]; then
        index_url="${BASH_REMATCH[1]}"

        if [[ -n "$current_package" ]]; then
            if [[ "$index_url" == "https://pypi.org/simple/" ]]; then
                # Extract package name (without version) for comparison
                package_name=$(echo "$current_package" | sed 's/[=<>!].*//')
                if echo ",${PYPI_WHEELS}," | grep -qF ",${package_name},"; then
                    echo "$current_package" >> "$WHEEL_FILE_PYPI"
                else
                    echo "$current_package" >> "$SOURCE_FILE"
                fi
            elif [[ "$index_url" == "https://console.redhat.com/api/pypi/public-rhai/rhoai/3.2/cpu-ubi9/simple/" ]]; then
                echo "$current_package" >> "$WHEEL_FILE"
            fi
            current_package=""
        fi
    fi
done < "$RAW_REQ_FILE"

# replace the list of binary packages in konflux pipeline configuration
# only the package names, not the versions, delimited by commas
wheel_packages=$(grep -v "^[#-]" "$WHEEL_FILE" | sed 's/==.*//' | tr '\n' ',' | sed 's/,$//')
# append extra wheels to the list
wheel_packages="$wheel_packages,$EXTRA_WHEELS,$PYPI_WHEELS"
sed -i 's/"packages": "[^"]*"/"packages": "'"$wheel_packages"'"/' .tekton/rag-tool-pull-request.yaml
sed -i 's/"packages": "[^"]*"/"packages": "'"$wheel_packages"'"/' .tekton/rag-tool-push.yaml

echo "Packages from pypi.org written to: $SOURCE_FILE ($(wc -l < "$SOURCE_FILE") packages)"
echo "Packages from console.redhat.com written to: $WHEEL_FILE ($(wc -l < "$WHEEL_FILE") packages)"

# Use stdout redirection instead of -o flag to work around uv bug where -o reuses stale hashes from existing output file
uv pip compile "$WHEEL_FILE" --refresh --generate-hashes --index-url https://console.redhat.com/api/pypi/public-rhai/rhoai/3.2/cpu-ubi9/simple/ --python-version 3.12 --emit-index-url --no-deps --no-annotate --universal > "$WHEEL_HASH_FILE"
uv pip compile "$WHEEL_FILE_PYPI" --refresh --generate-hashes --python-version 3.12 --emit-index-url --no-deps --no-annotate > "$WHEEL_HASH_FILE_PYPI"
uv pip compile "$SOURCE_FILE" --refresh --generate-hashes --python-version 3.12 --emit-index-url --no-deps --no-annotate > "$SOURCE_HASH_FILE"
```
Guard against empty PyPI wheel lists before compiling hashes.
If none of the PYPI_WHEELS entries are present in the compiled requirements, $WHEEL_FILE_PYPI will contain only comments. uv pip compile typically exits with “No requirements specified,” which breaks the script under set -e even though an empty wheel list is valid. Consider skipping compilation (or writing an empty hash file) when no package lines exist.
🔧 Proposed fix
-uv pip compile "$WHEEL_FILE_PYPI" --refresh --generate-hashes --python-version 3.12 --emit-index-url --no-deps --no-annotate > "$WHEEL_HASH_FILE_PYPI"
+if grep -qEv '^[[:space:]]*($|#)' "$WHEEL_FILE_PYPI"; then
+ uv pip compile "$WHEEL_FILE_PYPI" --refresh --generate-hashes --python-version 3.12 --emit-index-url --no-deps --no-annotate > "$WHEEL_HASH_FILE_PYPI"
+else
+ echo "# No PyPI wheel requirements" > "$WHEEL_HASH_FILE_PYPI"
+fi🤖 Prompt for AI Agents
In `@scripts/konflux_requirements.sh` around lines 41 - 86, The script runs `uv
pip compile` on $WHEEL_FILE_PYPI even when it contains only comments, causing
failure; add a guard before the `uv pip compile "$WHEEL_FILE_PYPI" ... >
"$WHEEL_HASH_FILE_PYPI"` call that checks for any non-comment/non-empty package
lines (e.g. grep -v '^[[:space:]]*#' "$WHEEL_FILE_PYPI" | grep -q '\S') and if
none are found, skip the compile and create/write an empty or placeholder
$WHEEL_HASH_FILE_PYPI (or touch it) so downstream steps don’t fail; update the
block around the WHEEL_FILE_PYPI compile invocation to use this conditional and
keep the other compile commands unchanged.
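The guard pattern itself can be exercised in isolation. In this sketch, `echo` stands in for `uv pip compile`; only the non-comment-line check is the point, not the project's actual script:

```shell
#!/bin/sh
set -e

# has_requirements FILE -> succeeds if FILE contains at least one
# line that is neither blank nor a comment
has_requirements() {
    grep -qEv '^[[:space:]]*($|#)' "$1"
}

empty=$(mktemp); printf '# only comments here\n\n' > "$empty"
full=$(mktemp);  printf '# header\nfaiss-cpu==1.12.0\n' > "$full"

for f in "$empty" "$full"; do
    if has_requirements "$f"; then
        echo "compile: $f"        # stand-in for: uv pip compile "$f" ...
    else
        echo "# No PyPI wheel requirements" > "$f.hashes"
    fi
done
```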
```shell
# pin maturin to the version available in the Red Hat registry
sed -i 's/maturin==[0-9.]*/maturin==1.10.2/' "$BUILD_FILE"
```
Make the maturin pin resilient to non-numeric version suffixes.
The current regex only matches digits and dots; if the resolved version includes a suffix (e.g., .post1, rc), the substitution can produce an invalid version string. Match the full token instead.
🔧 Proposed fix
-sed -i 's/maturin==[0-9.]*/maturin==1.10.2/' "$BUILD_FILE"
+sed -i 's/maturin==[^[:space:]]*/maturin==1.10.2/' "$BUILD_FILE"📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```shell
# pin maturin to the version available in the Red Hat registry
sed -i 's/maturin==[^[:space:]]*/maturin==1.10.2/' "$BUILD_FILE"
```
🤖 Prompt for AI Agents
In `@scripts/konflux_requirements.sh` around lines 89 - 90, The sed substitution
that pins maturin only matches digits/dots and will break for suffixes like
.post1 or rc; update the regex in the sed command that touches BUILD_FILE (the
line using sed and the token "maturin==") to match the entire version token
(e.g., use a non-whitespace token matcher such as [^[:space:]]* or \S* instead
of [0-9.]*) so the replacement always replaces the full version token with
maturin==1.10.2.
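The difference between the two regexes can be checked directly. This standalone sketch pins a hypothetical version string that carries a `.post1` suffix:

```shell
#!/bin/sh
set -e
f=$(mktemp)
echo 'maturin==1.9.0.post1' > "$f"

# Digits-and-dots regex: stops matching at 'post', so the replacement
# leaves an invalid trailing fragment ('maturin==1.10.2post1').
narrow=$(sed 's/maturin==[0-9.]*/maturin==1.10.2/' "$f")

# Non-whitespace regex: consumes the full version token.
wide=$(sed 's/maturin==[^[:space:]]*/maturin==1.10.2/' "$f")

echo "narrow: $narrow"
echo "wide:   $wide"
```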
Actionable comments posted: 0
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
.tekton/rag-tool-push.yaml (1)
27-29: Remove aarch64 from the prefetch-input arch field or add it to build-platforms. The `build-platforms` parameter specifies only `linux/x86_64`, but the `prefetch-input` pip binary configuration requests `x86_64,aarch64`. This causes unnecessary prefetching of aarch64 wheels that won't be used in this pipeline. Either align the arch field to `x86_64` or add `linux-c6gd2xlarge/arm64` to build-platforms if multi-arch support is intended.
♻️ Duplicate comments (3)
scripts/konflux_requirements.sh (2)
83-86: Guard against empty requirements files before compiling hashes. If `$WHEEL_FILE_PYPI` or other files contain only comments/headers, `uv pip compile` will fail with "No requirements specified", causing the script to exit under `set -e`. This concern was previously raised.

🔧 Proposed fix

```diff
 # Use stdout redirection instead of -o flag to work around uv bug where -o reuses stale hashes from existing output file
 uv pip compile "$WHEEL_FILE" --refresh --generate-hashes --index-url https://console.redhat.com/api/pypi/public-rhai/rhoai/3.2/cpu-ubi9/simple/ --python-version 3.12 --emit-index-url --no-deps --no-annotate --universal > "$WHEEL_HASH_FILE"
-uv pip compile "$WHEEL_FILE_PYPI" --refresh --generate-hashes --python-version 3.12 --emit-index-url --no-deps --no-annotate > "$WHEEL_HASH_FILE_PYPI"
+if grep -qEv '^[[:space:]]*($|#)' "$WHEEL_FILE_PYPI"; then
+    uv pip compile "$WHEEL_FILE_PYPI" --refresh --generate-hashes --python-version 3.12 --emit-index-url --no-deps --no-annotate > "$WHEEL_HASH_FILE_PYPI"
+else
+    echo "# No PyPI wheel requirements" > "$WHEEL_HASH_FILE_PYPI"
+fi
 uv pip compile "$SOURCE_FILE" --refresh --generate-hashes --python-version 3.12 --emit-index-url --no-deps --no-annotate > "$SOURCE_HASH_FILE"
```
89-90: Make the maturin pin resilient to non-numeric version suffixes. The regex `[0-9.]*` won't match version suffixes like `.post1` or `rc1`. This concern was previously raised.

🔧 Proposed fix

```diff
-sed -i 's/maturin==[0-9.]*/maturin==1.10.2/' "$BUILD_FILE"
+sed -i 's/maturin==[^[:space:]]*/maturin==1.10.2/' "$BUILD_FILE"
```

Containerfile (1)
13-13: Quote the version spec to avoid shell redirection. The shell parses the unquoted `>` as output redirection, so `pip3.12 install uv>=0.7.20` installs the latest `uv` with no version constraint and writes stdout to a stray file named `=0.7.20`. This was previously flagged.

🐛 Proposed fix

```diff
-RUN pip3.12 install uv>=0.7.20
+RUN pip3.12 install "uv>=0.7.20"
```
🧹 Nitpick comments (2)
scripts/konflux_requirements.sh (1)
71-77: Consider escaping special characters in the sed substitution. If any package name contains characters special to sed (like `/`, `&`, or `\`), the substitution on lines 76-77 will fail or produce incorrect results. While unlikely with standard package names, this could be made more robust.

♻️ Proposed fix

```diff
+# Escape any characters special to sed in the replacement string
+escaped_packages=$(echo "$wheel_packages" | sed 's/[&/\]/\\&/g')
-sed -i 's/"packages": "[^"]*"/"packages": "'"$wheel_packages"'"/' .tekton/rag-tool-pull-request.yaml
-sed -i 's/"packages": "[^"]*"/"packages": "'"$wheel_packages"'"/' .tekton/rag-tool-push.yaml
+sed -i 's/"packages": "[^"]*"/"packages": "'"$escaped_packages"'"/' .tekton/rag-tool-pull-request.yaml
+sed -i 's/"packages": "[^"]*"/"packages": "'"$escaped_packages"'"/' .tekton/rag-tool-push.yaml
```

Makefile (1)
104-105: Add `.PHONY` declaration for the `konflux-requirements` target. The `konflux-requirements` target doesn't produce a file of that name, so it should be declared as `.PHONY` to ensure it always runs when invoked (even if a file named `konflux-requirements` were to exist).

♻️ Proposed fix

```diff
+.PHONY: konflux-requirements
 konflux-requirements: ## generate hermetic requirements.*.txt file for konflux build
 	./scripts/konflux_requirements.sh
```
Actionable comments posted: 4
🤖 Fix all issues with AI agents
In `@Containerfile`:
- Around line 8-10: Remove any invocations of "bundle update --bundler" in the
Containerfile (the RUN that references Ruby/Gems and later RUNs around lines
that touch Gemfile.lock) and stop rewriting Gemfile.lock; instead set bundler
env vars for locked, hermetic installs (for example export or ENV
BUNDLE_DEPLOYMENT=1 and ENV BUNDLE_FROZEN=1) and run bundle install in locked
mode (e.g., "bundle install --local" or "bundle install --jobs=... --retry=..."
so it uses the existing Gemfile.lock without contacting rubygems.org). Ensure
references to "Gemfile.lock" remain unchanged and remove the deprecated
"--deployment --frozen" flags in favor of the environment variables.
In `@Makefile`:
- Around line 104-106: The Makefile's konflux-requirements target lacks a .PHONY
declaration; add konflux-requirements to the Makefile's .PHONY list (or create a
.PHONY line if one doesn't exist) so the konflux-requirements target is always
executed rather than being skipped if a file/dir named konflux-requirements
exists; reference the konflux-requirements target name when updating the .PHONY
entries.
In `@pyproject.toml`:
- Around line 48-51: Update the stale explanatory comment above the pinned
dependencies so it matches the actual pinned versions: replace the line stating
"torch 2.5.x pairs with torchvision 0.20.x" with a comment that reflects the
current pins "torch>=2.8.0,<2.9.0" and "torchvision>=0.23.0,<0.24.0" (e.g.,
"torch 2.8.x pairs with torchvision 0.23.x") so the comment correctly documents
the version pairing.
In `@requirements.overrides.txt`:
- Around line 1-11: The file ends without a trailing newline; update
requirements.overrides.txt to add a single POSIX-compliant newline character at
the end of the file (i.e., ensure the file ends with a '\n' after the last
listed package such as "faiss-cpu==1.12.0") so text tools and linters treat it
as a proper POSIX text file.
♻️ Duplicate comments (2)
requirements-build.txt (1)
65-100: Duplicate `hatchling` pins will cause pip installation failure. The file contains two conflicting pins for `hatchling`:

- Line 65: `hatchling==1.26.3`
- Line 69: `hatchling==1.28.0`

Pip will reject this with a "double requirement given" error. This needs to be resolved by consolidating to a single version that satisfies all dependents.
Containerfile (1)
13-13: Quote the uv version spec to avoid shell redirection. Line 13 treats `>` as redirection, which drops the version constraint and creates a stray file. This was flagged previously and still applies.

🐛 Proposed fix

```diff
-RUN pip3.12 install uv>=0.7.20
+RUN pip3.12 install "uv>=0.7.20"
```
```makefile
konflux-requirements: ## generate hermetic requirements.*.txt file for konflux build
	./scripts/konflux_requirements.sh
```
Add .PHONY declaration for konflux-requirements target.
The konflux-requirements target is missing a .PHONY declaration. Without it, if a file or directory named konflux-requirements exists, make will skip execution. Other targets in this Makefile correctly use .PHONY.
📝 Suggested fix
```diff
+.PHONY: konflux-requirements
 konflux-requirements: ## generate hermetic requirements.*.txt file for konflux build
 	./scripts/konflux_requirements.sh
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```makefile
.PHONY: konflux-requirements
konflux-requirements: ## generate hermetic requirements.*.txt file for konflux build
	./scripts/konflux_requirements.sh
```
🤖 Prompt for AI Agents
In `@Makefile` around lines 104 - 106, The Makefile's konflux-requirements target
lacks a .PHONY declaration; add konflux-requirements to the Makefile's .PHONY
list (or create a .PHONY line if one doesn't exist) so the konflux-requirements
target is always executed rather than being skipped if a file/dir named
konflux-requirements exists; reference the konflux-requirements target name when
updating the .PHONY entries.
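The skip behavior is easy to demonstrate with a throwaway Makefile (a sketch assuming GNU make is available on the PATH):

```shell
#!/bin/sh
set -e
dir=$(mktemp -d) && cd "$dir"

# A target with no .PHONY declaration...
printf 'konflux-requirements:\n\t@echo generating\n' > Makefile
make konflux-requirements            # runs the recipe: prints "generating"

# ...is silently skipped once a file with the same name exists.
touch konflux-requirements
out1=$(make konflux-requirements)    # make reports the target is up to date

# With .PHONY, the recipe always runs regardless of the file.
printf '.PHONY: konflux-requirements\nkonflux-requirements:\n\t@echo generating\n' > Makefile
out2=$(make konflux-requirements)
echo "$out2"
```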
```text
# override these package to the version available on RHOAI wheels index:
# https://console.redhat.com/api/pypi/public-rhai/rhoai/3.2/cpu-ubi9/simple
transformers==4.57.3
tokenizers==0.22.1
scipy==1.16.3
aiohttp==3.13.2
aiosqlite==0.22.0
anyio==4.12.0
datasets==4.4.1
pillow==12.0.0
faiss-cpu==1.12.0
```

(No newline at end of file.)
🧩 Analysis chain
🏁 Script executed:
```shell
#!/bin/bash
set -e

echo "=== Check 1: Trailing newline in requirements.overrides.txt ==="
if [ -f "requirements.overrides.txt" ]; then
  # Check if file ends with newline
  if [ -z "$(tail -c 1 requirements.overrides.txt)" ]; then
    echo "✓ File DOES end with newline"
  else
    echo "✗ File DOES NOT end with newline (missing trailing newline)"
  fi
  # Show last few bytes for confirmation
  echo "Last 20 bytes (hex):"
  tail -c 20 requirements.overrides.txt | od -An -tx1
else
  echo "✗ File not found"
fi

echo ""
echo "=== Check 2: RHOAI URL accessibility ==="
# Test the RHOAI index URL
URL="https://console.redhat.com/api/pypi/public-rhai/rhoai/3.2/cpu-ubi9/simple/"
HTTP_CODE=$(curl -s -o /tmp/rhoai_response.txt -w "%{http_code}" "$URL" --max-time 10 2>/dev/null || echo "000")
echo "HTTP Status: $HTTP_CODE"
if [ "$HTTP_CODE" = "200" ]; then
  echo "✓ RHOAI index is accessible"
  echo "First 500 chars of response:"
  head -c 500 /tmp/rhoai_response.txt
  echo ""
elif [ "$HTTP_CODE" = "000" ]; then
  echo "✗ Could not reach URL (network error or timeout)"
else
  echo "✗ URL returned HTTP $HTTP_CODE"
fi
```

Repository: lightspeed-core/rag-content
Length of output: 276
🏁 Script executed:
#!/bin/bash
set -e
echo "=== Check 1: Trailing newline confirmation ==="
if [ -f "requirements.overrides.txt" ]; then
# Alternative method to show last bytes without od
echo "File size and last characters:"
wc -c requirements.overrides.txt
echo "Last line:"
tail -1 requirements.overrides.txt
# Check with tail -c
LAST_BYTE=$(tail -c 1 requirements.overrides.txt | xxd -p)
echo "Last byte (hex): $LAST_BYTE"
if [ "$LAST_BYTE" = "0a" ]; then
echo "✓ Ends with newline (0x0a)"
else
echo "✗ Does NOT end with newline (last byte is 0x$LAST_BYTE)"
fi
fi
echo ""
echo "=== Check 2: RHOAI URL accessibility ==="
URL="https://console.redhat.com/api/pypi/public-rhai/rhoai/3.2/cpu-ubi9/simple/"
echo "Testing URL: $URL"
HTTP_CODE=$(curl -s -o /tmp/rhoai_response.txt -w "%{http_code}" "$URL" --max-time 10 2>&1 || echo "000")
echo "HTTP Status: $HTTP_CODE"
if [ "$HTTP_CODE" = "200" ]; then
echo "✓ RHOAI index is accessible"
echo "Response size: $(wc -c < /tmp/rhoai_response.txt) bytes"
echo "Sample of response:"
head -c 800 /tmp/rhoai_response.txt
echo ""
else
echo "Response: $HTTP_CODE"
fiRepository: lightspeed-core/rag-content
Length of output: 270
🏁 Script executed:
```shell
#!/bin/bash
echo "=== Check 1: Trailing newline verification ==="
if [ -f "requirements.overrides.txt" ]; then
  # Use Python to check last byte since xxd is not available
  python3 << 'PYTHON_EOF'
with open('requirements.overrides.txt', 'rb') as f:
    content = f.read()
if len(content) > 0:
    last_byte = content[-1]
    print(f"File size: {len(content)} bytes")
    print(f"Last byte: 0x{last_byte:02x} ({repr(chr(last_byte))})")
    if last_byte == 0x0a:
        print("✓ File DOES end with newline (0x0a)")
    else:
        print("✗ File DOES NOT end with newline")
else:
    print("File is empty")
PYTHON_EOF
fi

echo ""
echo "=== Check 2: Verify file content ==="
tail -3 requirements.overrides.txt
echo "[END OF FILE]"

echo ""
echo "=== Check 3: RHOAI URL accessibility ==="
URL="https://console.redhat.com/api/pypi/public-rhai/rhoai/3.2/cpu-ubi9/simple/"
echo "Testing URL: $URL"
python3 << 'PYTHON_EOF'
import urllib.request
import ssl

url = "https://console.redhat.com/api/pypi/public-rhai/rhoai/3.2/cpu-ubi9/simple/"
try:
    # Create unverified SSL context (for sandbox)
    context = ssl.create_default_context()
    context.check_hostname = False
    context.verify_mode = ssl.CERT_NONE
    with urllib.request.urlopen(url, context=context, timeout=10) as response:
        status = response.status
        print(f"HTTP Status: {status}")
        print("✓ URL is accessible")
        body = response.read(500).decode('utf-8', errors='ignore')
        print(f"Sample response: {body[:300]}")
except urllib.error.URLError as e:
    print(f"✗ URL error: {e}")
except Exception as e:
    print(f"✗ Error: {e}")
PYTHON_EOF
```

Repository: lightspeed-core/rag-content
Length of output: 778
🏁 Script executed:
```shell
#!/bin/bash
echo "=== Check 4: Verify package versions on RHOAI index ==="
python3 << 'PYTHON_EOF'
import urllib.request
import ssl
import json
import re

# Create unverified SSL context
context = ssl.create_default_context()
context.check_hostname = False
context.verify_mode = ssl.CERT_NONE

base_url = "https://console.redhat.com/api/pypi/public-rhai/rhoai/3.2/cpu-ubi9/simple"
packages_to_check = [
    ("transformers", "4.57.3"),
    ("tokenizers", "0.22.1"),
    ("scipy", "1.16.3"),
    ("aiohttp", "3.13.2"),
    ("pillow", "12.0.0"),
    ("faiss-cpu", "1.12.0"),
]

print("Checking package availability on RHOAI index...\n")
for package_name, version in packages_to_check:
    url = f"{base_url}/{package_name}/"
    try:
        with urllib.request.urlopen(url, context=context, timeout=10) as response:
            body = response.read().decode('utf-8')
            # Look for the specific version
            if f"{package_name}-{version}" in body or f"{version}" in body:
                print(f"✓ {package_name}=={version}: Found on index")
            else:
                print(f"✗ {package_name}=={version}: NOT found on index")
                # Show what versions are available
                versions = re.findall(r'>([^<]+)\/<', body)
                if versions:
                    print(f"  Available versions: {versions[:5]}")
    except urllib.error.HTTPError as e:
        if e.code == 404:
            print(f"✗ {package_name}: Package not found on index (HTTP 404)")
        else:
            print(f"✗ {package_name}: HTTP {e.code}")
    except Exception as e:
        print(f"✗ {package_name}: Error - {e}")
PYTHON_EOF
```

Repository: lightspeed-core/rag-content
Length of output: 386
Add trailing newline for POSIX compliance.
The file is missing a trailing newline. POSIX text files should end with a newline character to ensure compatibility with various tools.
🤖 Prompt for AI Agents
In `@requirements.overrides.txt` around lines 1 - 11, The file ends without a
trailing newline; update requirements.overrides.txt to add a single
POSIX-compliant newline character at the end of the file (i.e., ensure the file
ends with a '\n' after the last listed package such as "faiss-cpu==1.12.0") so
text tools and linters treat it as a proper POSIX text file.
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In @.tekton/rag-tool-push.yaml:
- Around line 34-65: The prefetch-input config contains an invalid bundler field
and a mis-typed python version: remove the unsupported "allow_binary" entry from
the "bundler" object (the bundler block shown) and change the "py_version" value
inside the "binary" object from a quoted string to an integer (e.g., 312) so the
"binary" section uses an integer py_version and the bundler no longer contains
allow_binary.
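For illustration only, a hedged sketch of what the corrected fragment might look like. The field names follow the review text, but the surrounding keys are assumptions, not copied from the repo, and the exact prefetch-input schema should be checked against the Konflux/cachi2 documentation:

```yaml
# sketch of a corrected prefetch-input entry
- type: pip
  path: "."
  binary:
    py_version: 312        # integer, not the string "3.12"
    arch: "x86_64,aarch64"
- type: bundler
  path: "."                # no "allow_binary" key: unsupported for bundler
```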
♻️ Duplicate comments (5)
pyproject.toml (1)
48-51: Stale comment: version pairing no longer matches code. The comment on line 49 states "torch 2.5.x pairs with torchvision 0.20.x" but the pins are now `torch>=2.8.0,<2.9.0` and `torchvision>=0.23.0,<0.24.0`. Update the comment accordingly.

rpms.in.yaml (1)
1-11: Addrubygem-bundlerto the packages list.The Containerfile installs
rubygem-bundler, and the PR enables Bundler-based hermetic builds. For consistency and to ensure all required RPMs are declared, addrubygem-bundlerto the packages list.requirements-build.txt (1)
65-101: Resolve duplicatehatchlingpins to prevent pip failure.
hatchlingis pinned twice with conflicting versions:1.26.3(line 65) and1.28.0(line 69). Pip will reject this with a "double requirement given" error. This needs to be resolved before the hermetic build can succeed.Containerfile (2)
13-13: Quote the uv version spec to avoid shell redirection. The shell interprets > as redirection, which drops the constraint and creates a stray file.
🐛 Proposed fix
-RUN pip3.12 install uv>=0.7.20
+RUN pip3.12 install "uv>=0.7.20"
23-26: Avoid bundle update --bundler in the image build (breaks hermeticity). It rewrites Gemfile.lock and requires network access. Prefer a locked install using the existing lockfile.
🔧 Suggested adjustment
-ENV BUNDLE_PATH__SYSTEM=true
-RUN bundle update --bundler
-RUN bundle install
+ENV BUNDLE_PATH__SYSTEM=true \
+    BUNDLE_DEPLOYMENT=1 \
+    BUNDLE_FROZEN=1
+RUN bundle install
🧹 Nitpick comments (3)
pyproject.toml (1)
90-91: Consider aligning pip versions across files.
pip==24.3.1 is pinned here in dev dependencies, while requirements.hermetic.txt specifies pip==25.3. If both are used in the build process, this inconsistency could cause confusion or unexpected behavior. Consider using a consistent version across all requirement specifications.
.tekton/rag-tool-pull-request.yaml (1)
59-64: Long binary packages list reduces maintainability. The packages string on line 60 contains 100+ comma-separated package names in a single line, making it difficult to review, diff, and maintain. Consider formatting this as a proper JSON array across multiple lines or managing the list in a separate file that gets injected during pipeline generation.
.tekton/rag-tool-push.yaml (1)
27-29: Align prefetch arch list with build platforms.
build-platforms is linux/x86_64, but prefetch requests x86_64,aarch64. If multi-arch is intended, add linux/aarch64 to build-platforms; otherwise drop aarch64 from prefetch to avoid unused downloads.
Also applies to: 58-59
- name: prefetch-input
  # no source available: torch, faiss-cpu
  # hermeto prefetch problems: uv, pip, jiter, tiktoken,
  # those need cmake to build: pyarrow
  # those need cargo to build: jiter, tiktoken, cryptography, fastuuid, hf_xet, maturin, pydantic_core, rpds_py, safetensors, tokenizers
  # to accelerate build: numpy, scipy, pandas, pillow, scikit_learn
  value: |
    [
      {
        "type": "rpm",
        "path": "."
      },
      {
        "type": "pip",
        "path": ".",
        "requirements_files": [
          "requirements.hashes.wheel.txt",
          "requirements.hashes.source.txt",
          "requirements.hermetic.txt"
        ],
        "requirements_build_files": ["requirements-build.txt"],
        "binary": {
          "packages": "accelerate,aiohappyeyeballs,aiohttp,aiosignal,aiosqlite,annotated-doc,annotated-types,anyio,asyncpg,attrs,beautifulsoup4,cffi,chardet,charset-normalizer,click,colorama,cryptography,dataclasses-json,defusedxml,distro,docling-ibm-models,einops,et-xmlfile,faiss-cpu,filetype,fire,frozenlist,googleapis-common-protos,greenlet,h11,hf-xet,httpcore,httpx,huggingface-hub,idna,jinja2,jiter,joblib,jsonlines,jsonref,jsonschema-specifications,latex2mathml,llama-stack-client,lxml,markdown-it-py,markupsafe,mdurl,mpire,mpmath,multidict,mypy-extensions,nest-asyncio,networkx,nltk,openpyxl,opentelemetry-api,opentelemetry-exporter-otlp-proto-common,opentelemetry-exporter-otlp-proto-http,opentelemetry-proto,opentelemetry-sdk,opentelemetry-semantic-conventions,packaging,pandas,pillow,platformdirs,pluggy,prompt-toolkit,propcache,psycopg2-binary,pyaml,pydantic,pydantic-core,pydantic-settings,pygments,pyjwt,pylatexenc,python-dateutil,python-docx,python-dotenv,python-multipart,python-pptx,pytz,pyyaml,referencing,requests,rich,rpds-py,rtree,safetensors,scikit-learn,scipy,semchunk,sentence-transformers,shapely,shellingham,six,sniffio,starlette,sympy,tabulate,tenacity,threadpoolctl,tiktoken,tokenizers,torch,torchvision,tqdm,transformers,triton,typer,typing-extensions,typing-inspect,typing-inspection,tzdata,wrapt,xlsxwriter,yarl,zipp,uv-build,uv,pip,maturin,opencv-python,rapidocr,sqlite-vec",
          "os": "linux",
          "arch": "x86_64,aarch64",
          "py_version": "312"
        }
      },
      {
        "type": "bundler",
        "allow_binary": "true"
      }
🧩 Analysis chain
🏁 Script executed:
# First, let's examine the file context around the problematic area
cd / && find . -name "rag-tool-push.yaml" -type f 2>/dev/null | head -5
Repository: lightspeed-core/rag-content
Length of output: 117
🏁 Script executed:
# Search for cachi2 or prefetch references in the repo
rg "prefetch-dependencies-oci-ta|cachi2|prefetch-input" --type yaml -B 2 -A 5 2>/dev/null | head -100Repository: lightspeed-core/rag-content
Length of output: 53
🏁 Script executed:
# Look for any configuration or documentation about prefetch schema
fd "cachi2|prefetch" -t f 2>/dev/null | head -20Repository: lightspeed-core/rag-content
Length of output: 111
🏁 Script executed:
# Check the actual structure of the YAML file to understand context
if [ -f ".tekton/rag-tool-push.yaml" ]; then
head -80 .tekton/rag-tool-push.yaml
fi
Repository: lightspeed-core/rag-content
Length of output: 4839
🌐 Web query:
cachi2 prefetch-dependencies-oci-ta schema documentation json format
💡 Result:
For the Konflux task prefetch-dependencies-oci-ta (which runs Cachi2/Hermeto), the input parameter is a JSON string. In practice it is an object with:
- packages: array of package specs
- flags: optional array of strings (extra behavior flags)
Example seen in a real PipelineRun attestation:
{
"packages": [
{ "path": "yq", "type": "gomod" },
{ "type": "rpm" }
],
"flags": ["gomod-vendor"]
}
[1]
Package object shape (what goes inside packages[])
Each entry is a “package manager input” object. For example, for Go modules Hermeto documents:
{ "type": "gomod", "path": "." }(path is relative to the source root and defaults to ".".) [2]
Hermeto has similar per-package-manager pages (npm/pip/yarn/rpm/etc.) describing the allowed keys for each type. [3]
Sources: [1] [2] [3]
🏁 Script executed:
cat scripts/generate_packages_to_prefetch.py
Repository: lightspeed-core/rag-content
Length of output: 7106
🌐 Web query:
cachi2 Hermeto pip binary packages arch py_version allow_binary schema format
💡 Result:
In Hermeto (the tool formerly packaged as cachi2), pip “binary packages” selection is configured via the binary object in the pip JSON input. There isn’t an allow_binary field in the current pip input schema; instead, you provide (or omit) binary. [1]
JSON input shape (pip)
{
"type": "pip",
"path": ".",
"requirements_files": ["requirements.txt"],
"requirements_build_files": ["requirements-build.txt"],
"binary": {
"packages": "tensorflow",
"arch": "x86_64",
"os": "linux",
"py_version": 312
}
}
[1]
binary filter fields (schema/format)
- packages: comma-separated package names; special :all: means "all packages" (default :all:).
- arch: comma-separated architectures (default "x86_64").
- os: comma-separated OSes (default "linux").
- py_version: single integer XY for Python X.Y (example: 312 for Python 3.12; default None).
(Additional fields exist: py_impl, abi, platform.) [2]
Also, Hermeto describes three modes via binary presence/value: omit binary (no wheels), set packages=":all:" (prefer wheels, fall back to sdists), or set packages to specific names (only wheels for those packages; fail if no compatible wheel). [2]
Sources: Hermeto pip docs (JSON input + binary filters). [1] [2]
🌐 Web query:
Cachi2 pip package manager binary field specification
💡 Result:
For Cachi2/Hermeto’s pip package manager input, the optional binary field is a “binary (wheel) filter” object used to control which wheels are prefetched (otherwise Cachi2/Hermeto defaults to fetching sdists only). [1]
Where binary appears (pip JSON input)
{
"type": "pip",
"path": ".",
"requirements_files": ["requirements.txt"],
"requirements_build_files": ["requirements-build.txt"],
"binary": { ... }
}[1]
Behavior / “strategies” controlled by binary.packages
- Default (no binaries): omit binary ⇒ source-only (sdists only). [1]
- Prefer binaries (like pip --prefer-binary): set "binary": {"packages": ":all:"} ⇒ try compatible wheels for everything, fall back to sdists if none match. [1]
- Only binaries for specific packages (like pip --only-binary for those pkgs): set "packages" to specific names (e.g. "numpy,pandas") ⇒ for those packages, fail if no matching wheel exists; others remain sdist-only. [1]
binary filter object fields
- packages: comma-separated package names to apply binary filtering to. Default :all:. [1]
- arch: comma-separated architectures. Default "x86_64". [1]
- os: comma-separated OS values. Default "linux". [1]
- py_version: integer MAJORMINOR (e.g. 312 for Python 3.12). Default None. [1]
- py_impl: comma-separated Python implementations. Default "cp". [1]
- abi: comma-separated ABIs. Default :all:. [1]
- platform: regex matched against wheel platform tags. Default None. [1]
Filter logic rules
- :all: or None ⇒ no filtering for that field. [1]
- Multiple values within a field use OR; multiple fields combine with AND. [1]
- arch/os are mutually exclusive with platform. [1]
Source: Hermeto/Cachi2 pip documentation page. [1]
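The three strategies can be sketched as minimal pip input fragments (package names are illustrative): the first entry fetches sdists only, the second prefers wheels for everything with sdist fallback, and the third requires wheels for numpy and pandas only.

```json
[
  { "type": "pip", "path": "." },

  { "type": "pip", "path": ".",
    "binary": { "packages": ":all:" } },

  { "type": "pip", "path": ".",
    "binary": { "packages": "numpy,pandas", "os": "linux",
                "arch": "x86_64", "py_version": 312 } }
]
```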
References:
[1] (hermetoproject.github.io)
Remove unsupported allow_binary field and fix py_version type in binary configuration.
The allow_binary field is not part of Hermeto's pip binary filter schema and must be removed. Additionally, py_version must be an integer (e.g., 312 for Python 3.12), not a quoted string. The packages and arch fields are correctly formatted as comma-separated strings per spec.
Required fixes
"binary": {
"packages": "accelerate,aiohappyeyeballs,aiohttp,...",
"os": "linux",
"arch": "x86_64,aarch64",
- "py_version": "312"
+ "py_version": 312
}
Also remove "allow_binary": "true" from the bundler block.
🤖 Prompt for AI Agents
In @.tekton/rag-tool-push.yaml around lines 34 - 65, The prefetch-input config
contains an invalid bundler field and a mis-typed python version: remove the
unsupported "allow_binary" entry from the "bundler" object (the bundler block
shown) and change the "py_version" value inside the "binary" object from a
quoted string to an integer (e.g., 312) so the "binary" section uses an integer
py_version and the bundler no longer contains allow_binary.
ca7a0a1 to 9169da9
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In @.tekton/rag-tool-pull-request.yaml:
- Around line 38-69: The prefetch-input pip binary schema uses a string for
"py_version" and an unsupported bundler field "allow_binary"; update the pip
block's "py_version" value from the quoted string to an integer (py_version:
312) and remove the "allow_binary" key from the bundler block (refer to the
"prefetch-input" object, the "py_version" field inside the pip -> binary map,
and the "bundler" object) so the schema matches Hermeto's expected types.
♻️ Duplicate comments (4)
pyproject.toml (1)
48-51: Update the stale torch/torchvision pairing comment. Line 49 still references the old pairing, which no longer matches the pinned versions.
✏️ Proposed fix
-# torch 2.5.x pairs with torchvision 0.20.x
+# torch 2.8.x pairs with torchvision 0.23.x
PyTorch 2.8.0 torchvision 0.23.0 compatibility CPU wheels
Containerfile (2)
11-11: Quote the uv version spec to avoid shell redirection. Line 11 uses > unquoted, so the shell treats it as redirection.
🛠️ Proposed fix
-RUN pip3.12 install uv>=0.7.20
+RUN pip3.12 install "uv>=0.7.20"
21-22: Use locked Bundler install for hermetic builds. Line 22 runs bundle install without frozen/deployment mode; this can update the lockfile or reach the network. Prefer a locked, offline-friendly install.
🔒 Proposed fix
-RUN BUNDLE_PATH__SYSTEM=true bundle install
+RUN BUNDLE_PATH__SYSTEM=true BUNDLE_DEPLOYMENT=1 BUNDLE_FROZEN=1 bundle install --local
Bundler BUNDLE_DEPLOYMENT BUNDLE_FROZEN bundle install --local behavior
.tekton/rag-tool-push.yaml (1)
34-65: Fix Hermeto prefetch schema: remove allow_binary and use integer py_version. Same issue as above: py_version should be an integer, and the bundler input doesn't support allow_binary.
🛠️ Proposed fix
     "binary": {
       "packages": "accelerate,aiohappyeyeballs,aiohttp,aiosignal,aiosqlite,annotated-doc,annotated-types,anyio,asyncpg,attrs,beautifulsoup4,cffi,chardet,charset-normalizer,click,colorama,cryptography,dataclasses-json,defusedxml,distro,docling-ibm-models,einops,et-xmlfile,faiss-cpu,filetype,fire,frozenlist,googleapis-common-protos,greenlet,h11,hf-xet,httpcore,httpx,huggingface-hub,idna,jinja2,jiter,joblib,jsonlines,jsonref,jsonschema-specifications,latex2mathml,llama-stack-client,lxml,markdown-it-py,markupsafe,mdurl,mpire,mpmath,multidict,mypy-extensions,nest-asyncio,networkx,nltk,openpyxl,opentelemetry-api,opentelemetry-exporter-otlp-proto-common,opentelemetry-exporter-otlp-proto-http,opentelemetry-proto,opentelemetry-sdk,opentelemetry-semantic-conventions,packaging,pandas,pillow,platformdirs,pluggy,prompt-toolkit,propcache,psycopg2-binary,pyaml,pydantic,pydantic-core,pydantic-settings,pygments,pyjwt,pylatexenc,python-dateutil,python-docx,python-dotenv,python-multipart,python-pptx,pytz,pyyaml,referencing,requests,rich,rpds-py,rtree,safetensors,scikit-learn,scipy,semchunk,sentence-transformers,shapely,shellingham,six,sniffio,starlette,sympy,tabulate,tenacity,threadpoolctl,tiktoken,tokenizers,torch,torchvision,tqdm,transformers,triton,typer,typing-extensions,typing-inspect,typing-inspection,tzdata,wrapt,xlsxwriter,yarl,zipp,uv-build,uv,pip,maturin,opencv-python,rapidocr,sqlite-vec",
       "os": "linux",
       "arch": "x86_64,aarch64",
-      "py_version": "312"
+      "py_version": 312
     }
   },
   {
     "type": "bundler",
-    "allow_binary": "true"
   }
Hermeto cachi2 pip binary schema py_version integer allow_binary bundler
- name: prefetch-input
  # no source available: torch, faiss-cpu
  # hermeto prefetch problems: uv, pip, jiter, tiktoken,
  # those need cmake to build: pyarrow
  # those need cargo to build: jiter, tiktoken, cryptography, fastuuid, hf_xet, maturin, pydantic_core, rpds_py, safetensors, tokenizers
  # to accelerate build: numpy, scipy, pandas, pillow, scikit_learn
  value: |
    [
      {
        "type": "rpm",
        "path": "."
      },
      {
        "type": "pip",
        "path": ".",
        "requirements_files": [
          "requirements.hashes.wheel.txt",
          "requirements.hashes.source.txt",
          "requirements.hermetic.txt"
        ],
        "requirements_build_files": ["requirements-build.txt"],
        "binary": {
          "packages": "accelerate,aiohappyeyeballs,aiohttp,aiosignal,aiosqlite,annotated-doc,annotated-types,anyio,asyncpg,attrs,beautifulsoup4,cffi,chardet,charset-normalizer,click,colorama,cryptography,dataclasses-json,defusedxml,distro,docling-ibm-models,einops,et-xmlfile,faiss-cpu,filetype,fire,frozenlist,googleapis-common-protos,greenlet,h11,hf-xet,httpcore,httpx,huggingface-hub,idna,jinja2,jiter,joblib,jsonlines,jsonref,jsonschema-specifications,latex2mathml,llama-stack-client,lxml,markdown-it-py,markupsafe,mdurl,mpire,mpmath,multidict,mypy-extensions,nest-asyncio,networkx,nltk,openpyxl,opentelemetry-api,opentelemetry-exporter-otlp-proto-common,opentelemetry-exporter-otlp-proto-http,opentelemetry-proto,opentelemetry-sdk,opentelemetry-semantic-conventions,packaging,pandas,pillow,platformdirs,pluggy,prompt-toolkit,propcache,psycopg2-binary,pyaml,pydantic,pydantic-core,pydantic-settings,pygments,pyjwt,pylatexenc,python-dateutil,python-docx,python-dotenv,python-multipart,python-pptx,pytz,pyyaml,referencing,requests,rich,rpds-py,rtree,safetensors,scikit-learn,scipy,semchunk,sentence-transformers,shapely,shellingham,six,sniffio,starlette,sympy,tabulate,tenacity,threadpoolctl,tiktoken,tokenizers,torch,torchvision,tqdm,transformers,triton,typer,typing-extensions,typing-inspect,typing-inspection,tzdata,wrapt,xlsxwriter,yarl,zipp,uv-build,uv,pip,maturin,opencv-python,rapidocr,sqlite-vec",
          "os": "linux",
          "arch": "x86_64,aarch64",
          "py_version": "312"
        }
      },
      {
        "type": "bundler",
        "allow_binary": "true"
      }
🧩 Analysis chain
🌐 Web query:
Hermeto cachi2 pip binary schema py_version integer type
💡 Result:
In Hermeto’s (Cachi2-derived) pip “binary” filter object, py_version is an integer: it must be the Python major and minor combined, e.g. 312 for Python 3.12 (and it’s shown in the JSON input example as py_version: 312). [1]
Example
{
"type": "pip",
"binary": { "os": "linux", "arch": "aarch64", "py_version": 312 }
}
[1]
Source(s):
[1] (hermetoproject.github.io)
🌐 Web query:
cachi2 bundler allow_binary parameter support
💡 Result:
allow_binary is not a supported parameter for the bundler package manager in cachi2 (Hermeto). The public docs describe Bundler support only in terms of reading Gemfile.lock/Gemfile and fetching the declared gems; they do not document any allow_binary option for Bundler input/config. [1]
Implication: if you need to control “binary vs source” behavior for Ruby gems, that must be done via Bundler/RubyGems configuration in the build (e.g., Bundler settings like force_ruby_platform), not via a cachi2 bundler/allow_binary parameter. [1]
Sources: [1]
🌐 Web query:
Hermeto prefetch pip schema requirements
💡 Result:
Hermeto’s pip prefetch (“hermeto fetch-deps”) accepts a JSON object with this schema (keys and defaults):
type: must be"pip"(required) (hermetoproject.github.io)path: path to the Python package dir, relative to--source(default".") (hermetoproject.github.io)requirements_files: list of requirement lockfiles relative topath(default["requirements.txt"], or[]if missing) (hermetoproject.github.io)requirements_build_files: build-requirements lockfiles relative topath(default["requirements-build.txt"], or[]if missing) (hermetoproject.github.io)binary(optional): wheel-selection filter object (defaults to none; if omitted Hermeto fetches sdists by default): (hermetoproject.github.io)packages(e.g.,"tensorflow"),arch(e.g.,"x86_64"),os(e.g.,"linux"),py_version(e.g.,312) (hermetoproject.github.io)
Requirements file expectations: the requirements*.txt files must be “fully resolved” lockfiles (include transitive deps and pin exact versions). (hermetoproject.github.io)
Citations:
- 1: https://hermetoproject.github.io/hermeto/pip/
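The "fully resolved lockfile" expectation above can be approximated with a quick local check before handing files to the prefetch task — a rough sketch, not Hermeto's actual validation:

```shell
# Rough check: every effective line of a lockfile must pin an exact
# version with '=='. Comments, blank lines, and option/continuation
# lines (--hash, -r, ...) are ignored.
is_fully_resolved() {
  ! grep -Ev '^[[:space:]]*(#|-|$)' "$1" | grep -qv '=='
}

printf 'requests==2.32.3\nidna==3.10\n' > /tmp/lock.txt
printf 'requests>=2.0\n'               > /tmp/loose.txt

is_fully_resolved /tmp/lock.txt  && echo "lock.txt: fully pinned"
is_fully_resolved /tmp/loose.txt || echo "loose.txt: not fully pinned"
```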
Fix Hermeto prefetch schema: change py_version to integer and remove unsupported allow_binary.
The Hermeto pip binary schema requires py_version as an integer (e.g., 312 not "312"), and allow_binary is not a supported parameter for the bundler block.
Proposed fix
"binary": {
"packages": "accelerate,aiohappyeyeballs,aiohttp,aiosignal,aiosqlite,annotated-doc,annotated-types,anyio,asyncpg,attrs,beautifulsoup4,cffi,chardet,charset-normalizer,click,colorama,cryptography,dataclasses-json,defusedxml,distro,docling-ibm-models,einops,et-xmlfile,faiss-cpu,filetype,fire,frozenlist,googleapis-common-protos,greenlet,h11,hf-xet,httpcore,httpx,huggingface-hub,idna,jinja2,jiter,joblib,jsonlines,jsonref,jsonschema-specifications,latex2mathml,llama-stack-client,lxml,markdown-it-py,markupsafe,mdurl,mpire,mpmath,multidict,mypy-extensions,nest-asyncio,networkx,nltk,openpyxl,opentelemetry-api,opentelemetry-exporter-otlp-proto-common,opentelemetry-exporter-otlp-proto-http,opentelemetry-proto,opentelemetry-sdk,opentelemetry-semantic-conventions,packaging,pandas,pillow,platformdirs,pluggy,prompt-toolkit,propcache,psycopg2-binary,pyaml,pydantic,pydantic-core,pydantic-settings,pygments,pyjwt,pylatexenc,python-dateutil,python-docx,python-dotenv,python-multipart,python-pptx,pytz,pyyaml,referencing,requests,rich,rpds-py,rtree,safetensors,scikit-learn,scipy,semchunk,sentence-transformers,shapely,shellingham,six,sniffio,starlette,sympy,tabulate,tenacity,threadpoolctl,tiktoken,tokenizers,torch,torchvision,tqdm,transformers,triton,typer,typing-extensions,typing-inspect,typing-inspection,tzdata,wrapt,xlsxwriter,yarl,zipp,uv-build,uv,pip,maturin,opencv-python,rapidocr,sqlite-vec",
"os": "linux",
"arch": "x86_64,aarch64",
- "py_version": "312"
+ "py_version": 312
}
},
{
"type": "bundler",
- "allow_binary": "true"
}
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
- name: prefetch-input
  # no source available: torch, faiss-cpu
  # hermeto prefetch problems: uv, pip, jiter, tiktoken,
  # those need cmake to build: pyarrow
  # those need cargo to build: jiter, tiktoken, cryptography, fastuuid, hf_xet, maturin, pydantic_core, rpds_py, safetensors, tokenizers
  # to accelerate build: numpy, scipy, pandas, pillow, scikit_learn
  value: |
    [
      {
        "type": "rpm",
        "path": "."
      },
      {
        "type": "pip",
        "path": ".",
        "requirements_files": [
          "requirements.hashes.wheel.txt",
          "requirements.hashes.source.txt",
          "requirements.hermetic.txt"
        ],
        "requirements_build_files": ["requirements-build.txt"],
        "binary": {
          "packages": "accelerate,aiohappyeyeballs,aiohttp,aiosignal,aiosqlite,annotated-doc,annotated-types,anyio,asyncpg,attrs,beautifulsoup4,cffi,chardet,charset-normalizer,click,colorama,cryptography,dataclasses-json,defusedxml,distro,docling-ibm-models,einops,et-xmlfile,faiss-cpu,filetype,fire,frozenlist,googleapis-common-protos,greenlet,h11,hf-xet,httpcore,httpx,huggingface-hub,idna,jinja2,jiter,joblib,jsonlines,jsonref,jsonschema-specifications,latex2mathml,llama-stack-client,lxml,markdown-it-py,markupsafe,mdurl,mpire,mpmath,multidict,mypy-extensions,nest-asyncio,networkx,nltk,openpyxl,opentelemetry-api,opentelemetry-exporter-otlp-proto-common,opentelemetry-exporter-otlp-proto-http,opentelemetry-proto,opentelemetry-sdk,opentelemetry-semantic-conventions,packaging,pandas,pillow,platformdirs,pluggy,prompt-toolkit,propcache,psycopg2-binary,pyaml,pydantic,pydantic-core,pydantic-settings,pygments,pyjwt,pylatexenc,python-dateutil,python-docx,python-dotenv,python-multipart,python-pptx,pytz,pyyaml,referencing,requests,rich,rpds-py,rtree,safetensors,scikit-learn,scipy,semchunk,sentence-transformers,shapely,shellingham,six,sniffio,starlette,sympy,tabulate,tenacity,threadpoolctl,tiktoken,tokenizers,torch,torchvision,tqdm,transformers,triton,typer,typing-extensions,typing-inspect,typing-inspection,tzdata,wrapt,xlsxwriter,yarl,zipp,uv-build,uv,pip,maturin,opencv-python,rapidocr,sqlite-vec",
          "os": "linux",
          "arch": "x86_64,aarch64",
          "py_version": 312
        }
      },
      {
        "type": "bundler"
      }
🤖 Prompt for AI Agents
In @.tekton/rag-tool-pull-request.yaml around lines 38 - 69, The prefetch-input
pip binary schema uses a string for "py_version" and an unsupported bundler
field "allow_binary"; update the pip block's "py_version" value from the quoted
string to an integer (py_version: 312) and remove the "allow_binary" key from
the bundler block (refer to the "prefetch-input" object, the "py_version" field
inside the pip -> binary map, and the "bundler" object) so the schema matches
Hermeto's expected types.
fabb9d2 to eb1ecaf
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In `@scripts/konflux_requirements.sh`:
- Around line 71-77: The package extraction can still include index-url lines
when they have leading whitespace; update the wheel_packages pipeline to
robustly ignore commented and dash-prefixed lines by changing the grep to
exclude lines that start with optional whitespace followed by '#' or '-' (e.g.,
use grep -v '^[[:space:]]*[-#]'), keeping the rest of the pipeline (sed
's/==.*//' | tr '\n' ',' | sed 's/,$//') and then continue using wheel_packages,
EXTRA_WHEELS, PYPI_WHEELS and the two sed replacements that update
.tekton/rag-tool-pull-request.yaml and .tekton/rag-tool-push.yaml.
♻️ Duplicate comments (5)
scripts/konflux_requirements.sh (2)
83-86: Guard against empty PyPI wheel list before compiling hashes. If none of the PYPI_WHEELS entries are present in the compiled requirements, $WHEEL_FILE_PYPI will contain only comments. uv pip compile will fail with "No requirements specified," breaking the script under set -e.
🔧 Proposed fix
-uv pip compile "$WHEEL_FILE_PYPI" --refresh --generate-hashes --python-version 3.12 --emit-index-url --no-deps --no-annotate > "$WHEEL_HASH_FILE_PYPI"
+if grep -qEv '^[[:space:]]*($|#)' "$WHEEL_FILE_PYPI"; then
+  uv pip compile "$WHEEL_FILE_PYPI" --refresh --generate-hashes --python-version 3.12 --emit-index-url --no-deps --no-annotate > "$WHEEL_HASH_FILE_PYPI"
+else
+  echo "# No PyPI wheel requirements" > "$WHEEL_HASH_FILE_PYPI"
+fi
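The grep guard in the suggested fix can be exercised in isolation; a small sketch with illustrative temp files:

```shell
# grep -qEv succeeds only when some line is neither blank nor a comment,
# i.e. at least one real requirement line exists.
has_reqs() { grep -qEv '^[[:space:]]*($|#)' "$1"; }

printf '# only comments\n\n'  > /tmp/comments-only.txt
printf '# header\nfoo==1.0\n' > /tmp/with-reqs.txt

has_reqs /tmp/comments-only.txt || echo "comments-only.txt: skip compile"
has_reqs /tmp/with-reqs.txt     && echo "with-reqs.txt: run uv pip compile"
```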
89-90: Make the maturin pin resilient to non-numeric version suffixes. The current regex [0-9.]* only matches digits and dots. If the resolved version includes a suffix (e.g., .post1, rc1), the substitution will produce an invalid version string.
🔧 Proposed fix
-sed -i 's/maturin==[0-9.]*/maturin==1.10.2/' "$BUILD_FILE"
+sed -i 's/maturin==[^[:space:]]*/maturin==1.10.2/' "$BUILD_FILE"
pyproject.toml (1)
48-51: Stale comment: version pairing no longer matches code. Line 49 states "torch 2.5.x pairs with torchvision 0.20.x" but the actual pins are now torch>=2.8.0,<2.9.0 and torchvision>=0.23.0,<0.24.0. Update the comment to reflect the current pairing.
📝 Proposed fix
 # Pin torch/torchvision to versions available as CPU wheels
-# torch 2.5.x pairs with torchvision 0.20.x
+# torch 2.8.x pairs with torchvision 0.23.x
 "torch>=2.8.0,<2.9.0",
 "torchvision>=0.23.0,<0.24.0",
.tekton/rag-tool-push.yaml (1)
62-65: Remove unsupported allow_binary field from bundler configuration. The allow_binary field is not part of Hermeto's bundler package manager schema. This field should be removed to avoid prefetch failures.
🔧 Proposed fix
 {
   "type": "bundler",
-  "allow_binary": "true"
+  "path": "."
 }
Containerfile (1)
10-11: Quote the version spec to avoid shell redirection. The > character in uv>=0.7.20 will be interpreted by the shell as output redirection, creating a file named =0.7.20 and installing uv without version constraints.
🐛 Proposed fix
 # Install uv package manager
-RUN pip3.12 install uv>=0.7.20
+RUN pip3.12 install "uv>=0.7.20"
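The redirection pitfall is easy to reproduce in any POSIX shell (echo_args is an illustrative stand-in for pip):

```shell
# The shell splits `uv>=0.7.20` into `uv` plus a redirection to a file
# literally named `=0.7.20`, so the command never sees the constraint.
cd "$(mktemp -d)"
echo_args() { printf 'arg: %s\n' "$@"; }

echo_args uv>=0.7.20        # '>' redirects; the command only sees "uv"
cat '=0.7.20'               # stray file now holds: arg: uv
echo_args "uv>=0.7.20"      # quoted: the full spec is one argument
```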
🧹 Nitpick comments (4)
scripts/konflux_requirements.sh (2)
56-58: Package name extraction may fail for packages with extras. The sed pattern s/[=<>!].*// won't correctly extract the package name if it includes extras (e.g., package[extra]==1.0). The brackets would remain in the extracted name.
♻️ Proposed fix
- package_name=$(echo "$current_package" | sed 's/[=<>!].*//')
+ package_name=$(echo "$current_package" | sed 's/\[.*\]//; s/[=<>!].*//')
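The two-step sed from the suggested fix, tried on representative specs:

```shell
# Strip any [extras] suffix first, then cut at the first version operator.
package_name() { echo "$1" | sed 's/\[.*\]//; s/[=<>!].*//'; }

package_name 'package[extra]==1.0'    # -> package
package_name 'torch>=2.8.0,<2.9.0'    # -> torch
```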
96-99: Version counting regex misses versions with suffixes. The grep -Eo '==[0-9.]+' pattern won't match versions like ==1.0.post1 or ==2.0rc1, leading to inaccurate package counts in the summary output.
♻️ Proposed fix
-echo "Packages from pypi.org written to: $SOURCE_HASH_FILE ($( grep -Eo '==[0-9.]+' "$SOURCE_HASH_FILE" | wc -l) packages)"
-echo "Packages from console.redhat.com written to: $WHEEL_HASH_FILE ($(grep -Eo '==[0-9.]+' "$WHEEL_HASH_FILE" | wc -l) packages)"
-echo "Packages from pypi.org (wheels) written to: $WHEEL_HASH_FILE_PYPI ($(grep -Eo '==[0-9.]+' "$WHEEL_HASH_FILE_PYPI" | wc -l) packages)"
-echo "Build dependencies written to: $BUILD_FILE ($(grep -Eo '==[0-9.]+' "$BUILD_FILE" | wc -l) packages)"
+echo "Packages from pypi.org written to: $SOURCE_HASH_FILE ($( grep -Eo '==[^[:space:]]+' "$SOURCE_HASH_FILE" | wc -l) packages)"
+echo "Packages from console.redhat.com written to: $WHEEL_HASH_FILE ($(grep -Eo '==[^[:space:]]+' "$WHEEL_HASH_FILE" | wc -l) packages)"
+echo "Packages from pypi.org (wheels) written to: $WHEEL_HASH_FILE_PYPI ($(grep -Eo '==[^[:space:]]+' "$WHEEL_HASH_FILE_PYPI" | wc -l) packages)"
+echo "Build dependencies written to: $BUILD_FILE ($(grep -Eo '==[^[:space:]]+' "$BUILD_FILE" | wc -l) packages)"
.tekton/rag-tool-push.yaml (1)
27-29: Platform mismatch between build-platforms and binary.arch. build-platforms specifies only linux/x86_64, but binary.arch includes x86_64,aarch64. This could cause prefetch to download aarch64 wheels that won't be used during the build, increasing artifact size unnecessarily.
Consider either:
- Adding linux/aarch64 to build-platforms if multi-arch builds are intended
- Removing aarch64 from binary.arch if only x86_64 builds are needed
♻️ Option 2: Align arch with build-platforms
 "binary": {
   "packages": "accelerate,aiohappyeyeballs,...",
   "os": "linux",
-  "arch": "x86_64,aarch64",
+  "arch": "x86_64",
   "py_version": 312
 }
Also applies to: 55-59
requirements.hermetic.txt (1)
1-4: Inconsistent version pinning strategy.
uv-build, uv, and pip use exact pins (==), but ninja uses a lower bound (>=1.5). For fully reproducible hermetic builds, consider pinning ninja to an exact version.
♻️ Proposed fix
 uv-build==0.9.26
 uv==0.9.26
 pip==25.3
-ninja>=1.5
+ninja==1.13.0
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In `@requirements.hermetic.txt`:
- Line 4: Replace the loose dependency "ninja>=1.5" with an exact pinned version
to ensure hermetic reproducibility: update the requirement line "ninja>=1.5" to
"ninja==<exact-version>" (use the exact ninja version used by CI/builds or the
lockfile, e.g., "ninja==1.10.2") so the file pins a single immutable version.
e504787 to a0710e9
/retest
Actionable comments posted: 2
🤖 Fix all issues with AI agents
In @.tekton/rag-tool-pull-request.yaml:
- Around line 38-70: The prefetch-input value is currently a top-level JSON
array but the prefetch-dependencies-oci-ta task expects an object with a
"packages" array (and optional "flags"); update the value for the
"prefetch-input" entry so the existing array becomes the value of a "packages"
key (i.e., replace the top-level [...] with {"packages": [...]}), preserving all
current entries (rpm/pip/bundler objects and the pip binary packages list) and
add "flags" only if needed; ensure the resulting JSON object is valid for the
prefetch-dependencies-oci-ta task.
In @.tekton/rag-tool-push.yaml:
- Around line 34-66: The prefetch-input currently contains a top-level JSON
array but prefetch-dependencies-oci-ta expects an object with a required
"packages" array (and optional "flags"). Replace the value of prefetch-input so
that the existing array becomes the value of the "packages" key (i.e., set value
to { "packages": [ ...existing entries... ] }) and add any optional "flags" if
needed; update the block that defines prefetch-input in the YAML (the value
under the name prefetch-input) rather than other fields.
🧹 Nitpick comments (1)
Containerfile (1)
24-25: Consider using frozen mode for bundle install.

The current bundle install command doesn't prevent lockfile modifications. For hermetic builds, consider using BUNDLE_FROZEN=1 to ensure the lockfile isn't modified during build.

♻️ Suggested improvement
 # Install Ruby Gems
-RUN BUNDLE_PATH__SYSTEM=true bundle install
+RUN BUNDLE_FROZEN=1 BUNDLE_PATH__SYSTEM=true bundle install
- name: prefetch-input
  # no source available: torch, faiss-cpu
  # hermeto prefetch problems: uv, pip, jiter, tiktoken,
  # those need cmake to build: pyarrow
  # those need cargo to build: jiter, tiktoken, cryptography, fastuuid, hf_xet, maturin, pydantic_core, rpds_py, safetensors, tokenizers
  # to accelerate build: numpy, scipy, pandas, pillow, scikit_learn
  value: |
    [
      {
        "type": "rpm",
        "path": "."
      },
      {
        "type": "pip",
        "path": ".",
        "requirements_files": [
          "requirements.hashes.wheel.txt",
          "requirements.hashes.source.txt",
          "requirements.hermetic.txt"
        ],
        "requirements_build_files": ["requirements-build.txt"],
        "binary": {
          "packages": "accelerate,aiohappyeyeballs,aiohttp,aiosignal,aiosqlite,annotated-doc,annotated-types,anyio,asyncpg,attrs,beautifulsoup4,cffi,chardet,charset-normalizer,click,colorama,dataclasses-json,defusedxml,distro,et-xmlfile,faiss-cpu,filetype,fire,frozenlist,googleapis-common-protos,h11,hf-xet,httpcore,httpx,idna,jinja2,jiter,joblib,jsonlines,jsonref,jsonschema-specifications,latex2mathml,lxml,markdown-it-py,markupsafe,mdurl,mpire,mpmath,mypy-extensions,nest-asyncio,networkx,nltk,numpy,openpyxl,opentelemetry-api,opentelemetry-exporter-otlp-proto-common,opentelemetry-exporter-otlp-proto-http,opentelemetry-proto,opentelemetry-sdk,opentelemetry-semantic-conventions,packaging,pandas,pillow,platformdirs,pluggy,prompt-toolkit,propcache,psycopg2-binary,pyaml,pydantic,pydantic-core,pydantic-settings,pygments,pylatexenc,python-dateutil,python-docx,python-dotenv,python-pptx,pytz,pyyaml,referencing,requests,rpds-py,rtree,safetensors,scikit-learn,scipy,semchunk,setuptools,shapely,shellingham,six,sniffio,starlette,sympy,tabulate,threadpoolctl,tiktoken,tokenizers,torch,torchvision,tornado,transformers,triton,typing-extensions,typing-inspect,typing-inspection,tzdata,wrapt,xlsxwriter,yarl,zipp,uv-build,uv,pip,maturin,opencv-python,rapidocr,sqlite-vec",
          "os": "linux",
          "arch": "x86_64,aarch64",
          "py_version": 312
        }
      },
      {
        "type": "bundler",
        "allow_binary": "true"
      }
    ]
🧩 Analysis chain
🏁 Script executed:
# First, let's explore the repository structure
git ls-files | grep -E "(tekton|prefetch)" | head -20

Repository: lightspeed-core/rag-content
Length of output: 173
🏁 Script executed:
# Search for prefetch task definitions
find . -type f -name "*.yaml" -o -name "*.yml" | xargs grep -l "prefetch-dependencies-oci-ta" 2>/dev/null | head -10

Repository: lightspeed-core/rag-content
Length of output: 136
🏁 Script executed:
# Look for the actual task definition
find . -type f \( -name "*.yaml" -o -name "*.yml" \) | xargs grep -l "kind: Task" 2>/dev/null | grep -i prefetch

Repository: lightspeed-core/rag-content
Length of output: 53
🏁 Script executed:
# Search more broadly for task definitions
find . -type f \( -name "*.yaml" -o -name "*.yml" \) | xargs grep -B5 -A10 "prefetch" 2>/dev/null | head -100

Repository: lightspeed-core/rag-content
Length of output: 6773
🌐 Web query:
prefetch-dependencies-oci-ta task input schema packages Konflux Tekton
💡 Result:
Web search failed: Server error: no LLM provider could handle the message
🏁 Script executed:
cat -n scripts/generate_packages_to_prefetch.py

Repository: lightspeed-core/rag-content
Length of output: 8324
🏁 Script executed:
# Also check if there are any references to prefetch-input structure in the codebase
rg -A5 -B5 "prefetch-input" --type yaml

Repository: lightspeed-core/rag-content
Length of output: 53
🏁 Script executed:
# Look for any documentation or comments about the prefetch input format
rg "packages.*rpm.*pip" --type yaml -A3 -B3

Repository: lightspeed-core/rag-content
Length of output: 53
🏁 Script executed:
cat -n .tekton/rag-tool-push.yaml | grep -A50 "prefetch-input"

Repository: lightspeed-core/rag-content
Length of output: 11896
🏁 Script executed:
# Search for any documentation about this task in the repository
find . -type f \( -name "*.md" -o -name "*.txt" -o -name "*.rst" \) | xargs grep -i "prefetch" 2>/dev/null

Repository: lightspeed-core/rag-content
Length of output: 123
🏁 Script executed:
# Check the Tekton task bundle to understand the input parameter
# Let's search for any comments or docs about the input schema in the yaml files
rg "input" .tekton/ -A2 -B2 --type yaml | grep -A5 -B5 "prefetch"

Repository: lightspeed-core/rag-content
Length of output: 4729
🌐 Web query:
Konflux CI task-prefetch-dependencies-oci-ta input schema documentation
💡 Result:
Konflux’s task-prefetch-dependencies-oci-ta “input” (often surfaced at the Pipeline level as the prefetch-input param) is a JSON string that’s passed through to the dependency-prefetcher used by the task (historically cachi2, now documented as Hermeto). You won’t usually find a separate “input schema” page for the Tekton Task itself; the schema is defined by Hermeto’s “fetch-deps JSON input” formats. [1]
What the JSON looks like (as used by Konflux)
An example captured from a real Konflux prefetch-dependencies invocation shows the task parameter named input containing JSON like: {"packages":[{"path":"yq","type":"gomod"},{"type":"rpm"}],"flags":["gomod-vendor"]}. [2]
Where the input schema is documented
Hermeto documents the JSON shape per package manager; each entry is essentially an object with at least:
- type: package manager (e.g., gomod, yarn, rpm, pip, etc.)
- optional path: relative path (defaults to "." for many managers)
Examples:
- Go modules (gomod) JSON input format: { "type": "gomod", "path": "." }. [1]
- Yarn JSON input format: { "type": "yarn", "path": "." }. [3]
Practical tip (to confirm the exact param name/shape for your bundle version)
Konflux documents using tkn bundle list -o yaml <bundle-ref> to inspect the Task definition embedded in the OCI bundle your pipeline references, which is the most reliable way to see the exact parameter names for that released task version. [4]
Sources: Hermeto docs (gomod/yarn) [1][3]; Red Hat Konflux provenance example showing prefetch-dependencies input JSON [2]; Konflux docs showing how to inspect bundle-embedded Tasks [4].
Wrap prefetch-input value in a JSON object with packages key.
The prefetch-dependencies-oci-ta task expects the input parameter to be a JSON object with a packages array and optional flags array (e.g., {"packages": [...], "flags": [...]}). The current top-level array will be rejected by Hermeto. Update the value to:
Required wrapper
value: |
- [
+ {
+ "packages": [
{
"type": "rpm",
"path": "."
},
...
- ]
+ ]
+ }

🤖 Prompt for AI Agents
In @.tekton/rag-tool-pull-request.yaml around lines 38 - 70, The prefetch-input
value is currently a top-level JSON array but the prefetch-dependencies-oci-ta
task expects an object with a "packages" array (and optional "flags"); update
the value for the "prefetch-input" entry so the existing array becomes the value
of a "packages" key (i.e., replace the top-level [...] with {"packages":
[...]}), preserving all current entries (rpm/pip/bundler objects and the pip
binary packages list) and add "flags" only if needed; ensure the resulting JSON
object is valid for the prefetch-dependencies-oci-ta task.
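The mechanical part of that fix can be scripted rather than edited by hand. A minimal sketch, assuming python3 is available; the input below is a trimmed stand-in for the real prefetch-input value, not the full package list:

```shell
# Write a stand-in prefetch-input that is a bare top-level array.
cat > /tmp/prefetch-input.json <<'EOF'
[
  {"type": "rpm", "path": "."},
  {"type": "bundler", "allow_binary": "true"}
]
EOF

# Rewrap the array into the object form Hermeto expects: {"packages": [...]}.
python3 - <<'EOF'
import json

path = "/tmp/prefetch-input.json"
with open(path) as f:
    data = json.load(f)

if isinstance(data, list):          # only rewrap if it is still a bare array
    data = {"packages": data}

with open(path, "w") as f:
    json.dump(data, f, indent=2)
EOF

# The top level is now an object with a single "packages" key.
python3 -c 'import json; print(list(json.load(open("/tmp/prefetch-input.json"))))'  # → ['packages']
```

Running the rewrap twice is safe, since a dict input is left untouched; the "flags" key can be added to the same object later if any flags are needed.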
- name: prefetch-input
  # no source available: torch, faiss-cpu
  # hermeto prefetch problems: uv, pip, jiter, tiktoken,
  # those need cmake to build: pyarrow
  # those need cargo to build: jiter, tiktoken, cryptography, fastuuid, hf_xet, maturin, pydantic_core, rpds_py, safetensors, tokenizers
  # to accelerate build: numpy, scipy, pandas, pillow, scikit_learn
  value: |
    [
      {
        "type": "rpm",
        "path": "."
      },
      {
        "type": "pip",
        "path": ".",
        "requirements_files": [
          "requirements.hashes.wheel.txt",
          "requirements.hashes.source.txt",
          "requirements.hermetic.txt"
        ],
        "requirements_build_files": ["requirements-build.txt"],
        "binary": {
          "packages": "accelerate,aiohappyeyeballs,aiohttp,aiosignal,aiosqlite,annotated-doc,annotated-types,anyio,asyncpg,attrs,beautifulsoup4,cffi,chardet,charset-normalizer,click,colorama,dataclasses-json,defusedxml,distro,et-xmlfile,faiss-cpu,filetype,fire,frozenlist,googleapis-common-protos,h11,hf-xet,httpcore,httpx,idna,jinja2,jiter,joblib,jsonlines,jsonref,jsonschema-specifications,latex2mathml,lxml,markdown-it-py,markupsafe,mdurl,mpire,mpmath,mypy-extensions,nest-asyncio,networkx,nltk,numpy,openpyxl,opentelemetry-api,opentelemetry-exporter-otlp-proto-common,opentelemetry-exporter-otlp-proto-http,opentelemetry-proto,opentelemetry-sdk,opentelemetry-semantic-conventions,packaging,pandas,pillow,platformdirs,pluggy,prompt-toolkit,propcache,psycopg2-binary,pyaml,pydantic,pydantic-core,pydantic-settings,pygments,pylatexenc,python-dateutil,python-docx,python-dotenv,python-pptx,pytz,pyyaml,referencing,requests,rpds-py,rtree,safetensors,scikit-learn,scipy,semchunk,setuptools,shapely,shellingham,six,sniffio,starlette,sympy,tabulate,threadpoolctl,tiktoken,tokenizers,torch,torchvision,tornado,transformers,triton,typing-extensions,typing-inspect,typing-inspection,tzdata,wrapt,xlsxwriter,yarl,zipp,uv-build,uv,pip,maturin,opencv-python,rapidocr,sqlite-vec",
          "os": "linux",
          "arch": "x86_64,aarch64",
          "py_version": 312
        }
      },
      {
        "type": "bundler",
        "allow_binary": "true"
      }
    ]
🧩 Analysis chain
🌐 Web query:
Tekton prefetch-dependencies-oci-ta input schema packages
💡 Result:
In Konflux’s Tekton task prefetch-dependencies-oci-ta, the input you pass (usually via the Pipeline param prefetch-input → Task param input) is a JSON string describing what Cachi2 should prefetch. (gist.github.com)
input JSON shape (what you’re calling “schema”)
{
"packages": [
{
"type": "<package-manager>",
"path": "<subdir-optional>"
}
],
"flags": ["<flag-1>", "<flag-2>"]
}

packages (required)
- Array of objects, each describing one package-set to prefetch. (developers.redhat.com)
- Fields:
- type (required): package-manager identifier (examples seen in the wild: gomod, rpm). (developers.redhat.com)
- path (optional): subdirectory within the source repo for that package manager (used when you have multiple packages/projects in one repo). (developers.redhat.com)
flags (optional)
- Array of strings with additional behavior toggles (example seen: gomod-vendor). (developers.redhat.com)
Example (from a real build)
{
"packages": [
{ "path": "yq", "type": "gomod" },
{ "type": "rpm" }
],
"flags": ["gomod-vendor"]
}
([developers.redhat.com](https://developers.redhat.com/articles/2025/05/15/how-we-use-software-provenance-red-hat?utm_source=openai))
Citations:
- 1: https://gist.github.com/brianwcook/1c9aae1c096cce81f16101af1086506c?utm_source=openai
- 2: https://developers.redhat.com/articles/2025/05/15/how-we-use-software-provenance-red-hat?utm_source=openai
- 3: https://developers.redhat.com/articles/2025/05/15/how-we-use-software-provenance-red-hat?utm_source=openai
- 4: https://developers.redhat.com/articles/2025/05/15/how-we-use-software-provenance-red-hat?utm_source=openai
- 5: https://developers.redhat.com/articles/2025/05/15/how-we-use-software-provenance-red-hat?utm_source=openai
- 6: https://developers.redhat.com/articles/2025/05/15/how-we-use-software-provenance-red-hat?utm_source=openai
---
🏁 Script executed:
# Search for references to prefetch task in the repository
rg -i "prefetch" --type yaml -B 5 -A 5 | head -100

Repository: lightspeed-core/rag-content
Length of output: 53
🏁 Script executed:
# Look for task definitions or documentation files
fd -e yaml -e md -e txt | xargs rg -l "prefetch-dependencies" 2>/dev/null | head -20

Repository: lightspeed-core/rag-content
Length of output: 53
🏁 Script executed:
# Verify the actual current structure in the yaml file
cat -n .tekton/rag-tool-push.yaml | sed -n '34,70p'

Repository: lightspeed-core/rag-content
Length of output: 2696
Wrap prefetch-input value in JSON object with packages key.
The prefetch-dependencies-oci-ta task expects input as a JSON object with required packages array and optional flags array. The current top-level array will be rejected. Wrap the array in an object as shown.
Proposed fix
value: |
- [
+ {
+ "packages": [
{
"type": "rpm",
"path": "."
},
{
"type": "pip",
"path": ".",
"requirements_files": [
"requirements.hashes.wheel.txt",
"requirements.hashes.source.txt",
"requirements.hermetic.txt"
],
"requirements_build_files": ["requirements-build.txt"],
"binary": {
"packages": "...",
"os": "linux",
"arch": "x86_64,aarch64",
"py_version": 312
}
},
{
"type": "bundler",
"allow_binary": "true"
}
- ]
+ ]
+ }

🤖 Prompt for AI Agents
In @.tekton/rag-tool-push.yaml around lines 34 - 66, The prefetch-input
currently contains a top-level JSON array but prefetch-dependencies-oci-ta
expects an object with a required "packages" array (and optional "flags").
Replace the value of prefetch-input so that the existing array becomes the value
of the "packages" key (i.e., set value to { "packages": [ ...existing entries...
] }) and add any optional "flags" if needed; update the block that defines
prefetch-input in the YAML (the value under the name prefetch-input) rather than
other fields.
Actionable comments posted: 2
🤖 Fix all issues with AI agents
In `@Containerfile`:
- Around line 36-45: The Cachi2 branch of the Containerfile install omits the
PyPI wheel hash file, causing required wheel-only packages to be missing; update
the Dockerfile COPY that brings hash files into the image to also copy
requirements.hashes.wheel.pypi.txt, and then update the Cachi2 pip install
invocation (the command that runs after ". .venv/bin/activate" and currently
lists "-r requirements.hashes.wheel.txt -r requirements.hashes.source.txt") to
include "-r requirements.hashes.wheel.pypi.txt" so pip installs those PyPI wheel
packages during the hermetic Cachi2 install.
In `@rpms.in.yaml`:
- Around line 1-11: The packages list in rpms.in.yaml is missing
rubygem-bundler; update the packages array (the packages: [...] block) to
include "rubygem-bundler" alongside the existing entries (gcc, cmake, cargo,
rubygems, python3.12, python3.12-devel, python3.12-pip, git) so the manifest
matches the Containerfile and generated rpms.lock.yaml and ensures a hermetic
build.
c91db05 to 15501eb
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In `@redhat.repo`:
- Around line 1-69: The repo file uses the non-standard option enabled_metadata
(seen in sections like [codeready-builder-for-rhel-9-$basearch-eus-rpms],
[rhel-9-for-$basearch-appstream-eus-rpms], etc.); confirm that the consumer
toolchain (libdnf5 or any custom parser) intentionally requires/accepts
enabled_metadata and if not, remove or replace it with a supported option; also
ensure the file ends with a single trailing newline character to satisfy POSIX
tools and linters.
🧹 Nitpick comments (2)
Gemfile (1)
1-3: Consider pinning the asciidoctor gem version for reproducibility.

For hermetic builds, pinning gem versions in the Gemfile (e.g., gem "asciidoctor", "~> 2.0") provides an additional layer of defense alongside Gemfile.lock. This ensures the dependency range is documented even if the lockfile is regenerated.

Also, the file is missing a trailing newline.
📝 Suggested improvement
 source "https://rubygems.org"
-gem "asciidoctor"
+gem "asciidoctor", "~> 2.0"
+

requirements.hermetic.txt (1)
1-4: Missing trailing newline.

The file should end with a newline character for POSIX compliance.
📝 Suggested fix
 uv-build==0.9.26
 uv==0.9.26
 pip==25.3
-ninja==1.13
+ninja==1.13.0
+
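The newline fix itself is a one-liner that can run in a pre-commit hook. A sketch, using a temporary file as a stand-in for the real requirements file:

```shell
# Create a stand-in file that deliberately lacks a trailing newline.
f=$(mktemp)
printf 'uv==0.9.26\npip==25.3\nninja==1.13.0' > "$f"

# $(tail -c 1 "$f") is empty when the last byte is a newline, because command
# substitution strips trailing newlines; non-empty means the newline is missing.
if [ -n "$(tail -c 1 "$f")" ]; then
  echo >> "$f"
fi

[ -z "$(tail -c 1 "$f")" ] && echo "file now ends with a newline"
```

Note the check treats an empty file as already compliant, which is the usual convention for this kind of lint.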
[codeready-builder-for-rhel-9-$basearch-eus-rpms]
name = Red Hat CodeReady Linux Builder for RHEL 9 $basearch - Extended Update Support (RPMs)
baseurl = https://cdn.redhat.com/content/eus/rhel9/9.6/$basearch/codeready-builder/os
enabled = 1
gpgcheck = 1
gpgkey = file:///etc/pki/rpm-gpg/RPM-GPG-KEY-redhat-release
sslverify = 1
sslcacert = /etc/rhsm/ca/redhat-uep.pem
sslverifystatus = 1
metadata_expire = 86400
enabled_metadata = 0
sslclientkey = $SSL_CLIENT_KEY
sslclientcert = $SSL_CLIENT_CERT

[rhel-9-for-$basearch-appstream-eus-rpms]
name = Red Hat Enterprise Linux 9 for $basearch - AppStream - Extended Update Support (RPMs)
baseurl = https://cdn.redhat.com/content/eus/rhel9/9.6/$basearch/appstream/os
enabled = 1
gpgcheck = 1
gpgkey = file:///etc/pki/rpm-gpg/RPM-GPG-KEY-redhat-release
sslverify = 1
sslcacert = /etc/rhsm/ca/redhat-uep.pem
sslverifystatus = 1
metadata_expire = 86400
enabled_metadata = 0
sslclientkey = $SSL_CLIENT_KEY
sslclientcert = $SSL_CLIENT_CERT

[rhel-9-for-$basearch-baseos-eus-rpms]
name = Red Hat Enterprise Linux 9 for $basearch - BaseOS - Extended Update Support (RPMs)
baseurl = https://cdn.redhat.com/content/eus/rhel9/9.6/$basearch/baseos/os
enabled = 1
gpgcheck = 1
gpgkey = file:///etc/pki/rpm-gpg/RPM-GPG-KEY-redhat-release
sslverify = 1
sslcacert = /etc/rhsm/ca/redhat-uep.pem
sslverifystatus = 1
metadata_expire = 86400
enabled_metadata = 0
sslclientkey = $SSL_CLIENT_KEY
sslclientcert = $SSL_CLIENT_CERT

[rhocp-4.17-for-rhel-9-$basearch-rpms]
name = Red Hat OpenShift Container Platform 4.17 for RHEL 9 $basearch (RPMs)
baseurl = https://cdn.redhat.com/content/dist/layered/rhel9/$basearch/rhocp/4.17/os
enabled = 0
gpgcheck = 1
gpgkey = file:///etc/pki/rpm-gpg/RPM-GPG-KEY-redhat-release
sslverify = 1
sslcacert = /etc/rhsm/ca/redhat-uep.pem
sslverifystatus = 1
metadata_expire = 86400
enabled_metadata = 0
sslclientkey = $SSL_CLIENT_KEY
sslclientcert = $SSL_CLIENT_CERT

[rhocp-4.17-for-rhel-9-$basearch-source-rpms]
name = Red Hat OpenShift Container Platform 4.17 for RHEL 9 $basearch (Source RPMs)
baseurl = https://cdn.redhat.com/content/dist/layered/rhel9/$basearch/rhocp/4.17/source/SRPMS
enabled = 0
gpgcheck = 1
gpgkey = file:///etc/pki/rpm-gpg/RPM-GPG-KEY-redhat-release
sslverify = 1
sslcacert = /etc/rhsm/ca/redhat-uep.pem
sslverifystatus = 1
metadata_expire = 86400
enabled_metadata = 0
sslclientkey = $SSL_CLIENT_KEY
sslclientcert = $SSL_CLIENT_CERT
🧩 Analysis chain
🌐 Web query:
enabled_metadata yum dnf repository configuration option
💡 Result:
enabled_metadata is a non-standard key you may see in /etc/yum.repos.d/*.repo entries. It is primarily used by PackageKit / GUI software centers (e.g., GNOME Software) to treat a repo as “metadata-enabled” even when it is not enabled as a package source (enabled=0). This allows apps to be discoverable/searchable via metadata without actually enabling the repo for installs/updates. [1]
DNF/YUM repo configuration documentation does not list enabled_metadata as a supported repo option (so it is typically ignored by DNF/YUM, aside from possible “unknown option” warnings depending on version), but libdnf5 does have an enabled_metadata repo config option in its API (get_enabled_metadata_option). [2]
Example usage (as commonly documented in Fedora-related guidance):
[myrepo]
baseurl=https://example.com/repo
enabled=0
enabled_metadata=1

[1] (fedoraproject.org)
[2] (dnf5.readthedocs.io)
Citations:
- 1: https://fedoraproject.org/wiki/Changes/Third_Party_Software_Mechanism?utm_source=openai
- 2: https://dnf5.readthedocs.io/en/latest/api/c%2B%2B/libdnf5_xdg.html
🌐 Web query:
POSIX trailing newline requirement text files standard practice
💡 Result:
Under POSIX terminology, a “line” is defined as “a sequence of zero or more non-newline characters plus a terminating newline,” and a “text file” is a file containing characters organized into zero or more lines—so a non-empty POSIX “text file” is expected to have its last line newline-terminated (otherwise the file ends with an “incomplete line”). Missing the final newline therefore makes the last record not a POSIX line, strictly speaking. [1][2]
Standard practice: still end text files with a trailing newline because many Unix/POSIX-oriented tools are line-based and assume newline-terminated records; omitting it can cause annoyances like prompts/diffs/concatenation output appearing on the same line or tools needing special-case handling of the last “line.” [1][3]
Practical note: some POSIX utilities/editors explicitly tolerate an “incomplete last line” and may treat it as if it had a trailing newline (e.g., historical ex/vi behavior discussed in POSIX contexts), but that’s a compatibility concession rather than the ideal format. [2]
Sources: [1] https://stackoverflow.com/questions/729692/why-should-text-files-end-with-a-newline
[2] https://unix.stackexchange.com/questions/263869/does-vi-silently-add-a-newline-lf-at-the-end-of-file
[3] microsoft/vscode#141169
LGTM!
The RHEL 9 EUS repository configuration provides appropriate security settings with GPG verification and SSL client authentication. The EUS repos pinned to 9.6 ensure stability for hermetic builds.
Minor observations:
- The enabled_metadata = 0 option is non-standard for yum/dnf repo files. While recognized by libdnf5, DNF/YUM documentation does not list it as a supported option. Verify this is intentional and supported by the tooling consuming this file.
- Add a trailing newline at the end of the file for POSIX compliance.
🤖 Prompt for AI Agents
In `@redhat.repo` around lines 1 - 69, The repo file uses the non-standard option
enabled_metadata (seen in sections like
[codeready-builder-for-rhel-9-$basearch-eus-rpms],
[rhel-9-for-$basearch-appstream-eus-rpms], etc.); confirm that the consumer
toolchain (libdnf5 or any custom parser) intentionally requires/accepts
enabled_metadata and if not, remove or replace it with a supported option; also
ensure the file ends with a single trailing newline character to satisfy POSIX
tools and linters.
c81df8b to 03e479e
Actionable comments posted: 3
🤖 Fix all issues with AI agents
In `@Containerfile`:
- Around line 52-59: Fix the shell conditional typo in the Dockerfile RUN
command: replace the incorrect token "f" with "if" at the start of the
conditional (the RUN block that begins "f [ -f /cachi2/cachi2.env ]; then \").
Ensure the conditional reads "if [ -f /cachi2/cachi2.env ]; then ..." so the
shell executes the if-statement and the subsequent mkdir/cp vs. python
download_embeddings_model.py branches correctly.
- Around line 53-54: The hermetic build currently only copies a single
safetensors file (the cp /cachi2/output/deps/generic/model.safetensors step)
which omits required sentence-transformers files; update the hermetic path to
copy the full model directory or explicitly include config.json, tokenizer.json,
tokenizer_config.json, special_tokens_map.json, the 2_Normalize/ directory and
version.txt (e.g., replace the single-file cp with a recursive copy of
/cachi2/output/deps/generic/* into embeddings_model/), or ensure the Cachi2
artifact is produced with the complete model layout so loading (vs. the
snapshot_download() non-hermetic path) succeeds at runtime.
In `@requirements.hermetic.txt`:
- Around line 1-4: The file requirements.hermetic.txt is missing a trailing
newline; open requirements.hermetic.txt and add a single newline character at
the end of the file (after the last line "ninja==1.13"), ensuring you don't
alter the pinned entries (uv-build==0.9.26, uv==0.9.26, pip==25.3, ninja==1.13)
or introduce extra whitespace.
🧹 Nitpick comments (1)
Containerfile (1)
24-25: Consider adding BUNDLE_FROZEN=1 for reproducible builds.

BUNDLE_PATH__SYSTEM=true controls the install location but doesn't prevent Gemfile.lock modification. Adding BUNDLE_FROZEN=1 ensures bundle install fails if the lockfile would change, which is important for reproducibility, especially in the hermetic build path where the lockfile should be authoritative.

♻️ Suggested improvement
-RUN BUNDLE_PATH__SYSTEM=true bundle install
+RUN BUNDLE_PATH__SYSTEM=true BUNDLE_FROZEN=1 bundle install
uv-build==0.9.26
uv==0.9.26
pip==25.3
ninja==1.13 (no newline at end of file)
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Check if each pinned version exists on PyPI
for pkg_ver in "uv-build/0.9.26" "uv/0.9.26" "pip/25.3" "ninja/1.13"; do
pkg=$(echo "$pkg_ver" | cut -d'/' -f1)
ver=$(echo "$pkg_ver" | cut -d'/' -f2)
status=$(curl -s -o /dev/null -w "%{http_code}" "https://pypi.org/pypi/${pkg}/${ver}/json")
echo "${pkg}==${ver} -> HTTP ${status}"
done
# Check if file has trailing newline
if [ -f "requirements.hermetic.txt" ]; then
echo "---"
if [ -z "$(tail -c 1 requirements.hermetic.txt)" ]; then
echo "requirements.hermetic.txt: Has trailing newline"
else
echo "requirements.hermetic.txt: MISSING trailing newline"
fi
fi

Repository: lightspeed-core/rag-content
Length of output: 224
Add a trailing newline to the file.
The file is missing a trailing newline, which can cause issues with POSIX tools and file concatenation. All pinned versions (uv-build==0.9.26, uv==0.9.26, pip==25.3, ninja==1.13) are valid published releases on PyPI.
🤖 Prompt for AI Agents
In `@requirements.hermetic.txt` around lines 1 - 4, The file
requirements.hermetic.txt is missing a trailing newline; open
requirements.hermetic.txt and add a single newline character at the end of the
file (after the last line "ninja==1.13"), ensuring you don't alter the pinned
entries (uv-build==0.9.26, uv==0.9.26, pip==25.3, ninja==1.13) or introduce
extra whitespace.
Actionable comments posted: 4
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
.tekton/rag-tool-push.yaml (1)
27-29: ⚠️ Potential issue | 🟡 Minor

Prefetch fetches aarch64 wheels but push pipeline only builds x86_64.
The push pipeline's build-platforms is linux/x86_64 only (line 29), yet the pip binary.arch field at line 63 requests wheels for both x86_64,aarch64. This will prefetch arm64 wheels that are never used in this pipeline, wasting build time and storage. Either add linux-c6gd2xlarge/arm64 to build-platforms (as done in the PR pipeline) or narrow arch to x86_64 here.

Also applies to: 60-64
🤖 Fix all issues with AI agents
In `@artifacts.lock.yaml`:
- Around line 4-7: The artifacts.lock currently only includes
"model.safetensors" so the hermetic build will miss other files needed by
sentence-transformers; update the artifacts list to add entries for every
required file from the sentence-transformers/all-mpnet-base-v2 repo (e.g.,
config.json, tokenizer.json, tokenizer_config.json, special_tokens_map.json,
vocab files or vocab.json, merges.txt if present, and any model index files),
making each a separate artifact entry with its own download_url and sha256
checksum (use the same download URL pattern and compute checksums), and ensure
you preserve the existing filename fields; run the suggested shell checks to
enumerate the repo files and verify you included each required filename in
artifacts with correct checksums.
In `@Containerfile`:
- Around line 24-25: The Dockerfile currently runs "RUN BUNDLE_PATH__SYSTEM=true
bundle install" unconditionally which will fail in hermetic builds; modify the
Dockerfile to either move that RUN line inside the same Cachi2 conditional that
protects other Cachi2 steps (so bundle install only runs when Cachi2 is enabled)
or ensure Cachi2 prefetching provides the gem cache before this RUN by adding
the bundle cache/prefetch step and using that cache as the source for "bundle
install" (reference the RUN BUNDLE_PATH__SYSTEM=true bundle install step and the
Cachi2-related conditional to locate where to change).
In `@pyproject.toml`:
- Around line 91-92: The pyproject.toml currently pins "pip==24.3.1" which is
outdated; either update this dependency to the latest stable pip (replace
"pip==24.3.1" with the current version) or add a clear comment next to the
"pip==24.3.1" entry explaining why that exact pip version is required for the
hermetic build (e.g., known compatibility, reproducibility, or security
exemption); ensure the change is made in pyproject.toml where the "pip==24.3.1"
entry appears and keep the "pybuild-deps" line untouched unless it also needs
aligning.
In `@scripts/konflux_requirements.sh`:
- Around line 24-33: The current uv pip compile invocation (the command writing
to "$RAW_REQ_FILE" with --python-platform x86_64-unknown-linux-gnu) resolves
dependencies for x86_64 only, missing aarch64-specific packages; either add
--universal to this initial resolution command to include all platform variants,
or run the resolution step twice (once with --python-platform
x86_64-unknown-linux-gnu and once with --python-platform
aarch64-unknown-linux-gnu) and merge the results into "$RAW_REQ_FILE"; also
apply the same --universal or per-platform approach to the later wheel/hash
compilation steps (the wheel hash compilation referenced by the --universal flag
on the later block) so platform-specific wheels are discovered.
🧹 Nitpick comments (2)
pyproject.toml (1)
48-53: Comment-dependency separation reduces readability.

The torch/torchvision pairing comment (lines 48–49) is separated from the actual torch and torchvision pins (lines 52–53) by the unrelated llama-stack entries. Move the llama-stack entries above the comment block, or move torch/torchvision directly after the comment for better clarity.

📝 Suggested reorder
"llama-index-vector-stores-postgres>=0.5.4", + "llama-stack==0.4.3", + "llama-stack-client==0.4.3", # Pin torch/torchvision to versions available as CPU wheels # torch 2.8.x pairs with torchvision 0.23.x - "llama-stack==0.4.3", - "llama-stack-client==0.4.3", "torch>=2.8.0,<2.9.0", "torchvision>=0.23.0,<0.24.0",.tekton/rag-tool-pull-request.yaml (1)
36-37: Consider disabling build-source-image for the pull-request pipeline.

Building a source image on every PR adds significant overhead to the build cycle. Source images are typically only necessary for release/push builds (provenance). The push pipeline already has this set to true. Setting it to false here would speed up PR feedback.

Suggested change

  - name: build-source-image
-   value: 'true'
+   value: 'false'
artifacts:
- download_url: "https://huggingface.co/sentence-transformers/all-mpnet-base-v2/resolve/main/model.safetensors"
  checksum: "sha256:78c0197b6159d92658e319bc1d72e4c73a9a03dd03815e70e555c5ef05615658"
  filename: "model.safetensors"
Only model.safetensors is listed — additional model files are needed for sentence-transformers.
This artifact lock drives what Cachi2 prefetches. A sentence-transformers model requires additional files to load at runtime (e.g., config.json, tokenizer.json, tokenizer_config.json, special_tokens_map.json, etc.). Without them, the hermetic build will produce a broken image.
Add entries for each required model file from the sentence-transformers/all-mpnet-base-v2 repository, each with its own download_url and checksum.
Run the following to identify which files the model repository contains:
#!/bin/bash
# Check what files the sentence-transformers/all-mpnet-base-v2 model requires
# by looking at how the non-hermetic path uses it
rg -n "all-mpnet-base-v2" -C 3
echo "---"
# Check the download script to understand what files are expected
fd -t f "download_embeddings_model" --exec cat {}

🤖 Prompt for AI Agents
In `@artifacts.lock.yaml` around lines 4 - 7, The artifacts.lock currently only
includes "model.safetensors" so the hermetic build will miss other files needed
by sentence-transformers; update the artifacts list to add entries for every
required file from the sentence-transformers/all-mpnet-base-v2 repo (e.g.,
config.json, tokenizer.json, tokenizer_config.json, special_tokens_map.json,
vocab files or vocab.json, merges.txt if present, and any model index files),
making each a separate artifact entry with its own download_url and sha256
checksum (use the same download URL pattern and compute checksums), and ensure
you preserve the existing filename fields; run the suggested shell checks to
enumerate the repo files and verify you included each required filename in
artifacts with correct checksums.
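A sketch of how the extra lock entries could be generated once the model files have been downloaded locally. The helper name, the demo file, and the file list are assumptions for illustration, not part of this PR:

```shell
# Sketch: emit one artifacts.lock.yaml entry per locally-downloaded model file.
# BASE_URL mirrors the existing entry's URL pattern; emit_entry is hypothetical.
BASE_URL="https://huggingface.co/sentence-transformers/all-mpnet-base-v2/resolve/main"

emit_entry() {
  # Compute the local file's sha256 and print a lock entry in the existing format.
  sum=$(sha256sum "$1" | awk '{print $1}')
  printf -- '- download_url: "%s/%s"\n  checksum: "sha256:%s"\n  filename: "%s"\n' \
    "$BASE_URL" "$1" "$sum" "$1"
}

printf 'demo' > config.json   # stand-in content for demonstration only
entry=$(emit_entry config.json)
echo "$entry"
```

In practice one would run this over each file the model repo actually contains (config.json, tokenizer.json, etc.) and paste the output into artifacts.lock.yaml.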
# Install Ruby Gems
RUN BUNDLE_PATH__SYSTEM=true bundle install
bundle install runs unconditionally — will fail in hermetic builds without network.
Line 25 runs bundle install before the Cachi2 conditional check. In a hermetic (network-isolated) build, this will attempt to reach rubygems.org and fail. Consider wrapping it in the same Cachi2 conditional or ensuring Cachi2 prefetch provides the gems before this step.
🔧 Suggested approach
# Install Ruby Gems
-RUN BUNDLE_PATH__SYSTEM=true bundle install
+RUN if [ -f /cachi2/cachi2.env ]; then \
+ . /cachi2/cachi2.env && \
+ BUNDLE_PATH__SYSTEM=true BUNDLE_FROZEN=1 bundle install; \
+ else \
+ BUNDLE_PATH__SYSTEM=true bundle install; \
+ fi

🤖 Prompt for AI Agents
In `@Containerfile` around lines 24 - 25, The Dockerfile currently runs "RUN
BUNDLE_PATH__SYSTEM=true bundle install" unconditionally which will fail in
hermetic builds; modify the Dockerfile to either move that RUN line inside the
same Cachi2 conditional that protects other Cachi2 steps (so bundle install only
runs when Cachi2 is enabled) or ensure Cachi2 prefetching provides the gem cache
before this RUN by adding the bundle cache/prefetch step and using that cache as
the source for "bundle install" (reference the RUN BUNDLE_PATH__SYSTEM=true
bundle install step and the Cachi2-related conditional to locate where to
change).
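The guard pattern the comment suggests can be tried outside a container build. In this sketch the env file name, DEMO_VAR, and the echoed command are all placeholders standing in for the real Cachi2 env file and bundle install:

```shell
# Sketch of the conditional the review suggests: source the Cachi2 env file
# only when it exists, then run the same install command either way.
run_install() {
  if [ -f "$1" ]; then
    . "$1"            # hermetic build: pick up prefetch-provided settings
  fi
  echo "BUNDLE_PATH__SYSTEM=true bundle install"   # placeholder for the real command
}

printf 'DEMO_VAR=from_cachi2\n' > cachi2.env   # demo env file, not the real one
out=$(run_install cachi2.env; echo "DEMO_VAR=$DEMO_VAR")
echo "$out"
```

When the env file is absent, the same function still runs the install, matching the non-hermetic path.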
"pybuild-deps>=0.5.0",
"pip==24.3.1",
🧩 Analysis chain
🌐 Web query:
pip latest version release 2025 2026
💡 Result:
Latest pip (as of Feb 6, 2026): pip 26.0.1, released Feb 5, 2026. [1]
pip releases in 2026 (so far):
- 26.0.1 — Feb 5, 2026 [1]
- 26.0 — Jan 31, 2026 [1][2]
pip releases in 2025:
- 25.3 — Oct 25, 2025 [1]
- 25.2 — Jul 30, 2025 [1]
- 25.1.1 — May 2, 2025 [1]
- 25.1 — Apr 26, 2025 [1]
- 25.0.1 — Feb 9, 2025 [1]
- 25.0 — Jan 26, 2025 [1]
pip’s documented cadence is roughly quarterly (Jan/Apr/Jul/Oct), sometimes with additional patch releases. [3]
Sources: [1] PyPI release history, [2] pip 26.0 announcement, [3] pip release process documentation.
Update pip to the latest version or document the reason for pinning to 24.3.1.
pip==24.3.1 is over 13 months old; the current version is pip 26.0.1 (released Feb 5, 2026). Eight newer releases have been issued since 24.3.1, likely including security patches and bug fixes. If pinning to 24.3.1 is intentional for the hermetic build workflow, add a comment explaining why this specific version is required.
🤖 Prompt for AI Agents
In `@pyproject.toml` around lines 91 - 92, The pyproject.toml currently pins
"pip==24.3.1" which is outdated; either update this dependency to the latest
stable pip (replace "pip==24.3.1" with the current version) or add a clear
comment next to the "pip==24.3.1" entry explaining why that exact pip version is
required for the hermetic build (e.g., known compatibility, reproducibility, or
security exemption); ensure the change is made in pyproject.toml where the
"pip==24.3.1" entry appears and keep the "pybuild-deps" line untouched unless it
also needs aligning.
uv pip compile pyproject.toml -o "$RAW_REQ_FILE" \
  --python-platform x86_64-unknown-linux-gnu \
  --python-version 3.12 \
  --refresh \
  --index https://console.redhat.com/api/pypi/public-rhai/rhoai/3.2/cpu-ubi9/simple/ \
  --default-index https://pypi.org/simple/ \
  --index-strategy unsafe-best-match \
  --emit-index-annotation \
  --no-sources \
  --override requirements.overrides.txt
🧩 Analysis chain
🌐 Web query:
uv pip compile --python-platform flag x86_64 architecture specific resolution
💡 Result:
uv pip compile can resolve dependencies as if it were running on a different CPU/OS by setting --python-platform to a target triple (CPU–vendor–OS), e.g. x86_64-unknown-linux-gnu (Linux x86_64) or aarch64-apple-darwin (macOS ARM64). This drives environment marker evaluation and which wheels are considered compatible. [1][2]
x86_64 “architecture-specific” resolution (Linux)
If your goal is “Linux x86_64”, you typically want one of these:
- Generic glibc Linux x86_64 (default “linux” alias):
--python-platform x86_64-unknown-linux-gnu (same as --python-platform linux) [2]
- manylinux-tagged Linux x86_64 (often what wheel publishers target): --python-platform x86_64-manylinux_2_28 (or another manylinux version appropriate for your baseline) [3]
Example:
uv pip compile requirements.in \
--python-version 3.12 \
--python-platform x86_64-manylinux_2_28 \
  -o requirements.txt

Why x86_64-unknown-linux-gnu can fail when a manylinux platform works
Some projects upload wheels that match manylinux tags but not what a resolver considers compatible with a plain x86_64-unknown-linux-gnu target in a given case. This exact pattern has been reported (e.g., onnxruntime==1.18 resolving with x86_64-manylinux_2_28 but failing with x86_64-unknown-linux-gnu). [4]
Important caveat
--python-platform is an advanced switch: uv will select wheels for the target platform, but anything it builds from source is built on the current machine and may not actually be usable on the target. Also, Python markers expose more platform detail than --python-platform can fully express, so uv does “best effort” compatibility for the chosen target. [2][5]
Sources:
[1] https://docs.astral.sh/uv/reference/settings/
[2] https://docs.astral.sh/uv/reference/cli/
[3] https://docs.astral.ac.cn/uv/reference/cli/ (manylinux target list)
[4] astral-sh/uv#3842
[5] https://docs.astral.sh/uv/concepts/resolution/
🌐 Web query:
uv tool pip compile multi-platform dependency resolution aarch64
💡 Result:
uv pip compile is platform-specific by default (like pip-tools), so a requirements.txt generated on x86_64 often won’t be correct/complete for Linux aarch64. (docs.astral.sh)
Option A (recommended): one lock for multiple platforms (includes aarch64)
Use universal resolution, which emits environment markers so one file can work across OS/arch/Python:
uv pip compile --universal -o requirements.txt requirements.in
# or from pyproject.toml:
uv pip compile --universal pyproject.toml -o requirements.txt

Universal resolution is also what uv.lock uses (via uv lock / uv sync) and is designed to be portable across platforms/architectures. (docs.astral.sh)
Option B: compile specifically for Linux aarch64
If you want a lock file just for aarch64, compile targeting that platform + exact Python version:
uv pip compile \
--python-platform aarch64-manylinux_2_28 \
--python-version 3.12 \
-o requirements-aarch64.txt \
  requirements.in

uv supports compiling for alternate platforms via --python-platform and --python-version. (docs.astral.sh)
Notes / gotchas
- --python-platform is not just linux; you often need a PEP 425/packaging-style platform tag (e.g., aarch64-manylinux_2_28). Some packages resolve on manylinux tags but not on “generic” GNU tags, depending on how wheels are published. (github.com)
- Linux aarch64 is a Tier 2 supported platform for uv (prebuilt wheels are published, but it’s not continuously tested like Tier 1). (docs.astral.sh)
Citations:
- 1: https://docs.astral.sh/uv/concepts/resolution/?utm_source=openai
- 2: https://docs.astral.sh/uv/concepts/resolution/?utm_source=openai
- 3: https://docs.astral.sh/uv/concepts/resolution/?utm_source=openai
- 4: onnxruntime doesn't resolve using python-platform x86_64-unknown-linux-gnu astral-sh/uv#3842
- 5: https://docs.astral.sh/uv/reference/policies/platforms/?utm_source=openai
🌐 Web query:
uv --python-platform documentation universal flag
💡 Result:
In uv, there are two related (but different) ideas:
--python-platform (platform-specific resolution)
uv pip compile defaults to a platform-specific resolution. You can override the target platform with --python-platform (and optionally --python-version) to resolve as-if you were on another platform, e.g.:
uv pip compile --python-platform linux --python-version 3.10 requirements.in

--python-platform is documented as a target triple (CPU–vendor–OS), e.g. x86_64-unknown-linux-gnu or aarch64-apple-darwin, with aliases like linux and windows. [1], [2]
--universal (universal / cross-platform resolution)
If you want a single requirements output that works across platforms, use:
uv pip compile --universal requirements.in

This produces a “universal” requirements file by emitting environment markers so platform-specific deps are conditional (e.g. Windows-only packages get ; sys_platform == 'win32'). This is also the resolution mode used for uv.lock. [1], [3]
Docs to read (most relevant):
- “Resolution” (explains platform-specific vs universal, and mentions both --python-platform and --universal). [1]
- “Settings → python-platform” (defines the target-triple format). [2]
Sources:
[1] https://docs.astral.sh/uv/concepts/resolution/
[2] https://docs.astral.sh/uv/reference/settings/
[3] https://docs.astral.sh/uv/guides/migration/pip-to-project/
🏁 Script executed:
head -n 100 scripts/konflux_requirements.sh

Repository: lightspeed-core/rag-content
Length of output: 5356
Dependency resolution is x86_64-only but builds target both aarch64 and x86_64.
--python-platform x86_64-unknown-linux-gnu resolves packages for x86_64 only. If any dependency differs per platform (different version, only available on aarch64, etc.), it will be missed. The --universal flag on line 84 (wheel hash compilation) adds environment markers for already-resolved packages but cannot discover platform-specific packages absent from the x86_64 resolution.
To support aarch64 builds, either:
- Use --universal in the initial resolution (lines 24–33) to resolve all platform variants at once, or
- Run the resolution step for both x86_64 and aarch64 separately and merge the results.
Note: lines 85–87 (source and pypi-wheels hash compilation) do not use --universal either, further limiting cross-platform coverage.
🤖 Prompt for AI Agents
In `@scripts/konflux_requirements.sh` around lines 24 - 33, The current uv pip
compile invocation (the command writing to "$RAW_REQ_FILE" with
--python-platform x86_64-unknown-linux-gnu) resolves dependencies for x86_64
only, missing aarch64-specific packages; either add --universal to this initial
resolution command to include all platform variants, or run the resolution step
twice (once with --python-platform x86_64-unknown-linux-gnu and once with
--python-platform aarch64-unknown-linux-gnu) and merge the results into
"$RAW_REQ_FILE"; also apply the same --universal or per-platform approach to the
later wheel/hash compilation steps (the wheel hash compilation referenced by the
--universal flag on the later block) so platform-specific wheels are discovered.
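A rough sketch of the two-pass alternative. The compile_for helper mirrors the script's flags but is hypothetical (uv must be installed for the commented-out calls), and the sort-based union is only illustrative — a correct merge would attach environment markers per platform:

```shell
# Sketch: resolve per platform, then union the pinned requirements.
compile_for() {
  uv pip compile pyproject.toml -o "$2" \
    --python-platform "$1" --python-version 3.12 \
    --override requirements.overrides.txt
}
# compile_for x86_64-unknown-linux-gnu  requirements.x86_64.txt
# compile_for aarch64-unknown-linux-gnu requirements.aarch64.txt

merge_reqs() { sort -u "$1" "$2"; }   # naive union; real merge needs markers

# Demo inputs standing in for the two compiled files (examplepkg is made up):
printf 'torch==2.8.0\n'                   > requirements.x86_64.txt
printf 'torch==2.8.0\nexamplepkg==1.0\n'  > requirements.aarch64.txt
merged=$(merge_reqs requirements.x86_64.txt requirements.aarch64.txt)
echo "$merged"
```

Using --universal in the initial resolution avoids the merge entirely, at the cost of marker-conditional output.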
Force-pushed from b11dd85 to 6e8c287 (Compare)
Actionable comments posted: 0
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
.tekton/rag-tool-push.yaml (1)
27-29: ⚠️ Potential issue | 🟡 Minor

Push pipeline prefetches arm64 wheels but only builds x86_64.

build-platforms (line 29) lists only linux/x86_64, but the prefetch binary.arch (line 63) is "x86_64,aarch64". This means arm64 wheels are fetched and discarded, adding unnecessary latency to the push build. Either add linux-c6gd2xlarge/arm64 to the push build-platforms (as done in the PR YAML), or scope the prefetch arch to "x86_64" only.

Also applies to: 62-63
🧹 Nitpick comments (1)
build-args-konflux.conf (1)
1-2: Consider pinning BASE_IMAGE to a digest for fully reproducible builds.

The tag :3.2 is a floating reference. If Konflux doesn't automatically resolve and pin the base image digest, successive builds could silently pick up different image contents. Pinning to a digest (e.g., @sha256:…) would strengthen hermeticity. If Konflux handles base-image pinning separately, this is fine as-is.
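One way to act on this nitpick, sketched below: obtain the digest out-of-band (e.g. with skopeo inspect --format '{{.Digest}}') and rewrite the reference. pin_image is a hypothetical helper and the registry name and digest are made up:

```shell
# Sketch: turn a tag-based image reference into a digest-pinned one.
# ${1%:*} strips the last colon-separated segment, so registry ports survive.
pin_image() {
  printf '%s@%s\n' "${1%:*}" "$2"
}

# The real digest would come from:
#   skopeo inspect --format '{{.Digest}}' docker://<registry>/<image>:3.2
pinned=$(pin_image "registry.example.com/base:3.2" "sha256:0123abcd")
echo "$pinned"
```

The pinned reference can then replace the floating tag in build-args-konflux.conf.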
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In `@requirements.hashes.wheel.pypi.txt`:
- Around line 1-10: The container build must copy and use the generated
hash-locked requirements file; update the Containerfile to COPY
requirements.hashes.wheel.pypi.txt into the image and change the pip install
step to install from that file using --require-hashes (e.g., pip install
--require-hashes -r requirements.hashes.wheel.pypi.txt), ensuring the index
URL/comment in the file is preserved and no-deps semantics match the build
policy.
Actionable comments posted: 3
🤖 Fix all issues with AI agents
In `@Containerfile`:
- Around line 63-65: The Dockerfile currently calls groupadd and useradd (in the
RUN that executes groupadd -r rag -g 1000 && useradd -r -u 1000 -g rag ...) but
the base image lacks those utilities; add shadow-utils to the earlier package
installation step so groupadd/useradd are available (update the RUN that
installs packages to include shadow-utils alongside python3.12,
python3.12-devel, python3.12-pip, git, rubygems, rubygem-bundler), then rebuild
to ensure the subsequent RUN using groupadd/useradd succeeds.
- Line 19: The Containerfile is missing requirements.hashes.wheel.pypi.txt in
the build context and install step; update the COPY statement that currently
lists Makefile pyproject.toml uv.lock README.md Gemfile Gemfile.lock
requirements.hashes.wheel.txt requirements.hashes.source.txt
requirements-build.txt ./ to also include requirements.hashes.wheel.pypi.txt,
and update the pip install command that installs from
requirements.hashes.wheel.txt and requirements.hashes.source.txt to also install
from requirements.hashes.wheel.pypi.txt (i.e., add -r
requirements.hashes.wheel.pypi.txt alongside the other -r entries).
In `@pyproject.toml`:
- Around line 49-53: Resolve the version mismatch between the pyproject
constraint "torch>=2.8.0,<2.9.0" and the override "torch==2.9.0" in
requirements.overrides.txt: either relax the pyproject constraint to accept 2.9
(e.g., change "torch>=2.8.0,<2.10.0") or update the override to a 2.8.x pin
(e.g., "torch==2.8.x") so both files specify compatible ranges; ensure the
chosen change is applied consistently and run the dependency resolver to verify
no further conflicts.
 WORKDIR /rag-content

-COPY Makefile pyproject.toml uv.lock README.md ./
+COPY Makefile pyproject.toml uv.lock README.md Gemfile Gemfile.lock requirements.hashes.wheel.txt requirements.hashes.source.txt requirements-build.txt ./
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Check if requirements files exist and their contents
echo "=== Available requirements files ==="
fd -t f "requirements" | sort
echo ""
echo "=== Check requirements.hermetic.txt content ==="
if [ -f requirements.hermetic.txt ]; then
cat requirements.hermetic.txt
else
echo "requirements.hermetic.txt does not exist"
fi
echo ""
echo "=== Check requirements.hashes.wheel.pypi.txt content ==="
if [ -f requirements.hashes.wheel.pypi.txt ]; then
cat requirements.hashes.wheel.pypi.txt
else
echo "requirements.hashes.wheel.pypi.txt does not exist"
fi
echo ""
echo "=== Relevant lines in Containerfile ==="
cat -n Containerfile | sed -n '15,50p'
echo ""
echo "=== Check konflux_requirements.sh ==="
if [ -f konflux_requirements.sh ]; then
echo "File found. First 100 lines:"
head -100 konflux_requirements.sh
else
echo "konflux_requirements.sh not found"
fi

Repository: lightspeed-core/rag-content
Length of output: 2810
Add requirements.hashes.wheel.pypi.txt to both the COPY and pip install commands.
The file requirements.hashes.wheel.pypi.txt contains PyPI-only wheel packages (sqlite-vec) and is currently missing from the image build. Line 19 does not copy it, and line 41 does not install from it. The file requirements.hermetic.txt covers only build tools (uv, pip, ninja) and does not include these packages. Add requirements.hashes.wheel.pypi.txt to the COPY statement on line 19 and to the pip install command on line 41 (alongside requirements.hashes.wheel.txt and requirements.hashes.source.txt).
🤖 Prompt for AI Agents
In `@Containerfile` at line 19, The Containerfile is missing
requirements.hashes.wheel.pypi.txt in the build context and install step; update
the COPY statement that currently lists Makefile pyproject.toml uv.lock
README.md Gemfile Gemfile.lock requirements.hashes.wheel.txt
requirements.hashes.source.txt requirements-build.txt ./ to also include
requirements.hashes.wheel.pypi.txt, and update the pip install command that
installs from requirements.hashes.wheel.txt and requirements.hashes.source.txt
to also install from requirements.hashes.wheel.pypi.txt (i.e., add -r
requirements.hashes.wheel.pypi.txt alongside the other -r entries).
RUN groupadd -r rag -g 1000 && \
    useradd -r -u 1000 -g rag -d /rag-content -s /sbin/nologin rag && \
    chown -R rag:rag /rag-content
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Check if shadow-utils is mentioned anywhere
echo "=== Checking if shadow-utils is mentioned anywhere ==="
rg -n "shadow-utils" --glob "*.yaml" --glob "*.conf" --glob "Containerfile" --glob "Dockerfile"
echo ""
echo "=== Check for rpms.in files ==="
fd -t f "rpms.in" --exec cat {}
echo ""
echo "=== Display full Containerfile to understand context ==="
cat -n Containerfile

Repository: lightspeed-core/rag-content
Length of output: 4372
Install shadow-utils before creating user/group.
ubi9/ubi-minimal does not include shadow-utils by default. The groupadd and useradd commands at lines 63-65 will fail during build because these tools are not available in the base image. Add shadow-utils to the package installation at lines 9-10:
RUN ${DNF_COMMAND} install -y --nodocs --setopt=keepcache=0 --setopt=tsflags=nodocs \
python3.12 python3.12-devel python3.12-pip git \
rubygems rubygem-bundler shadow-utils && \
${DNF_COMMAND} clean all
🤖 Prompt for AI Agents
In `@Containerfile` around lines 63 - 65, The Dockerfile currently calls groupadd
and useradd (in the RUN that executes groupadd -r rag -g 1000 && useradd -r -u
1000 -g rag ...) but the base image lacks those utilities; add shadow-utils to
the earlier package installation step so groupadd/useradd are available (update
the RUN that installs packages to include shadow-utils alongside python3.12,
python3.12-devel, python3.12-pip, git, rubygems, rubygem-bundler), then rebuild
to ensure the subsequent RUN using groupadd/useradd succeeds.
# torch 2.8.x pairs with torchvision 0.23.x
"llama-stack==0.4.3",
"llama-stack-client==0.4.3",
"torch>=2.8.0,<2.9.0",
"torchvision>=0.23.0,<0.24.0",
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Check the torch version in both files
echo "=== pyproject.toml torch constraint ==="
grep -n "torch" pyproject.toml | head -20
echo ""
echo "=== requirements.overrides.txt torch pin ==="
grep -n "torch" requirements.overrides.txt
echo ""
echo "=== Check pip version in pyproject.toml ==="
grep -n "pip" pyproject.toml

Repository: lightspeed-core/rag-content
Length of output: 581
🏁 Script executed:
#!/bin/bash
# Additional verification: Check if there are any recent changes or context
echo "=== Full requirements.overrides.txt context ==="
cat -n requirements.overrides.txt

Repository: lightspeed-core/rag-content
Length of output: 761
torch version conflict with requirements.overrides.txt.
pyproject.toml constrains torch to >=2.8.0,<2.9.0 (line 52), but requirements.overrides.txt pins torch==2.9.0. The override version 2.9.0 falls outside the upper bound <2.9.0 declared here. Depending on the resolver's override semantics, this may either silently ignore the constraint or cause a resolution failure. Align the two files — either widen the range here (e.g., <2.10.0) or lower the override to a 2.8.x release.
🤖 Prompt for AI Agents
In `@pyproject.toml` around lines 49 - 53, Resolve the version mismatch between
the pyproject constraint "torch>=2.8.0,<2.9.0" and the override "torch==2.9.0"
in requirements.overrides.txt: either relax the pyproject constraint to accept
2.9 (e.g., change "torch>=2.8.0,<2.10.0") or update the override to a 2.8.x pin
(e.g., "torch==2.8.x") so both files specify compatible ranges; ensure the
chosen change is applied consistently and run the dependency resolver to verify
no further conflicts.
Signed-off-by: Haoyu Sun <hasun@redhat.com>
Description
Enable hermetic build.
Tools to regenerate requirement files.
Ruby gems are installed using Bundler, since Konflux only supports Bundler for installing Ruby gems in hermetic builds.
Type of change
Tools used to create PR
Identify any AI code assistants used in this PR (for transparency and review context)
Related Tickets & Documents
Checklist before requesting a review
Testing
Summary by CodeRabbit
New Features
Chores