feat(docs): add Rust + C++ SDK auto-doc templates (#43)
Brings two more source repos into the same auto-doc pipeline as
TypeScript/Python/.NET:
- `resq-software/crates` (Rust workspace, 11 crates): new template
emits one stub page per crate with metadata + embedded README +
link to docs.rs (the canonical Rust API host). docs.rs already
serves rustdoc for every published crate; duplicating it in
Mintlify would split source-of-truth, so we link out instead.
- `resq-software/vcpkg` (C++ via CMake): new template runs Doxygen
against each package's include/ headers, then converts the XML
to Markdown with moxygen. Same MDX-escape pass as the Python
template handles `<` / `{` / `}` in C++ template syntax.
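As a sketch, such an escape pass might look like the following (illustrative only; the actual template's rules may differ). It leaves fenced blocks and inline code spans untouched and escapes `<` / `{` / `}` in prose so MDX does not JSX-parse `Result<T>`-style references:

```python
import re


def escape_mdx(text: str) -> str:
    """Escape characters MDX would parse as JSX, outside code regions.

    Hypothetical sketch of the escape pass, not the template's code.
    """
    out = []
    in_fence = False
    for line in text.splitlines():
        if line.lstrip().startswith("```"):
            in_fence = not in_fence
            out.append(line)
            continue
        if in_fence:
            out.append(line)
            continue
        # Split on inline code spans so `std::vector<int>` inside
        # backticks survives; escape only the prose segments.
        parts = re.split(r"(`[^`]*`)", line)
        escaped = []
        for i, part in enumerate(parts):
            if i % 2 == 0:  # prose, not an inline code span
                part = (part.replace("{", "\\{")
                            .replace("}", "\\}")
                            .replace("<", "&lt;"))
            escaped.append(part)
        out.append("".join(escaped))
    return "\n".join(out)
```

For example, `escape_mdx("Result<T> and `Vec<u8>`")` escapes only the bare `Result<T>`, leaving the backticked span alone.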
Both templates follow the established structure (resolve ref →
generate → escape → write README → build pages index → splice into
`docs.json` → open auto-PR) and write into add-paths
`sdks/<lang>/api/**` plus `docs.json`.
Also:
- automation/sync-templates.sh: add rust + cpp targets; fix new-file
detection (git diff doesn't report untracked files as a diff, so
the prior version reported "up-to-date" for repos that had no
workflow at all).
- sdks/cpp.mdx: new C++ SDK landing page in the Languages group.
- docs.json: add sdks/cpp to the Languages nav group.
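The untracked-file pitfall behind the sync-templates.sh fix above can be reproduced in a throwaway repo (paths are illustrative): `git diff` only compares tracked content, so a workflow file copied into a repo that never had one is *untracked* and `git diff --quiet` still exits 0.

```shell
#!/usr/bin/env bash
set -euo pipefail

repo_dir=$(mktemp -d)
git -C "$repo_dir" init -q

# Simulate syncing a workflow into a repo with no prior copy.
mkdir -p "$repo_dir/.github/workflows"
echo "name: api-docs" > "$repo_dir/.github/workflows/api-docs.yml"

# Broken check: reports up-to-date even though the file is brand new,
# because `git diff` never looks at untracked files.
git -C "$repo_dir" diff --quiet && echo "diff says: up-to-date"

# Fixed check: porcelain status covers untracked files too.
if [ -n "$(git -C "$repo_dir" status --porcelain)" ]; then
  echo "status says: changes present"
fi

rm -rf "$repo_dir"
```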
The Rust and C++ sub-groups under Generated Package References will
materialize after each source repo's workflow runs for the first
time (the splice step's `else: append` branch handles creation).
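A sketch of what that splice-or-append logic might look like (group names taken from this thread; the real `docs.json` navigation nesting may differ):

```python
def splice_group(docs: dict, label: str, pages: list) -> dict:
    """Replace the `label` sub-group in place, or append it on first run."""
    gen = next(g for g in docs["groups"]
               if g["group"] == "Generated Package References")
    for i, sub in enumerate(gen["pages"]):
        if isinstance(sub, dict) and sub.get("group") == label:
            gen["pages"][i] = {"group": label, "pages": pages}  # update
            break
    else:
        # First workflow run for this language: nothing to replace yet,
        # so the `else: append` branch creates the sub-group.
        gen["pages"].append({"group": label, "pages": pages})
    return docs


docs = {"groups": [{"group": "Generated Package References", "pages": []}]}
splice_group(docs, "Rust", ["sdks/rust/api/overview"])
```

Running it a second time with fresh pages replaces the sub-group in place instead of duplicating it.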
📝 Walkthrough

This pull request establishes automated API documentation generation and publishing for C++ and Rust. It adds two new GitHub Actions workflows that generate reference documentation, integrates them with an updated template synchronization script, introduces a C++ SDK documentation page with installation and build instructions, and wires the C++ SDK into the docs navigation structure.

Changes: C++ and Rust SDK documentation infrastructure.

Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
🚥 Pre-merge checks: 4 passed, 1 failed (1 warning).
Code Review
This pull request introduces automated API documentation workflows for C++ and Rust SDKs, updates the central documentation navigation, and improves the template synchronization script. Feedback focuses on correcting the MDX escaping logic for "<" characters (which is currently inverted), refining broad regex patterns used for cleaning breadcrumbs, and improving the robustness of metadata extraction from Cargo.toml files, specifically regarding workspace inheritance and version parsing.
- **MDX `<` escape**: invert the pass-through rule for the new templates. Python keeps `<X>` for `<a id="...">` anchors emitted by pydoc-markdown, but Rust/C++ prose contains bare `Result<T>` / `std::vector<int>` references that JSX-parse and fail. Escape all `<` in prose; only HTML comments (`<!--`) and closing tags (`</`) pass through.
- **moxygen breadcrumb stripping**: replace the over-broad sed that matched any `[..](..md)` line + paragraph with a Python pass that removes only leading breadcrumb lines (line 1 + any blank lines immediately after). This preserves cross-references in the document body.
- **Rust workspace metadata extraction**: anchor the section-header match to start-of-line so `[dependencies.foo]` doesn't get treated as `[foo]`. Section parsing now has one helper used for both workspace.package and per-crate package tables.
- **Rust `license` inheritance**: handle `license.workspace = true` by falling through to `[workspace.package].license` when a crate doesn't pin a string license.
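The `license` inheritance fix can be sketched as a small helper (hypothetical names; assumes the TOML tables are already parsed into dicts, e.g. via `tomllib`):

```python
def resolve_license(pkg: dict, ws_pkg: dict,
                    default: str = "Apache-2.0") -> str:
    """Resolve a crate's license, honoring `license.workspace = true`.

    `pkg` is the crate's parsed [package] table; `ws_pkg` is the parsed
    [workspace.package] table. Illustrative sketch, not the template's
    actual code.
    """
    v = pkg.get("license")
    # `license.workspace = true` parses as {"workspace": True}:
    # fall through to the workspace-level value.
    if isinstance(v, dict) and v.get("workspace") is True:
        return ws_pkg.get("license", default)
    return v if isinstance(v, str) else default
```

A crate that pins a string license wins; one that sets `license.workspace = true` inherits from the workspace table; one with neither gets the default.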
Actionable comments posted: 6
🧹 Nitpick comments (3)
sdks/cpp.mdx (1)
**6-11**: ⚡ Quick win — Rewrite intro in second person and direct voice.

Lines 6-11 currently describe the SDK in third person. Please rewrite to address the reader as "you."

Proposed rewrite:

```diff
-The ResQ C++ libraries at
-[`resq-software/vcpkg`](https://github.com/resq-software/vcpkg) ship as
-header-only and static-library packages installable through the
-[vcpkg](https://vcpkg.io) package manager. They are intended for
-performance-sensitive components that integrate with the ResQ platform
-from native code.
+You can use the ResQ C++ libraries in
+[`resq-software/vcpkg`](https://github.com/resq-software/vcpkg) as
+header-only and static-library packages through
+[vcpkg](https://vcpkg.io). Use them when you need performance-sensitive
+native integration with the ResQ platform.
```

As per coding guidelines, "Use active voice and second person ("you") in documentation writing."
🤖 Prompt for AI Agents

Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@sdks/cpp.mdx` around lines 6-11: rewrite the introductory paragraph about the ResQ C++ libraries from third person to direct second person and active voice: address the reader as "you," use active verbs, and frame benefits and actions for the reader (e.g., change "They are intended for performance-sensitive components that integrate with the ResQ platform from native code" to something like "Use these header-only and static-library packages via vcpkg when you need high-performance native integration with the ResQ platform"). Update the opening lines that mention resq-software/vcpkg, vcpkg, and the package types so the sentence directly tells the reader what to do and why (install via vcpkg, intended use cases, and advantages).

automation/source-repo-templates/api-docs.cpp.yml (2)
**332-340**: 💤 Low value — Failures here surface as bare `StopIteration`.

If `docs.json` ever loses the `en` language, the `SDKs` tab, or the `Generated Package References` group, each `next(...)` raises `StopIteration` with no context — making CI failures hard to triage. Easy to wrap with explicit errors.

♻️ Proposed fix

```diff
-    en = next(l for l in docs["navigation"]["languages"] if l["language"] == "en")
-    sdks_tab = next(t for t in en["tabs"] if t["tab"] == "SDKs")
-    gen_group = next(g for g in sdks_tab["groups"] if g["group"] == "Generated Package References")
+    def _find(items, key, value, label):
+        for it in items:
+            if it.get(key) == value:
+                return it
+        raise SystemExit(f"docs.json: no {label} where {key}={value!r}")
+
+    en = _find(docs["navigation"]["languages"], "language", "en", "language")
+    sdks_tab = _find(en["tabs"], "tab", "SDKs", "tab")
+    gen_group = _find(sdks_tab["groups"], "group",
+                      "Generated Package References", "group")
```

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@automation/source-repo-templates/api-docs.cpp.yml` around lines 332 - 340, The three uses of next(...) for en, sdks_tab, and gen_group can raise bare StopIteration; change them to safer lookups that raise informative errors: replace next(l for l in docs["navigation"]["languages"] if l["language"] == "en") with a guarded lookup that checks docs["navigation"]["languages"] and raises a clear ValueError if 'en' is missing (and similarly for the SDKs tab and the Generated Package References group), so that variables en, sdks_tab, and gen_group validate presence and throw descriptive errors instead of bare StopIteration before proceeding to the loop that updates gen_group["pages"].
**137-145**: 💤 Low value — Breadcrumb-stripping `sed` range can over-delete.

`sed -i -E '/^\[.*\]\(.*\.md\)$/,/^$/d'` deletes from any line that looks like a bare `[text](something.md)` link through the next blank line. If a moxygen page ends with such a link without a trailing blank line, this deletes to EOF; if a similar pattern appears mid-page (e.g., a "See also" line generated as a bare link), the following paragraph also disappears. Anchoring to the first matching line only — or using a one-line `d` plus a separate blank-line cleanup — is safer.

♻️ Tighter alternative

```diff
-          find . -type f -name '*.md' -print0 | while IFS= read -r -d '' f; do
-            sed -i -E '/^\[.*\]\(.*\.md\)$/,/^$/d' "$f" || true
-          done
+          find . -type f -name '*.md' -print0 | while IFS= read -r -d '' f; do
+            # Drop only the first breadcrumb line and a single immediately-
+            # following blank, leaving the rest of the page untouched.
+            awk 'NR==1 && /^\[.*\]\(.*\.md\)$/ { skip_blank=1; next }
+                 skip_blank && /^$/ { skip_blank=0; next }
+                 { skip_blank=0; print }' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
+          done
```

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@automation/source-repo-templates/api-docs.cpp.yml` around lines 137 - 145, The current "Strip moxygen breadcrumbs" step uses a multi-line sed range that can delete to EOF; change the loop to delete only the single matching breadcrumb line and then run a separate blank-line cleanup: in the find ... | while ... loop replace sed -i -E '/^\[.*\]\(.*\.md\)$/,/^$/d' "$f" with two commands — first sed -i -E '/^\[[:space:]]*\[.*\]\(.*\.md\)[[:space:]]*$/d' "$f" to remove only the breadcrumb line, and then a second sed (or awk) to collapse/remove extra blank lines/trailing blank lines (e.g. sed -i -E ':a;N;$!ba;s/\n{3,}/\n\n/g; s/\n$//') to avoid leaving or removing content to EOF; keep these replacements inside the same find ... while loop so the working-directory and file iteration remain unchanged.
🤖 Prompt for all review comments with AI agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.
Inline comments:
In `@automation/source-repo-templates/api-docs.cpp.yml`:
- Around line 65-74: The workflow step "Install Doxygen + moxygen" uses a
nonexistent package version (npm install --location=global moxygen@0.10.0);
update that npm install command to use a valid moxygen release (for example
replace moxygen@0.10.0 with moxygen@0.8.0 or a confirmed 2.x version such as
moxygen@2.1.7) so the job no longer fails with a 404 during npm install.
- Around line 46-57: The workflow currently interpolates
inputs.ref/github.ref_name directly into the shell via raw='${{ inputs.ref ||
github.ref_name }}', which allows script injection; change it to set the ref
value as an environment variable (e.g. via env: DOCS_RAW: "${{ inputs.ref ||
github.ref_name }}" or similar) and then read it inside the shell with
raw="$DOCS_RAW" before performing the slug transformations and exporting
DOCS_REF_NAME/DOCS_REF_SLUG, and apply the same env-based fix to the
corresponding block in api-docs.rust.yml so no user-controlled string is
injected directly into the shell.
In `@automation/source-repo-templates/api-docs.rust.yml`:
- Around line 41-54: The "Resolve ref metadata" step uses inline GitHub
interpolation in the run script (raw='${{ inputs.ref || github.ref_name }}')
which can allow script injection; change to pass the ref via the step's env
block (e.g., set an env key like RAW_REF: ${{ inputs.ref || github.ref_name }}),
then in the run script read raw="$RAW_REF", proceed to strip refs/tags/ and
refs/heads/ and compute slug, and export DOCS_REF_NAME and DOCS_REF_SLUG to
GITHUB_ENV; update references to the variables raw and slug accordingly so no
user-controlled value is directly interpolated into the script.
In `@automation/source-repo-templates/README.md`:
- Line 69: Update the tooling table row that currently lists README so the
filename is formatted as inline code; locate the row containing
"`resq-software/crates`" and change the plain README entry to use code
formatting (e.g., backtick-wrapped README) so filenames in
automation/source-repo-templates/README.md follow the project's code-formatting
convention.
In `@sdks/cpp.mdx`:
- Line 20: The navigation label "Generated Package References → C++" should be
formatted as a bold UI reference; update the text string "Doxygen comments and
lives under Generated Package References → C++." so that the UI element portion
reads **Generated Package References → C++** (i.e., wrap that navigation label
in markdown bold markers) to match the guideline for bolding UI element
references.
---
Nitpick comments:
In `@automation/source-repo-templates/api-docs.cpp.yml`:
- Around line 332-340: The three uses of next(...) for en, sdks_tab, and
gen_group can raise bare StopIteration; change them to safer lookups that raise
informative errors: replace next(l for l in docs["navigation"]["languages"] if
l["language"] == "en") with a guarded lookup that checks
docs["navigation"]["languages"] and raises a clear ValueError if 'en' is missing
(and similarly for the SDKs tab and the Generated Package References group), so
that variables en, sdks_tab, and gen_group validate presence and throw
descriptive errors instead of bare StopIteration before proceeding to the loop
that updates gen_group["pages"].
- Around line 137-145: The current "Strip moxygen breadcrumbs" step uses a
multi-line sed range that can delete to EOF; change the loop to delete only the
single matching breadcrumb line and then run a separate blank-line cleanup: in
the find ... | while ... loop replace sed -i -E '/^\[.*\]\(.*\.md\)$/,/^$/d'
"$f" with two commands — first sed -i -E
'/^\[[:space:]]*\[.*\]\(.*\.md\)[[:space:]]*$/d' "$f" to remove only the
breadcrumb line, and then a second sed (or awk) to collapse/remove extra blank
lines/trailing blank lines (e.g. sed -i -E ':a;N;$!ba;s/\n{3,}/\n\n/g; s/\n$//')
to avoid leaving or removing content to EOF; keep these replacements inside the
same find ... while loop so the working-directory and file iteration remain
unchanged.
In `@sdks/cpp.mdx`:
- Around line 6-11: Rewrite the introductory paragraph about the ResQ C++
libraries from third person to direct second person and active voice: address
the reader as “you,” use active verbs, and frame benefits and actions for the
reader (e.g., change “They are intended for performance-sensitive components
that integrate with the ResQ platform from native code” to something like “Use
these header-only and static-library packages via vcpkg when you need
high-performance native integration with the ResQ platform”). Update the opening
lines that mention resq-software/vcpkg, vcpkg, and the package types so the
sentence directly tells the reader what to do and why (install via vcpkg,
intended use cases, and advantages).
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 39fcb95d-0926-4126-94eb-7d7202d7e91a
📒 Files selected for processing (6)
- automation/source-repo-templates/README.md
- automation/source-repo-templates/api-docs.cpp.yml
- automation/source-repo-templates/api-docs.rust.yml
- automation/sync-templates.sh
- docs.json
- sdks/cpp.mdx
```yaml
- name: Resolve ref metadata
  # Single source of truth for the ref this run documents.
  run: |
    raw='${{ inputs.ref || github.ref_name }}'
    raw="${raw#refs/tags/}"
    raw="${raw#refs/heads/}"
    slug="${raw//\//-}"
    slug="${slug//\@/-}"
    {
      echo "DOCS_REF_NAME=$raw"
      echo "DOCS_REF_SLUG=$slug"
    } >> "$GITHUB_ENV"
```
Script injection: don't interpolate `inputs.ref` / `github.ref_name` directly into the shell.

`raw='${{ inputs.ref || github.ref_name }}'` substitutes the expression into the script before the shell parses it. Per GitHub's docs, `ref` and similar context fields are explicitly in the list of attacker-controllable values that should not flow directly into workflows where they could be interpreted as executable code. A tag like `v1'; curl evil; '` (or any `workflow_dispatch` input) breaks out of the single-quoted string and runs arbitrary commands with `DOCS_REPO_PR_TOKEN` in scope. The same pattern exists in `api-docs.rust.yml` (lines 41-54) — fix both.

Pass the value through `env:` (which the shell treats as a string) instead:
🔒 Proposed fix

```diff
   - name: Resolve ref metadata
     # Single source of truth for the ref this run documents.
+    env:
+      INPUT_REF: ${{ inputs.ref }}
+      DEFAULT_REF: ${{ github.ref_name }}
     run: |
-      raw='${{ inputs.ref || github.ref_name }}'
+      raw="${INPUT_REF:-$DEFAULT_REF}"
       raw="${raw#refs/tags/}"
       raw="${raw#refs/heads/}"
       slug="${raw//\//-}"
       slug="${slug//\@/-}"
       {
         echo "DOCS_REF_NAME=$raw"
         echo "DOCS_REF_SLUG=$slug"
       } >> "$GITHUB_ENV"
```

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.
In `@automation/source-repo-templates/api-docs.cpp.yml` around lines 46 - 57, The
workflow currently interpolates inputs.ref/github.ref_name directly into the
shell via raw='${{ inputs.ref || github.ref_name }}', which allows script
injection; change it to set the ref value as an environment variable (e.g. via
env: DOCS_RAW: "${{ inputs.ref || github.ref_name }}" or similar) and then read
it inside the shell with raw="$DOCS_RAW" before performing the slug
transformations and exporting DOCS_REF_NAME/DOCS_REF_SLUG, and apply the same
env-based fix to the corresponding block in api-docs.rust.yml so no
user-controlled string is injected directly into the shell.
```yaml
- name: Install Doxygen + moxygen
  # Doxygen ships in apt and is the source-of-truth XML
  # extractor; moxygen is an npm package that converts the
  # XML to one markdown file per class/group. Pinning npm
  # globally is acceptable here because the runner is
  # disposable.
  run: |
    sudo apt-get update -qq
    sudo apt-get install -yqq doxygen graphviz
    npm install --location=global moxygen@0.10.0
```
🧩 Analysis chain

🏁 Script executed:

```shell
#!/bin/bash
# Confirm moxygen 0.10.0 exists on the public npm registry.
curl -s https://registry.npmjs.org/moxygen | jq '.versions | keys'
curl -s https://registry.npmjs.org/moxygen/0.10.0 | jq '{name, version, error}'
```

Repository: resq-software/docs
Length of output: 402
Update moxygen version — 0.10.0 does not exist on the npm registry.

The public npm registry does not contain version 0.10.0 of moxygen. Available versions are 0.5.1 through 0.8.0, then 1.0.0 and later (up to 2.1.7). The workflow will fail at `npm install --location=global moxygen@0.10.0` with a 404 error. Use an available version such as 0.8.0 (latest 0.x) or a newer 2.x version if compatibility is confirmed.
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.
In `@automation/source-repo-templates/api-docs.cpp.yml` around lines 65 - 74, The
workflow step "Install Doxygen + moxygen" uses a nonexistent package version
(npm install --location=global moxygen@0.10.0); update that npm install command
to use a valid moxygen release (for example replace moxygen@0.10.0 with
moxygen@0.8.0 or a confirmed 2.x version such as moxygen@2.1.7) so the job no
longer fails with a 404 during npm install.
```yaml
- name: Resolve ref metadata
  # Single source of truth for the ref this run documents.
  # workflow_dispatch can pass an alternate ref via inputs.ref;
  # fall back to github.ref_name (already stripped of refs/...).
  run: |
    raw='${{ inputs.ref || github.ref_name }}'
    raw="${raw#refs/tags/}"
    raw="${raw#refs/heads/}"
    slug="${raw//\//-}"
    slug="${slug//\@/-}"
    {
      echo "DOCS_REF_NAME=$raw"
      echo "DOCS_REF_SLUG=$slug"
    } >> "$GITHUB_ENV"
```
Same script-injection issue as `api-docs.cpp.yml`.

This step has the identical `raw='${{ inputs.ref || github.ref_name }}'` pattern. See the comment on `api-docs.cpp.yml` lines 46-57 for the rationale and fix — apply the same `env:` indirection here.
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.
In `@automation/source-repo-templates/api-docs.rust.yml` around lines 41 - 54, The
"Resolve ref metadata" step uses inline GitHub interpolation in the run script
(raw='${{ inputs.ref || github.ref_name }}') which can allow script injection;
change to pass the ref via the step's env block (e.g., set an env key like
RAW_REF: ${{ inputs.ref || github.ref_name }}), then in the run script read
raw="$RAW_REF", proceed to strip refs/tags/ and refs/heads/ and compute slug,
and export DOCS_REF_NAME and DOCS_REF_SLUG to GITHUB_ENV; update references to
the variables raw and slug accordingly so no user-controlled value is directly
interpolated into the script.
```python
ws = pathlib.Path("Cargo.toml").read_text(encoding="utf-8")
ws_version_m = re.search(
    r"\[workspace\.package\][^\[]*?version\s*=\s*\"([^\"]+)\"",
    ws,
    re.S,
)
ws_version = ws_version_m.group(1) if ws_version_m else "unknown"


def parse_crate_meta(cargo_toml: pathlib.Path) -> dict:
    text = cargo_toml.read_text(encoding="utf-8")
    pkg_m = re.search(r"\[package\](.*?)(?=^\[|\Z)", text, re.S | re.M)
    pkg = pkg_m.group(1) if pkg_m else ""

    def field(name: str) -> str:
        m = re.search(rf'^\s*{name}\s*=\s*"([^"]+)"', pkg, re.M)
        return m.group(1) if m else ""

    return {
        "name": field("name"),
        "version": field("version") or ws_version,
        "description": field("description"),
        "license": field("license") or "Apache-2.0",
    }
```
🧩 Analysis chain
🌐 Web query:
Is tomllib available in the python3 shipped on ubuntu-latest GitHub Actions runner (Ubuntu 24.04)?
💡 Result:
Yes, tomllib is available in the python3 shipped on the ubuntu-latest GitHub Actions runner (Ubuntu 24.04). Ubuntu 24.04 ships with Python 3.12 as the default python3[1][2][3]. The tomllib module was added to Python's standard library in version 3.11[4][5][6]. Since 3.12 >= 3.11, tomllib is included. The ubuntu-latest runner uses the Ubuntu 24.04 image[7][8], which lists Python 3.12.3 as installed[2].
Citations:
- 1: https://documentation.ubuntu.com/release-notes/24.04/
- 2: https://github.com/actions/runner-images/blob/main/images/ubuntu/Ubuntu2404-Readme.md
- 3: https://linuxvox.com/blog/ubuntu-2404-python-version/
- 4: https://docs.python.org/3/library/tomllib.html
- 5: https://docs.python.org/3/whatsnew/3.11.html
- 6: https://peps.python.org/pep-0680/
- 7: Ubuntu 24.04 is now available actions/runner-images#9848
- 8: Ubuntu-latest workflows will use Ubuntu-24.04 image actions/runner-images#10636
🏁 Script executed:

```shell
fd -t f "api-docs.rust.yml" --exec wc -l {} \;
```

Repository: resq-software/docs
Length of output: 119

🏁 Script executed:

```shell
cat -n automation/source-repo-templates/api-docs.rust.yml | sed -n '80,110p'
```

Repository: resq-software/docs
Length of output: 1679

🏁 Script executed:

```shell
head -15 automation/source-repo-templates/api-docs.rust.yml | cat -n
```

Repository: resq-software/docs
Length of output: 665

🏁 Script executed:

```shell
cat -n automation/source-repo-templates/api-docs.rust.yml | sed -n '104,125p'
```

Repository: resq-software/docs
Length of output: 1309

🏁 Script executed:

```shell
cat -n automation/source-repo-templates/api-docs.rust.yml | sed -n '60,85p'
```

Repository: resq-software/docs
Length of output: 1369
Use `tomllib` instead of regex to parse Cargo.toml.

The current regex-based approach in `field_in()` (lines 96-102) only matches double-quoted, single-line scalar values. It silently misses:

- single-quoted literal strings (`name = 'foo'` — valid TOML)
- triple-quoted / multi-line strings (common for `description`)
- values with inline comments after the closing quote
- `description.workspace = true` (only `version.workspace = true` has a fallback)

`tomllib` is in the Python 3.11+ stdlib and ubuntu-latest ships Python 3.12, so this is a zero-dependency drop-in that eliminates a class of "missing description / wrong version" bugs at generation time.
♻️ Proposed fix

```diff
  import os
  import pathlib
  import re
+ import tomllib

  out_root = pathlib.Path(os.environ["OUTPUT_DIR"])
  out_root.mkdir(parents=True, exist_ok=True)

- # Workspace fallbacks for fields a crate may inherit via
- # `<field>.workspace = true`. Section-scoped extraction (only
- # read keys inside [workspace.package]) avoids matching
- # `version` lines inside other tables further down the file.
- def section(text: str, header: str) -> str:
-     # Match from the [section] header to the next [...]
-     # header (or end of file). Anchored to start-of-line so
-     # `[dependencies.foo]` doesn't match `[foo]`.
-     m = re.search(
-         rf"^\[{re.escape(header)}\]\s*\n(.*?)(?=^\[|\Z)",
-         text,
-         re.S | re.M,
-     )
-     return m.group(1) if m else ""
-
- def field_in(section_text: str, name: str) -> str:
-     m = re.search(
-         rf'^\s*{name}\s*=\s*"([^"]+)"',
-         section_text,
-         re.M,
-     )
-     return m.group(1) if m else ""
-
- ws_text = pathlib.Path("Cargo.toml").read_text(encoding="utf-8")
- ws_pkg = section(ws_text, "workspace.package")
- ws_version = field_in(ws_pkg, "version") or "unknown"
- ws_license = field_in(ws_pkg, "license") or "Apache-2.0"
+ # Workspace fallbacks for fields a crate may inherit via
+ # `<field>.workspace = true`.
+ ws_data = tomllib.loads(
+     pathlib.Path("Cargo.toml").read_text(encoding="utf-8")
+ )
+ ws_pkg = ws_data.get("workspace", {}).get("package", {})
+ ws_version = ws_pkg.get("version", "unknown")
+ ws_license = ws_pkg.get("license", "Apache-2.0")

  def parse_crate_meta(cargo_toml: pathlib.Path) -> dict:
-     text = cargo_toml.read_text(encoding="utf-8")
-     pkg = section(text, "package")
+     data = tomllib.loads(cargo_toml.read_text(encoding="utf-8"))
+     pkg = data.get("package", {})
+
+     def resolve(key, default=""):
+         v = pkg.get(key)
+         if isinstance(v, dict) and v.get("workspace") is True:
+             return ws_pkg.get(key, default)
+         return v if isinstance(v, str) else default
+
      return {
-         "name": field_in(pkg, "name"),
-         "version": field_in(pkg, "version") or ws_version,
-         "description": field_in(pkg, "description"),
-         # Cargo allows `license.workspace = true` for
-         # inheritance — when the crate doesn't pin a string
-         # license, fall through to the workspace value.
-         "license": field_in(pkg, "license") or ws_license,
+         "name": resolve("name"),
+         "version": resolve("version", ws_version),
+         "description": resolve("description"),
+         "license": resolve("license", ws_license),
      }
```

```
| `resq-software/dotnet-sdk` | C# | DefaultDocumentation | `api-docs.dotnet.yml` |
| `resq-software/pypi` | Python | lazydocs | `api-docs.python.yml` |
| `resq-software/pypi` | Python | pydoc-markdown | `api-docs.python.yml` |
| `resq-software/crates` | Rust | README + docs.rs links | `api-docs.rust.yml` |
```
Format filename as code in the tooling table.

At line 69, README should use inline code formatting to match doc conventions for filenames.

Proposed fix:

```diff
-| `resq-software/crates` | Rust | README + docs.rs links | `api-docs.rust.yml` |
+| `resq-software/crates` | Rust | `README` + docs.rs links | `api-docs.rust.yml` |
```

As per coding guidelines, "Use code formatting for file names, commands, and paths."
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.
In `@automation/source-repo-templates/README.md` at line 69, Update the tooling
table row that currently lists README so the filename is formatted as inline
code; locate the row containing "`resq-software/crates`" and change the plain
README entry to use code formatting (e.g., backtick-wrapped README) so filenames
in automation/source-repo-templates/README.md follow the project's
code-formatting convention.
```mdx
<Note>
  Each package's headers live under `packages/<pkg>/include/resq/`.
  Detailed type and function reference is regenerated from the headers'
  Doxygen comments and lives under Generated Package References → C++.
```
Bold the navigation label used as a UI reference.

At line 20, format the navigation label as bold text.

Proposed fix:

```diff
-  Doxygen comments and lives under Generated Package References → C++.
+  Doxygen comments and lives under **Generated Package References → C++**.
```

As per coding guidelines, "Bold UI element references (e.g., Click Settings)."
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.
In `@sdks/cpp.mdx` at line 20, The navigation label "Generated Package References
→ C++" should be formatted as a bold UI reference; update the text string
"Doxygen comments and lives under Generated Package References → C++." so that
the UI element portion reads **Generated Package References → C++** (i.e., wrap
that navigation label in markdown bold markers) to match the guideline for
bolding UI element references.
Summary
Brings two more source repos into the same auto-doc pipeline as TypeScript/Python/.NET:
Why stubs for Rust instead of a full rustdoc dump
docs.rs already builds and hosts rustdoc for every published crate, with cross-crate linking, source-code view, and version selection. Duplicating that into Mintlify would split source-of-truth and lose features. The stub approach surfaces each crate consistently inside the docs site (with its README and metadata) and routes users to the canonical reference for the API itself.
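A stub page under this approach might look like the following (crate name, version, and frontmatter fields are illustrative, not actual output):

```mdx
---
title: "resq-core"
description: "Rust crate · v1.2.3 · Apache-2.0"
---

The full API reference (rustdoc) for this crate is hosted on
[docs.rs](https://docs.rs/resq-core), the canonical source.

{/* embedded crate README is spliced in below */}
```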
Why Doxygen + moxygen for C++
C++ doesn't have a ubiquitous public docs host equivalent to docs.rs. Doxygen + moxygen is the established C++ → markdown path, and the headers in `vcpkg/packages/*/include/` already carry Doxygen comments (`/** @brief ... */` style).
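A minimal sketch of that generate step, run against one package's headers (paths, Doxygen options, and the demo header are illustrative; the template's real configuration may differ):

```shell
#!/usr/bin/env bash
set -euo pipefail

# A demo header so Doxygen has something to document.
mkdir -p packages/example/include
cat > packages/example/include/widget.h <<'EOF'
/** @brief A demo class so Doxygen has something to document. */
class Widget {};
EOF

# Minimal Doxyfile: XML only, no HTML/LaTeX output.
cat > Doxyfile <<'EOF'
GENERATE_XML   = YES
GENERATE_HTML  = NO
GENERATE_LATEX = NO
XML_OUTPUT     = xml
INPUT          = packages/example/include
RECURSIVE      = YES
EOF

# Guard so the sketch degrades gracefully where the tools are absent.
if command -v doxygen >/dev/null && command -v moxygen >/dev/null; then
  doxygen Doxyfile            # headers -> xml/
  moxygen --anchors --output api.md xml/   # xml/ -> Markdown
fi
```

The resulting Markdown is then MDX-escaped and written into `sdks/cpp/api/` by the rest of the pipeline.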
Other changes in this PR
Rollout plan
After merge:
Test plan
Summary by CodeRabbit
Release Notes
New Features
Documentation