
feat(azure-devops): batch work item downloads + buildUrl utility #20

Merged
genesiscz merged 14 commits into master from feat/azure-devops-batches
Feb 13, 2026

Conversation

@genesiscz
Owner

@genesiscz genesiscz commented Feb 10, 2026

Summary

  • Add shared buildUrl() utility (src/utils/url.ts) and static URL helpers on Api class (witUrl, orgUrl, projectApiUrl, etc.)
  • Refactor ~25 inline URL template literals across api.ts, history-search.ts, timelog-api.ts, and timelog/configure.ts to use the new helpers
  • Replace sequential per-item getWorkItem() calls with batchGetFullWorkItems() (single REST call per 200 items) + batchGetComments() (parallel with concurrency=5)
  • Remove autoFetchHistory from query downloads (was scanning entire project history page-by-page; history remains available via tools azure-devops history sync)
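
For readers skimming the diff, the helper's intended shape can be sketched standalone. This is a hedged reimplementation for illustration only; the real code lives in src/utils/url.ts and also handles keepTrailingSlash and absolute-URL segments:

```typescript
// Illustrative sketch of the buildUrl() shape described above (not the PR's code).
type QueryParams = Record<string, string | string[] | undefined>;

interface BuildUrlOptions {
    base: string;
    segments?: string[];
    queryParams?: QueryParams;
}

function buildUrl({ base, segments = [], queryParams }: BuildUrlOptions): string {
    const [basePath, existingQuery] = base.split("?");
    const path = [basePath, ...segments]
        .map((s) => s.replace(/^\/+|\/+$/g, "")) // trim only leading/trailing slashes
        .filter(Boolean)
        .join("/");
    const search = new URLSearchParams(existingQuery ?? "");
    for (const [key, value] of Object.entries(queryParams ?? {})) {
        if (value === undefined) continue;
        if (Array.isArray(value)) value.forEach((v) => search.append(key, v));
        else search.set(key, value);
    }
    const qs = search.toString();
    return qs ? `${path}?${qs}` : path;
}

// The "//" in the protocol sits mid-string, so the trim regex leaves it alone:
console.log(
    buildUrl({
        base: "https://dev.azure.com/org",
        segments: ["proj/", "/_apis/wit/workitems"],
        queryParams: { "api-version": "7.1" },
    }),
); // → https://dev.azure.com/org/proj/_apis/wit/workitems?api-version=7.1
```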

Test plan

  • tsgo --noEmit passes (only pre-existing error in history-search.ts:122)
  • tools azure-devops query "<query-url>" --download-workitems --category test --force --verbose — batch fetch logs visible
  • Re-run without --force — cached items skipped
  • tools azure-devops workitem <id> — single-item still works
  • tools azure-devops history sync --batch — still works

Summary by CodeRabbit

  • Bug Fixes

    • Improved work item fetching performance through batch processing instead of sequential API calls.
  • Breaking Changes

    • Removed --without-history CLI option from the query command; history is now automatically downloaded when downloading work items.

Copilot AI review requested due to automatic review settings February 10, 2026 23:46
@coderabbitai
Contributor

coderabbitai bot commented Feb 10, 2026


📝 Walkthrough

Introduces centralized URL builder helpers in the Api class for consistent Azure DevOps API endpoint construction. Adds a new URL utility module for URL and query parameter handling. Refactors work item fetching into a multi-phase batch process. Removes the --without-history CLI option and related auto-fetch functionality from the query command.

Changes

  • URL Builder Infrastructure (src/azure-devops/api.ts): Adds six new static URL builder methods (witUrl, witUrlPreview, projectApiUrl, orgUrl, orgUrlRaw, workItemWebUrl) and a filterParams helper to centralize API endpoint construction with consistent versioning and parameter handling.
  • URL Utility Module (src/utils/url.ts): New utility module providing buildUrl, appendQueryParamsToSearchParams, and withQueryParams functions for composable URL construction with query parameter support.
  • Commands Refactored for Centralized URLs (src/azure-devops/commands/history-search.ts, src/azure-devops/commands/workitem.ts): Replaced ad-hoc URL construction with the new Api static helpers; workitem.ts additionally reworks fetching into three phases: cache-aware preparation, batch retrieval, and persistence.
  • Query Command Simplified (src/azure-devops/commands/query.ts): Removed the --without-history CLI option, withoutHistory parameter, and auto-fetch history logic; history is now always fetched when work items are downloaded.
  • TimeLog API Integration (src/azure-devops/commands/timelog/configure.ts, src/azure-devops/timelog-api.ts): Replaced string concatenation with buildUrl-based URL construction in TimeLogApi and the fetchFunctionsKey function.

Sequence Diagram(s)

sequenceDiagram
    participant Cmd as Query Command
    participant Cache as Cache System
    participant WI as Work Item Handler
    participant API as Azure DevOps API
    participant FS as File System

    Cmd->>WI: handleWorkItem(ids, options)
    
    Note over WI: Phase 1: Preparation
    WI->>Cache: Check cached items & TTL
    Cache-->>WI: needsFetch, cachedResults
    WI->>FS: Check existing file paths
    FS-->>WI: existingFilePaths

    Note over WI: Phase 2: Batch Fetch
    alt Has items to fetch
        WI->>API: batchGetFullWorkItems(needsFetch)
        API-->>WI: workItemsData
        WI->>API: batchGetComments(needsFetch)
        API-->>WI: commentsData
        WI->>WI: Construct WorkItemFull objects
    end

    Note over WI: Phase 3: Persist & Cache
    WI->>FS: Save fetched items to per-ID paths
    WI->>FS: Clean up old files if paths changed
    WI->>Cache: Update with new category/taskFolder
    Cache-->>WI: Cache updated

    WI->>WI: Merge cached & fetched results
    WI-->>Cmd: Final results in original order

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Possibly related PRs

  • genesiscz/GenesisTools#13: Directly refactored the Api class with core request/get/post methods that these new URL builders extend and rely upon.

Poem

🐰 URLs once scattered, now bundled tight,
Batches replace the back-and-forth flight,
A rabbit rejoices—no more tangled strings,
Just builders and phases for cleaner things! 🌿

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)
  • Docstring Coverage: ⚠️ Warning. Docstring coverage is 50.00%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.

✅ Passed checks (2 passed)
  • Description Check: ✅ Passed. Check skipped because CodeRabbit's high-level summary is enabled.
  • Title check: ✅ Passed. The title accurately captures the main changes: adding batch work item downloads and a buildUrl utility, which are the primary objectives of this PR.


@gemini-code-assist
Contributor

Summary of Changes

Hello @genesiscz, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request focuses on enhancing the performance and code quality of Azure DevOps API interactions. By centralizing URL construction into a new utility and static helper methods, the codebase becomes more consistent and easier to manage. The introduction of batch processing for work item downloads drastically improves efficiency, especially when dealing with large sets of items. Additionally, the removal of automatic history fetching during query operations provides a clearer separation of concerns and better control over data synchronization.

Highlights

  • URL Utility and Refactoring: Introduced a new shared buildUrl() utility and static URL helper methods within the Api class to standardize Azure DevOps API endpoint construction. This refactored approximately 25 inline URL template literals across various files, improving maintainability and reducing potential errors.
  • Efficient Work Item Downloads: Replaced sequential getWorkItem() calls with optimized batch fetching methods (batchGetFullWorkItems() and batchGetComments()). This significantly reduces the number of API requests by fetching up to 200 work items in a single call and comments in parallel batches, leading to faster download times.
  • Removed Automatic History Fetching: The autoFetchHistory mechanism has been removed from query downloads. History fetching is now explicitly handled via the tools azure-devops history sync command, streamlining the query download process and preventing potentially long-running operations.
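
The batch strategy above can be sketched as a chunked loop. This is an assumption-laden outline, not the PR's code: postJson is a hypothetical HTTP helper, and the 200-item chunk size matches the limit of the Azure DevOps workitemsbatch endpoint the summary refers to:

```typescript
// Illustrative sketch: split IDs into chunks of 200 and issue one batch
// request per chunk instead of one request per work item.
const BATCH_SIZE = 200;

function chunk<T>(items: T[], size: number): T[][] {
    const out: T[][] = [];
    for (let i = 0; i < items.length; i += size) {
        out.push(items.slice(i, i + size));
    }
    return out;
}

async function batchGetWorkItems(
    orgUrl: string,
    project: string,
    ids: number[],
    postJson: (url: string, body: unknown) => Promise<{ value: unknown[] }>, // hypothetical helper
): Promise<unknown[]> {
    const results: unknown[] = [];
    for (const batch of chunk(ids, BATCH_SIZE)) {
        // One REST call per 200 items.
        const res = await postJson(
            `${orgUrl}/${project}/_apis/wit/workitemsbatch?api-version=7.1`,
            { ids: batch, $expand: "relations" },
        );
        results.push(...res.value);
    }
    return results;
}
```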


Changelog
  • src/azure-devops/api.ts
    • Imported buildUrl utility.
    • Added static URL builder methods (witUrl, witUrlPreview, projectApiUrl, orgUrl, orgUrlRaw, workItemWebUrl) to the Api class.
    • Refactored numerous inline URL constructions to utilize the new static URL builder methods.
    • Implemented batchGetFullWorkItems for efficient bulk retrieval of work item fields and relations.
    • Implemented batchGetComments for parallel fetching of comments for multiple work items.
  • src/azure-devops/commands/history-search.ts
    • Refactored URL construction in fetchWorkItemsBatch to use Api.orgUrl and Api.workItemWebUrl.
  • src/azure-devops/commands/query.ts
    • Removed imports related to saveHistoryCache and buildHistoryFromRevisions.
    • Removed the withoutHistory option from QueryOptions and handleQuery function signature.
    • Deleted the autoFetchHistory function and its invocation, discontinuing automatic history downloads during query operations.
  • src/azure-devops/commands/timelog/configure.ts
    • Imported buildUrl utility.
    • Refactored the az rest URI for fetching functions key to use the new buildUrl utility.
  • src/azure-devops/commands/workitem.ts
    • Removed CACHE_TTL import.
    • Modified handleWorkItem to use a three-phase process for work item retrieval: cache check, batch fetch, and save to disk.
    • Replaced individual api.getWorkItem calls with api.batchGetFullWorkItems and api.batchGetComments for fetching work items and their comments.
    • Adjusted cache update logic to align with the new batch fetching mechanism.
  • src/azure-devops/timelog-api.ts
    • Imported buildUrl utility.
    • Refactored baseUrl construction to use buildUrl.
    • Refactored request method's URL construction to use buildUrl.
    • Updated queryTimeLogs to construct its endpoint using buildUrl with query parameters.
  • src/utils/url.ts
    • Added a new utility file.
    • Defined QueryParamValue and QueryParams types.
    • Exported appendQueryParamsToSearchParams function for adding parameters to URLSearchParams.
    • Exported buildUrl function for constructing URLs from a base, segments, and query parameters.
    • Exported withQueryParams function for adding query parameters to an existing URL string.
Activity
  • No human activity has occurred on this pull request yet.

Contributor

@gemini-code-assist gemini-code-assist bot left a comment

Code Review

This pull request introduces a valuable buildUrl utility to centralize and simplify URL construction, which has been effectively used to refactor numerous hardcoded URLs across the codebase. The introduction of static URL builder methods in the Api class further improves code clarity and maintainability. A major improvement is the switch to batch fetching for work items and their comments, which should significantly boost performance by reducing the number of sequential API calls. The code is well-structured, especially the phased approach in handleWorkItem.

I've found a critical issue in the new buildUrl utility that could lead to invalid URLs when a base URL with a protocol is used. I've also noted a minor issue with the new, but currently unused, withQueryParams function. Apart from these points, the changes are excellent and a great step forward for the project.

Comment on lines +43 to +46
let normalizedPath = [basePath, ...normalizedSegments]
.map((segment) => segment.replace(/^\/+|\/+$/g, ""))
.filter(Boolean)
.join("/");
Contributor

critical

The current implementation of normalizedPath has a bug when basePath includes a protocol (e.g., https://...). The regex segment.replace(/^\/+|\/+$/g, "") will incorrectly transform https:// into https:/. This can lead to invalid URLs being generated.

To fix this, the basePath should be handled separately to avoid corrupting the protocol, by only removing trailing slashes from it before joining with the other path segments.

    const processedBasePath = basePath.replace(/\/+$/, "");
    const processedSegments = normalizedSegments
        .map((segment) => segment.replace(/^\/+|\/+$/g, ""))
        .filter(Boolean);
    let normalizedPath = [processedBasePath, ...processedSegments].join("/");

Owner Author

@gemini Does it make sense, given that basePath is meant to be either a full URL or a relative path within the URL? In what world would we use https:// as basePath, when the segment needs to be the part between the slashes after the domain?

Owner Author

@coderabbitai Does it make sense, given that basePath is meant to be either a full URL or a relative path within the URL? In what world would we use https:// as basePath, when the segment needs to be the part between the slashes after the domain?

Contributor

@genesiscz You're absolutely right to question this concern. The bot's analysis is incorrect.

The regex /^\/+|\/+$/g only removes slashes at the start (^\/+) or end (\/+$) of a string. The // in https:// is in the middle of the string, so it won't be touched by this pattern.

Here's what actually happens:

// Base path with protocol - the protocol is preserved
"https://example.com".replace(/^\/+|\/+$/g, "")  
// Result: "https://example.com" ✓

"https://example.com/api/".replace(/^\/+|\/+$/g, "")
// Result: "https://example.com/api" ✓

// Only leading/trailing slashes are removed
"/path/to/resource/".replace(/^\/+|\/+$/g, "")
// Result: "path/to/resource" ✓

Your design is sound: basePath is either a full URL (e.g., https://dev.azure.com/org) or a relative path, and the segments are the path parts that follow. The regex correctly handles both cases without corrupting the protocol.

Contributor

You are absolutely right, genesiscz, and coderabbitai[bot] has correctly identified the flaw in my initial analysis. My apologies for the incorrect assessment.

The regex segment.replace(/^\/+|\/+$/g, "") indeed only targets leading and trailing slashes. The // within a protocol like https:// would not be affected, meaning https:// would remain https:// and not be incorrectly transformed to https:/.

Your design for basePath being a full URL or relative path, with segments representing parts after the domain, is correctly handled by the current buildUrl implementation. The protocol in basePath is preserved as intended.

Thank you for pointing out this error and for the clarification, genesiscz and coderabbitai[bot].

Contributor

Skipped: comment is from another GitHub bot.

Comment on lines +58 to +62
export function withQueryParams(urlString: string, params: QueryParams): string {
const url = new URL(urlString);
appendQueryParamsToSearchParams(params, url.searchParams);
return url.toString();
}
Contributor

medium

This function is not used anywhere in the pull request. Additionally, it has a potential flaw: it uses new URL(urlString), which will throw an error if urlString is a relative path (e.g., /api/users).

If this function is intended for future use, consider making it robust enough to handle relative paths, or add documentation to clarify that it only accepts absolute URLs. Since it's currently unused, another option is to remove it to avoid shipping dead code.

Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

🤖 Fix all issues with AI agents
In `@src/azure-devops/api.ts`:
- Around line 442-471: The batchGetComments function currently uses Promise.all
so a single failed fetch aborts the whole batch; change the inner concurrency
loop to use Promise.allSettled for the array of promises created for each batch,
then iterate the settled results and for each fulfilled entry extract and set
result.set(id, comments) and logger.debug the count, while for each rejected
entry log the error with context (include the id and error) and skip adding that
id; keep the existing mapping logic that builds Comment[] from commentsData and
ensure the final logger.debug(`[api] Batch fetched comments for ${result.size}
work items`) remains.
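
A minimal sketch of that suggestion, with fetchComments standing in for the real per-item request and Comment[] simplified to string[] (illustrative names, not the repository's code):

```typescript
// Sketch of the suggested fix: per-batch Promise.allSettled, so one failed
// comment fetch is logged and skipped instead of rejecting the whole batch.
async function batchGetComments(
    ids: number[],
    fetchComments: (id: number) => Promise<string[]>, // stand-in for the real request
    concurrency = 5,
): Promise<Map<number, string[]>> {
    const result = new Map<number, string[]>();
    for (let i = 0; i < ids.length; i += concurrency) {
        const batch = ids.slice(i, i + concurrency);
        const settled = await Promise.allSettled(batch.map((id) => fetchComments(id)));
        settled.forEach((outcome, idx) => {
            const id = batch[idx];
            if (outcome.status === "fulfilled") {
                result.set(id, outcome.value);
            } else {
                // Log with context and skip this id instead of aborting everything.
                console.error(`[api] comments fetch failed for ${id}:`, outcome.reason);
            }
        });
    }
    return result;
}
```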

In `@src/azure-devops/commands/workitem.ts`:
- Around line 264-288: The parallel Promise.all call to
api.batchGetFullWorkItems and api.batchGetComments can cause all work if
batchGetComments rejects; to fix, call batchGetFullWorkItems and
batchGetComments so that failures in comments do not abort the whole phase:
await api.batchGetFullWorkItems(needsFetch) normally, then call
api.batchGetComments(needsFetch) inside its own try/catch (or use
Promise.allSettled) and on failure set batchComments to an empty Map so the loop
that builds fetchedItems (using needsFetch, batchFields, batchComments,
fetchedItems, downloadedCount) continues and assigns comments =
batchComments.get(id) ?? [] for each id. Ensure you reference and preserve the
existing variable names (batchFields, batchComments, fetchedItems, needsFetch,
downloadedCount) when implementing this defensive handling.
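
A sketch of that defensive ordering, using the PR's method names on a hypothetical api parameter (illustrative types, not the repository's code):

```typescript
// Fetch fields first; treat a comments failure as non-fatal by falling back
// to an empty Map, so the item-building loop can still run.
async function fetchPhase(
    needsFetch: number[],
    api: {
        batchGetFullWorkItems(ids: number[]): Promise<Map<number, object>>;
        batchGetComments(ids: number[]): Promise<Map<number, string[]>>;
    },
) {
    const batchFields = await api.batchGetFullWorkItems(needsFetch);
    let batchComments = new Map<number, string[]>();
    try {
        batchComments = await api.batchGetComments(needsFetch);
    } catch (err) {
        console.error("[workitem] comments batch failed, continuing without comments:", err);
    }
    // Comments default to [] for any id that failed or was skipped.
    return needsFetch.map((id) => ({
        id,
        fields: batchFields.get(id),
        comments: batchComments.get(id) ?? [],
    }));
}
```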
🧹 Nitpick comments (7)
src/utils/url.ts (2)

15-24: Inconsistent set vs append can silently drop existing array params.

For string values, searchParams.set(key, value) replaces any existing entries for key. For array values, searchParams.append(key, v) adds to existing entries without clearing them. If searchParams already contains entries for a given array key (e.g., from the base URL's query string), append will duplicate rather than replace them, while set for strings will replace. This asymmetry could surprise callers.

Consider either documenting this clearly or always clearing the key before appending array values:

♻️ Suggested fix for array params
         if (Array.isArray(value)) {
+            searchParams.delete(key);
             value.forEach((v) => searchParams.append(key, v));
         } else {
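
The asymmetry is standard URLSearchParams behavior and easy to demonstrate in isolation:

```typescript
// set() replaces any existing entries for the key...
const sp = new URLSearchParams("tag=old");
sp.set("tag", "new");
console.log(sp.toString()); // "tag=new"

// ...while append() adds alongside whatever the base URL already carried.
const sp2 = new URLSearchParams("tag=old");
["a", "b"].forEach((v) => sp2.append("tag", v));
console.log(sp2.toString()); // "tag=old&tag=a&tag=b"

// The suggested fix: delete the key before appending array values.
const sp3 = new URLSearchParams("tag=old");
sp3.delete("tag");
["a", "b"].forEach((v) => sp3.append("tag", v));
console.log(sp3.toString()); // "tag=a&tag=b"
```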

28-56: base.split("?") can lose query fragments if the original query string contains encoded ? characters.

Using base.split("?") and destructuring only the first two elements means that any literal ? in the query string portion (e.g., in a parameter value) would cause the query to be truncated. For well-formed URLs this is unlikely, but a safer approach would be to split on the first ? only:

const idx = base.indexOf("?");
const basePath = idx === -1 ? base : base.slice(0, idx);
const existingQuery = idx === -1 ? undefined : base.slice(idx + 1);

Also, Line 41 reads from the original segments array for trailing-slash detection, while Line 43 joins normalizedSegments. If a segment was a full https://... URL and was converted to its pathname, the trailing-slash check on the original segment may differ from the actual normalized value. This is a minor edge case but worth a comment.

src/azure-devops/commands/timelog/configure.ts (1)

37-38: Existing code uses console.error instead of the centralized logger.

These are pre-existing (not part of this PR's changes), but a few console.error calls in loadExistingConfig and extractOrgName could be migrated to the centralized logger for consistency. No action needed for this PR.

As per coding guidelines, "Use the centralized logger from src/logger.ts instead of console.log".

Also applies to: 44-45, 60-61

src/azure-devops/timelog-api.ts (1)

156-167: Query params are baked into the endpoint path string, then re-parsed by buildUrl in request().

buildUrl here produces a string like timelog/query?FromDate=...&ToDate=..., which is then passed to this.request() as the endpoint. Inside request() (Line 37), another buildUrl call treats this entire string (including the ? and params) as a path segment. This works by coincidence — the ? and everything after it survives as part of the raw path string — but the query params aren't handled through buildUrl's queryParams mechanism in the outer call.

This is fragile: if this.baseUrl ever contained its own query params, they'd be handled separately by the outer buildUrl and the inline ? in the segment would produce a malformed URL with two ? characters in the path.

Consider moving the queryParams to the request() method signature or constructing the full URL once instead of nesting buildUrl calls.

src/azure-devops/api.ts (3)

394-437: Field-to-WorkItemFull mapping is duplicated across batchGetFullWorkItems, getWorkItem, and createWorkItem.

Lines 414-431 duplicate the same field extraction logic found in getWorkItem (Lines 367-391) and createWorkItem (Lines 549-565). If a new field is added or the mapping changes, all three locations must be updated in sync.

Consider extracting a shared helper like mapRawFieldsToWorkItem(id, rev, fields, relations, url) to DRY this up.


491-495: Direct buildUrl call instead of a static helper for dashboard widgets URL.

This is the only place inside Api that bypasses the static URL builders and calls buildUrl directly. The reason (dynamic groupPath containing project/group IDs) is valid, but a brief inline comment would help future readers understand why this doesn't use projectApiUrl.


86-93: Caller-supplied queryParams can silently override api-version.

In witUrl and witUrlPreview, the spread ...Api.filterParams(queryParams) comes after "api-version", so a caller passing { "api-version": "6.0" } would silently override the intended default. This is unlikely today but could cause hard-to-debug issues later.

Consider either documenting this as intentional or deleting api-version from the caller's params before merging:

const filtered = Api.filterParams(queryParams);
delete filtered?.["api-version"];

Also applies to: 99-106
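
The underlying hazard is plain object-spread semantics, shown here with illustrative values:

```typescript
// Later spread entries win in object literals, so caller-supplied params can
// silently replace the intended default api-version.
const defaults: Record<string, string> = { "api-version": "7.1" };
const fromCaller: Record<string, string> = { "api-version": "6.0", $expand: "relations" };

const merged = { ...defaults, ...fromCaller };
console.log(merged["api-version"]); // "6.0": the caller silently wins
```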

Copilot AI left a comment

Pull request overview

This pull request refactors URL construction across the Azure DevOps integration and introduces batch work item downloading to improve performance.

Changes:

  • Adds a shared buildUrl() utility function with proper segment normalization and query parameter handling
  • Introduces static URL helper methods on the Api class (witUrl, orgUrl, projectApiUrl, etc.) to standardize URL construction
  • Replaces sequential per-item getWorkItem() calls with batchGetFullWorkItems() (single REST call per 200 items) and batchGetComments() (parallel with concurrency=5)
  • Removes autoFetchHistory functionality from query downloads (history remains available via dedicated history sync command)

Reviewed changes

Copilot reviewed 7 out of 7 changed files in this pull request and generated 6 comments.

Show a summary per file

  • src/utils/url.ts: New utility module providing buildUrl(), appendQueryParamsToSearchParams(), and withQueryParams() functions for URL construction
  • src/azure-devops/api.ts: Adds static URL builder methods and new batch fetch methods (batchGetFullWorkItems, batchGetComments); refactors ~15 URL constructions to use the new helpers
  • src/azure-devops/commands/workitem.ts: Refactors download logic into 3 phases (cache check, batch fetch, save) to leverage the new batch methods; removes the unused CACHE_TTL import
  • src/azure-devops/commands/query.ts: Removes autoFetchHistory functionality and the --without-history option; removes unused imports
  • src/azure-devops/timelog-api.ts: Refactors URL construction in the constructor and queryTimeLogs() to use buildUrl()
  • src/azure-devops/commands/timelog/configure.ts: Refactors the Azure Functions API URL construction to use buildUrl()
  • src/azure-devops/commands/history-search.ts: Updates fetchWorkItemsBatch() to use the Api.orgUrl() and Api.workItemWebUrl() static methods


return searchParams;
}

export function buildUrl({ base, segments = [], queryParams, keepTrailingSlash = true }: BuildUrlOptions): string {
Copilot AI Feb 10, 2026

The buildUrl function has keepTrailingSlash defaulting to true, which preserves trailing slashes from the last segment. However, most Azure DevOps API URLs should not have trailing slashes. While the current usage patterns don't seem to include trailing slashes in segments, the default behavior could be unexpected. Consider documenting this behavior more clearly or changing the default to false, which is more common for API URLs.

Suggested change
export function buildUrl({ base, segments = [], queryParams, keepTrailingSlash = true }: BuildUrlOptions): string {
/**
* Build a URL from a base, optional path segments, and query parameters.
*
* By default, the resulting URL will NOT preserve a trailing slash from the last
* segment or base path (`keepTrailingSlash` defaults to `false`), which aligns
* with typical Azure DevOps API URL conventions. Set `keepTrailingSlash` to
* `true` to preserve a trailing slash when present in the last segment.
*/
export function buildUrl({ base, segments = [], queryParams, keepTrailingSlash = false }: BuildUrlOptions): string {

Copilot uses AI. Check for mistakes.
Comment on lines +156 to +165
const endpoint = buildUrl({
base: "/timelog/query",
queryParams: {
FromDate: params.FromDate,
ToDate: params.ToDate,
projectId: params.projectId,
workitemId: params.workitemId ? String(params.workitemId) : undefined,
userId: params.userId,
},
});
Copilot AI Feb 10, 2026

In queryTimeLogs, buildUrl is called with base="/timelog/query" to add query parameters, creating a relative URL (e.g., "/timelog/query?FromDate=..."). This is then passed to the request method which calls buildUrl again with this relative URL as a segment. While this works due to buildUrl's path normalization, it's unnecessarily complex and could be confusing. Consider either: (1) building the full URL once in queryTimeLogs, or (2) passing queryParams directly to the request method and having it handle the URL construction.

Suggested change
const endpoint = buildUrl({
base: "/timelog/query",
queryParams: {
FromDate: params.FromDate,
ToDate: params.ToDate,
projectId: params.projectId,
workitemId: params.workitemId ? String(params.workitemId) : undefined,
userId: params.userId,
},
});
const searchParams = new URLSearchParams();
if (params.FromDate) searchParams.set("FromDate", params.FromDate);
if (params.ToDate) searchParams.set("ToDate", params.ToDate);
if (params.projectId) searchParams.set("projectId", params.projectId);
if (params.workitemId !== undefined && params.workitemId !== null) {
searchParams.set("workitemId", String(params.workitemId));
}
if (params.userId) searchParams.set("userId", params.userId);
const queryString = searchParams.toString();
const endpoint = queryString ? `/timelog/query?${queryString}` : "/timelog/query";

Comment on lines +28 to +55
export function buildUrl({ base, segments = [], queryParams, keepTrailingSlash = true }: BuildUrlOptions): string {
const [basePath, existingQuery] = base.split("?");
const hasBase = basePath.trim().length > 0;
const normalizedSegments = segments.map((segment) => {
if (hasBase && (segment.startsWith("http://") || segment.startsWith("https://"))) {
try {
return new URL(segment).pathname;
} catch {
return segment;
}
}
return segment;
});
const lastSegment = segments.length > 0 ? segments[segments.length - 1] : basePath;
const hadTrailingSlash = lastSegment.endsWith("/");
let normalizedPath = [basePath, ...normalizedSegments]
.map((segment) => segment.replace(/^\/+|\/+$/g, ""))
.filter(Boolean)
.join("/");
if (keepTrailingSlash && hadTrailingSlash && normalizedPath.length > 0) {
normalizedPath += "/";
}
const searchParams = new URLSearchParams(existingQuery ?? "");
if (queryParams && Object.keys(queryParams).length > 0) {
appendQueryParamsToSearchParams(queryParams, searchParams);
}
const queryString = searchParams.toString();
return queryString ? `${normalizedPath}?${queryString}` : normalizedPath;
Copilot AI Feb 10, 2026

The buildUrl function has a potential issue when the base path is empty or consists only of whitespace. On line 30, hasBase checks if basePath.trim().length > 0, but this check is only used to determine whether to extract pathnames from URL-like segments (lines 32-38). However, on line 43, if basePath is an empty string or whitespace, it will still be included in the array passed to map and filter. While filter(Boolean) will remove empty strings, this logic could be more explicit. Consider validating that base is not empty or documenting that empty base is supported.

Comment on lines +28 to +55

Copilot AI Feb 10, 2026

The buildUrl function returns paths without protocols (e.g., "org/project/_apis/wit/...?api-version=7.1"). This works for the current usage where config.org already includes the full URL (e.g., "https://dev.azure.com/orgname"), but could be confusing if someone passes just a domain name as the base. Consider adding a comment documenting that base should include the protocol (https://) or adding validation to ensure the returned URL is valid.
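One way to make that expectation explicit is a small guard run before building. This is a hypothetical sketch of the validation the comment suggests, not code from the PR:

```typescript
// Reject bases that lack a protocol, so callers can't pass a bare domain
// and get back a relative-looking path by accident.
function assertAbsoluteBase(base: string): void {
  if (!/^https?:\/\//.test(base)) {
    throw new Error(`buildUrl base must include a protocol, got: ${base}`);
  }
}

assertAbsoluteBase("https://dev.azure.com/orgname"); // ok
// assertAbsoluteBase("dev.azure.com/orgname");      // would throw
```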

Comment on lines +442 to +467
async batchGetComments(ids: number[], concurrency = 5): Promise<Map<number, Comment[]>> {
const result = new Map<number, Comment[]>();

for (let i = 0; i < ids.length; i += concurrency) {
const batch = ids.slice(i, i + concurrency);
const promises = batch.map(async (id) => {
const commentsUrl = Api.witUrlPreview(this.config, ["workItems", String(id), "comments"]);
const commentsData = await this.get<{
comments: Array<{ id: number; createdBy: { displayName: string }; createdDate: string; text: string }>;
}>(commentsUrl, `comments for #${id}`);

const comments: Comment[] = (commentsData.comments || []).map((c) => ({
id: c.id,
author: c.createdBy?.displayName,
date: c.createdDate,
text: c.text,
}));
return { id, comments };
});

const results = await Promise.all(promises);
for (const { id, comments } of results) {
result.set(id, comments);
logger.debug(`[api] Work item #${id} has ${comments.length} comments`);
}
}

Copilot AI Feb 10, 2026

The batchGetComments method doesn't handle errors for individual work item comment fetches. If one work item's comment fetch fails (e.g., due to permissions or non-existent work item), the entire Promise.all will reject and no comments will be returned for any work items in that batch. Consider wrapping each fetch in a try-catch to handle individual failures gracefully and continue processing other work items, logging warnings for failed fetches.
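A sketch of the per-item error handling the review proposes, with the fetcher injected as a parameter so the example is self-contained (`fetchComments` stands in for the real REST call; this is not the PR's implementation):

```typescript
type Comment = { id: number; text: string };

// Same chunked-concurrency shape as batchGetComments, but each fetch is
// wrapped in try/catch so one 403 or missing work item degrades to an
// empty comment list instead of rejecting the whole batch.
async function batchGetCommentsSafe(
  ids: number[],
  fetchComments: (id: number) => Promise<Comment[]>,
  concurrency = 5,
): Promise<Map<number, Comment[]>> {
  const result = new Map<number, Comment[]>();
  for (let i = 0; i < ids.length; i += concurrency) {
    const batch = ids.slice(i, i + concurrency);
    const settled = await Promise.all(
      batch.map(async (id) => {
        try {
          return { id, comments: await fetchComments(id) };
        } catch (err) {
          console.warn(`comments for #${id} failed, continuing:`, err);
          return { id, comments: [] as Comment[] };
        }
      }),
    );
    for (const { id, comments } of settled) result.set(id, comments);
  }
  return result;
}
```

With this shape, `Promise.all` never sees a rejection, so the remaining items in the batch still resolve.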

Comment on lines +270 to +273
const [batchFields, batchComments] = await Promise.all([
api.batchGetFullWorkItems(needsFetch),
api.batchGetComments(needsFetch),
]);

Copilot AI Feb 10, 2026

If the batchGetComments call fails entirely (line 272), the Promise.all will reject and no work items will be processed, even though batchGetFullWorkItems might have succeeded. Consider using Promise.allSettled instead of Promise.all to allow partial success - if comments fail to fetch, you could still save work items without comments and log a warning.
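A hedged sketch of that `Promise.allSettled` shape, with generic fetchers standing in for `api.batchGetFullWorkItems` and `api.batchGetComments` (the helper name and fallback behavior are assumptions, not the PR's code):

```typescript
// Fields are required, comments are best-effort: propagate a fields
// failure, but degrade a comments failure to null with a warning.
async function fetchWithPartialSuccess<F, C>(
  fetchFields: () => Promise<F>,
  fetchComments: () => Promise<C>,
): Promise<{ fields: F; comments: C | null }> {
  const [fieldsResult, commentsResult] = await Promise.allSettled([
    fetchFields(),
    fetchComments(),
  ]);
  if (fieldsResult.status === "rejected") throw fieldsResult.reason;
  if (commentsResult.status === "rejected") {
    console.warn("comment fetch failed; saving work items without comments");
    return { fields: fieldsResult.value, comments: null };
  }
  return { fields: fieldsResult.value, comments: commentsResult.value };
}
```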

genesiscz and others added 7 commits February 13, 2026 03:16
Add src/utils/json-schema.ts with inferSchema() and formatSchema()
supporting 3 output modes (skeleton, typescript, schema) and
compact/pretty formatting. Smart array merging, optional field
detection, null unions, and singularized TypeScript interface naming.

CLI tool: tools json-schema (reads from file or stdin).
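A toy illustration of the "skeleton" mode the commit describes, under the assumption that it maps values to their type names; the real `inferSchema` additionally does array merging, optional-field detection, and null unions:

```typescript
// Minimal skeleton inference: replace leaf values with their typeof name,
// recurse into objects, and sample the first element of arrays.
function inferSkeleton(value: unknown): unknown {
  if (value === null) return "null";
  if (Array.isArray(value)) return value.length ? [inferSkeleton(value[0])] : [];
  if (typeof value === "object") {
    return Object.fromEntries(
      Object.entries(value as Record<string, unknown>).map(
        ([k, v]): [string, unknown] => [k, inferSkeleton(v)],
      ),
    );
  }
  return typeof value; // "string" | "number" | "boolean"
}

console.log(JSON.stringify(inferSkeleton({ a: 1, b: ["x"], c: null })));
// {"a":"number","b":["string"],"c":"null"}
```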
Full HAR 1.2 analyzer with 19 commands, interactive mode, MCP server,
and a reference system that eliminates data repetition for LLM efficiency.

Core: parser, session manager, ref store, formatter, query engine.
Commands: load, dashboard, list, show, expand (--schema), domains,
domain, search, headers, waterfall, errors, security, size, redirects,
cookies, diff, export, sessions, mcp.

Also adds: src/utils/table.ts (general table formatter), matchGlob()
in string.ts, formatDuration rounding fix in format.ts, utility
convention in CLAUDE.md, and analyze-har skill for Claude.

docs(tools): Add documentation for HAR Analyzer, JSON Schema, and Claude plugins

- Add HAR Analyzer (#24) and JSON Schema (#25) to Available Tools table
- Add detailed Tool Details sections for both new tools
- Update Claude Code Plugin section with all 6 skills and 3 commands
- Add usage examples for every command and skill trigger phrases

feat(claude-resume, cc): cc session finder/tool

feat(claude-resume, claude-history): SQLite-cached session listing, import lib directly

Replace JSON file cache and subprocess spawning with:
- session_metadata table in claude-history SQLite cache (incremental, mtime-based)
- getSessionListing() in lib.ts with cached metadata extraction
- Direct import from @app/claude-history/lib instead of spawning `tools claude-history`
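The mtime-based incremental refresh above can be sketched as follows (the cache shape here is illustrative, not the actual `session_metadata` schema):

```typescript
import { statSync } from "node:fs";

type CachedMeta = { filePath: string; mtimeMs: number; title: string };

// Re-parse a JSONL file only if it is absent from the cache or has been
// modified on disk since the cached row was written.
function needsRefresh(cache: Map<string, CachedMeta>, filePath: string): boolean {
  const cached = cache.get(filePath);
  if (!cached) return true; // never seen before
  return statSync(filePath).mtimeMs > cached.mtimeMs; // changed on disk
}
```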

refactor(claude-resume): clean up to match repo standards

- Add error handling with try/catch + logger
- Extract main() from action handler
- Use extractProjectName from claude-history lib for robust project detection
- DRY up DisplaySession creation with shared toDisplay()
- Flatten 3-tier search into linear flow with single select+resume
- Extract magic numbers into named constants
- Add Options interface
- Add claude-resume and cc to README tools table

refactor(claude-resume): use formatRelativeTime from utils/format

Add compact option to formatRelativeTime ("5m ago", "3h ago", "2d ago")
and replace local formatDate in claude-resume with it.
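A hedged sketch of what that compact mode might look like; the thresholds and function shape are assumptions, not the actual `utils/format` implementation:

```typescript
// Compact relative time: minutes under an hour, hours under a day,
// days otherwise.
function formatRelativeTimeCompact(then: Date, now: Date = new Date()): string {
  const minutes = Math.floor((now.getTime() - then.getTime()) / 60_000);
  if (minutes < 60) return `${minutes}m ago`;
  const hours = Math.floor(minutes / 60);
  if (hours < 24) return `${hours}h ago`;
  return `${Math.floor(hours / 24)}d ago`;
}
```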

feat(claude-resume): show progress during session loading and content search

Add onProgress callback to getSessionListing and searchConversations.
Spinner now shows "42/318 (13%) a1b2c3d4..." while scanning files.

fix(claude-resume): scope file scanning to current project directory

- Add resolveProjectDir() + findConversationFilesInDir() for exact dir scoping
  instead of glob-based project matching
- Add getSessionMetadataByDir() SQL query scoped by file_path prefix
- Default mode scans only current project (~200 files), not all (~5800)
- --all-projects still scans everything

refactor: extract findClaudeCommand, detectCurrentProject to utils/claude

fix(claude-resume): scope content search to current project, show subagent counts

- searchByContent now passes project filter to searchConversations
- Spinner stop shows "201 sessions (+ 57 subagents, total 258)"
- getSessionListing returns SessionListingResult with total/subagent counts
- Add getSessionMetadataByDir for SQL-scoped queries by file path prefix

perf(claude-history): instant summary-only search via SQLite cache

When summaryOnly is set, searchConversations now queries the
session_metadata SQLite table directly instead of parsing every
JSONL file. Searches title, summary, and firstPrompt fields.
Reduces search from ~5s (parsing 200+ files) to <10ms.

fix(utils/claude): include cc in findClaudeCommand lookup

fix(claude-resume): use shell to resolve claude command aliases/functions

findClaudeCommand now spawns through $SHELL -ic to find shell
functions and aliases, and verifies output contains "Claude Code"
to avoid matching /usr/bin/cc (C compiler). resumeSession also
spawns through shell so ccc/cc aliases work at runtime.
… bugs

har-analyzer:
- Wire --format (md/json/toon) as working global flag via printFormatted
- Wire --session to pass through to requireSession in all 14 commands
- DRY: extract requireSession() to SessionManager, parseEntryIndex to query-engine
- Remove unimplemented MCP stubs (har_flow, har_diff), narrow har_analyze
- Fix isInterestingMimeType logic, safe URL parsing in cookies, remove dead types

json-schema:
- Fix parseJSON(null) treated as parse failure (use inline try/catch)
- Fix normalizeType losing union info during merge (hasType helper)
- Add --mode validation via Commander .choices()
- Export OutputMode and SchemaNode types from utils
- DRY TypeScript formatters via shared collectTypeScriptInterfaces
- Remove dead entriesByUrl map in redirects, use URL-parsed matching
- Clamp HAR bodySize/content.size to 0 (spec allows -1 for unknown)
- Fix double ellipsis in ref-store preview
- Add try/catch error handling in interactive mode
- Expand scanApiKeys to also check request headers
- Fix skeleton formatters to handle union types via hasType()
Fixes across multiple files from CodeRabbit/Copilot/Gemini reviews:

- README: capitalize GitHub branding in skills table
- SKILL.md: mark --review-comments as legacy, prefer --reviews
- claude-history: scan subagents/ subdirectory in findConversationFilesInDir
- github/review: swap -r to --respond, remove -r from --repo
- github/output: demote H1 to H2 when embedding review markdown
- github/review-output: H3 for file groups, fix reply preview truncation
- github/review-threads: separate old/new diff counts, word-boundary \bbug\b,
  tighten medium severity heuristics, remove try/catch in markThreadResolved
- har-analyzer/export: preserve browser/pages metadata in exported HAR
- har-analyzer/search: include request body in searchInBody
- har-analyzer/security: deduplicate patterns, fix base64 padding, use
  printFormatted for empty findings
- har-analyzer/show: guard bodySize -1, use printFormatted in expand
- har-analyzer/waterfall: NaN fallback for malformed dates
- har-analyzer/query-engine: bounds-check domain indices
- har-analyzer/session-manager: check file exists before reading
- har-analyzer/mcp: guard bodySize -1 in request body display
- utils/string: escape ? in matchGlob regex
- plans: reconcile OutputFormat type with existing md|json|toon
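The `matchGlob` fix above ("escape ? in matchGlob regex") can be sketched like this; `matchGlobSketch` is hypothetical and treats `?` as a literal character, which is what the fix implies:

```typescript
// Escape regex metacharacters (including "?") before translating "*" to
// ".*", so "file?.ts" matches only the literal string "file?.ts".
function matchGlobSketch(pattern: string, input: string): boolean {
  const escaped = pattern.replace(/[.+?^${}()|[\]\\]/g, "\\$&");
  const re = new RegExp(`^${escaped.replace(/\*/g, ".*")}$`);
  return re.test(input);
}
```

Without the `?` in the escape class, `file?.ts` would compile to `file?\.ts` and incorrectly match `fil.ts`.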
…/azure-devops-batches

Keep typed API responses, unified getWorkItems(), QueryParams export,
attachments download phase, and getWorkItemRevisions from both branches.
@genesiscz genesiscz merged commit fe87ad0 into master Feb 13, 2026
1 of 2 checks passed
@genesiscz genesiscz deleted the feat/azure-devops-batches branch February 13, 2026 02:46
genesiscz added a commit that referenced this pull request Feb 17, 2026
- Replace duplicate stripAnsi with import from utils/string (#18)
- Fix wrapToWidth ANSI-aware truncation (#1/#3/#22)
- Add await to recursive executeTool call (#9/#20)
- Guard cursor when filtered list is empty (#5/#19)
- Use basename() instead of split("/").pop() (#10/#16)
- Validate --width NaN input (#14)
- Re-display watch message after screen clear (#15)
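The ANSI-aware width fixes above hinge on removing escape codes before measuring text. A minimal sketch (the repo's `utils/string` helper is assumed to handle more sequence types than SGR):

```typescript
// Strip SGR color sequences (ESC [ ... m) so string length reflects
// visible width.
const ANSI_RE = /\x1b\[[0-9;]*m/g;

function stripAnsiSketch(s: string): string {
  return s.replace(ANSI_RE, "");
}

console.log(stripAnsiSketch("\x1b[31mred\x1b[0m").length); // 3
```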
genesiscz added a commit that referenced this pull request Feb 18, 2026
genesiscz added a commit that referenced this pull request Feb 18, 2026
* fix: clean error formatting and fuzzy match for tools entry point

* feat(tools): add discovery, introspection, and search-select modules

- discovery.ts: scans src/ for tools, extracts descriptions from README.md
- introspect.ts: runs --help on tools and parses Commander output into structured data
- search-select.ts: single-select search prompt with onHighlight callback, modeled on search-multiselect

* feat(tools): interactive browser with search, README preview, and subcommand explorer

* feat(markdown): add render options and CLI flags (--watch, --width, --theme, --no-color)

* fix: improve fuzzy matching to handle trailing dashes and segment matching

"tools cli-" now correctly matches "markdown-cli" by stripping trailing
dashes and checking dash-separated segments.

* feat(markdown-cli): add demo subcommand with interactive template gallery

* Revert "feat(markdown-cli): add demo subcommand with interactive template gallery"

This reverts commit f5c84db.

* feat(markdown): implement theme palette system (dark/light/minimal)

Wire up the --theme flag to actual color palettes that change
mermaid blocks, table borders, and alert colors per theme.

* fix: address PR #28 code review feedback

- Replace duplicate stripAnsi with import from utils/string (#18)
- Fix wrapToWidth ANSI-aware truncation (#1/#3/#22)
- Add await to recursive executeTool call (#9/#20)
- Guard cursor when filtered list is empty (#5/#19)
- Use basename() instead of split("/").pop() (#10/#16)
- Validate --width NaN input (#14)
- Re-display watch message after screen clear (#15)

* fix(markdown-cli/tools): address PR #28 review feedback

- Remove stray n8n.json (unrelated to PR scope)
- fix(markdown-cli): reject invalid --theme values via Commander .choices() (Thread #23)
- fix(tools): guard discoverTools against missing srcDir with existsSync check (Thread #24)
- fix(markdown): use display width for emoji/wide chars in wrapToWidth (Thread #25)
genesiscz added a commit that referenced this pull request Feb 19, 2026
- Use Promise.allSettled in showConfig to handle stale tokens (#14)
- Clamp pct in renderBar to prevent RangeError (#17)
- Clear firedThresholds on period reset in watch mode (#18)
- Wrap entire getKeychainCredentials in try/catch for non-macOS (#19)
- Escape sound parameter in AppleScript notifications (#20)
- Fix import.meta.dir off-by-one in update tool (#21)
- Use isCancelled() in withCancel() for light-mode cancel (#22)
- Remove 100% cap on projected usage
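The `renderBar` clamp above guards against `"#".repeat(n)` receiving a negative count, which throws a `RangeError`. A hedged sketch of the fix (bar width and characters are assumptions):

```typescript
// Clamp pct to [0, 100] before computing the filled width, so negative
// or >100% inputs render a valid bar instead of throwing.
function renderBarSketch(pct: number, width = 20): string {
  const clamped = Math.min(100, Math.max(0, pct));
  const filled = Math.round((clamped / 100) * width);
  return "#".repeat(filled) + "-".repeat(width - filled);
}
```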
genesiscz added a commit that referenced this pull request Feb 19, 2026