Merged
2 changes: 1 addition & 1 deletion .github/workflows/ci.yml
Original file line number Diff line number Diff line change
@@ -21,7 +21,7 @@ jobs:
      - name: Install pnpm
        uses: pnpm/action-setup@v4
        with:
-         version: 8
+         version: 9

      - name: Use Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v4
52 changes: 46 additions & 6 deletions CHANGELOG.md
@@ -13,21 +13,61 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

## [2.2.3] - 2026-03-29

-### Changed
+### Added

-- No changes yet.
- **Pipeline** (`bytekit/pipeline`): Typed functional data-pipeline builder with lazy evaluation.
- `Pipeline<TIn, TOut>` class — immutable `.pipe(op)` builder, lazy `.process(input)` executor.
- `pipe()` factory with 7 typed overloads (1-op to 7-op) plus variadic escape hatch.
- `map()`, `filter()`, `reduce()` operator factories with full JSDoc and `@example` blocks.
- `map` / `filter`: concurrent via `Promise.all` (order preserved).
- `reduce`: sequential for deterministic accumulation.
- New `bytekit/pipeline` package export entry.
- `ApiClient.RequestOptions` extended with optional `pipeline` field (non-breaking).
- 20 new tests — 100% statement/branch/function/line coverage on `pipeline.ts`.
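
The semantics listed above can be illustrated with a tiny stand-alone re-implementation (hypothetical sketch, not the bytekit source; the real `pipe()` overloads and operator types are richer):

```typescript
// Hypothetical re-implementation of the documented Pipeline semantics:
// each operator is an async transform, `pipe()` composes them, and
// nothing executes until `.process()` is called.
type PipelineOp<TIn, TOut> = (input: TIn) => Promise<TOut>;

class Pipeline<TIn, TOut> {
  constructor(private readonly ops: PipelineOp<any, any>[]) {}

  // Immutable builder: returns a new Pipeline, never mutates `ops`.
  pipe<TNext>(op: PipelineOp<TOut, TNext>): Pipeline<TIn, TNext> {
    return new Pipeline<TIn, TNext>([...this.ops, op]);
  }

  // Lazy executor: operators run only when `process()` is invoked.
  async process(input: TIn): Promise<TOut> {
    let value: unknown = input;
    for (const op of this.ops) value = await op(value);
    return value as TOut;
  }
}

function pipe<T, A>(op1: PipelineOp<T, A>): Pipeline<T, A> {
  return new Pipeline<T, A>([op1]);
}

// map: concurrent via Promise.all, input order preserved in the output.
const map =
  <T, U>(fn: (item: T) => U | Promise<U>): PipelineOp<T[], U[]> =>
  (items) => Promise.all(items.map(fn));

// filter: predicates evaluated concurrently, order preserved.
const filter =
  <T>(pred: (item: T) => boolean | Promise<boolean>): PipelineOp<T[], T[]> =>
  async (items) => {
    const keep = await Promise.all(items.map(pred));
    return items.filter((_, i) => keep[i]);
  };

async function demo(): Promise<number[]> {
  const doubledEvens = pipe(filter((n: number) => n % 2 === 0)).pipe(
    map((n: number) => n * 2)
  );
  return doubledEvens.process([1, 2, 3, 4]); // keeps 2 and 4, then doubles
}
```

The key design point mirrored here is that `.pipe()` is pure: reusing a base pipeline and branching it never affects the original.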

## [2.2.2] - 2026-03-29

-### Changed
+### Added

-- No changes yet.
- **WebSocketHelper** — advanced reconnection and validation features:
- Configurable back-off strategies: `"linear"`, `"exponential"`, or a custom `(attempt) => number` function.
- Full Jitter option for exponential back-off (`jitter: true`).
- `maxReconnectDelayMs` cap for exponential delays.
- Per-message-type schema validation via `schemas: Record<string, SchemaAdapter>` — invalid messages are dropped and `onValidationError` fires.
- Pong / heartbeat timeout detection: forces reconnect if no message arrives within `heartbeatTimeoutMs` after a ping.
- New event handlers: `onReconnect()`, `onMaxRetriesReached()`, `onValidationError()`.
- 15 new tests (30 total for WebSocketHelper).
- **RequestQueue** (`bytekit/async`): Priority-aware concurrency-limited task queue.
- Three fixed priority lanes: `high > normal > low`.
- `AbortSignal`-based task cancellation.
- `onError` callback for per-task error isolation.
- `size`, `running`, `pending` observable state getters.
- `flush()` waiter that resolves when all queued tasks settle.
- **RequestBatcher** (`bytekit/async`): Time-window HTTP request deduplication.
- Fixed and sliding window modes.
- `maxSize` early-flush trigger.
- Custom `keyFn` override for request identity.
- Shared result delivery to all same-key callers within a window.
- `ApiClient` extended with `queue?: RequestQueueOptions` and `batch?: BatchOptions` (non-breaking; legacy `pool` option continues to work).
- 47 new tests (request-queue, request-batcher, ApiClient integration).
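
A minimal sketch of the delay computation these back-off options describe, assuming a `baseDelayMs` option name that is not documented above, alongside the documented `maxReconnectDelayMs` and `jitter`:

```typescript
// Illustrative sketch only; option names other than maxReconnectDelayMs
// and jitter (e.g. baseDelayMs) are assumptions, not the real API.
interface BackoffOptions {
  strategy: "linear" | "exponential" | ((attempt: number) => number);
  baseDelayMs: number; // assumed name for the base delay
  maxReconnectDelayMs: number; // cap applied to exponential delays
  jitter?: boolean; // full jitter: pick uniformly in [0, delay]
}

function reconnectDelay(attempt: number, opts: BackoffOptions): number {
  // Custom strategy wins outright: the user computes the delay.
  if (typeof opts.strategy === "function") return opts.strategy(attempt);

  const raw =
    opts.strategy === "linear"
      ? opts.baseDelayMs * attempt // 1x, 2x, 3x, ...
      : opts.baseDelayMs * 2 ** (attempt - 1); // 1x, 2x, 4x, ...

  const capped = Math.min(raw, opts.maxReconnectDelayMs);
  // Full jitter: uniform random in [0, capped] to de-synchronize clients.
  return opts.jitter ? Math.random() * capped : capped;
}
```

Full jitter trades predictable delays for better load spreading when many clients reconnect at once, which is why it is an opt-in flag rather than the default.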

## [2.2.1] - 2026-03-28

-### Changed
+### Added

-- No changes yet.
- **FileUploadHelper** — resumable and concurrent chunked uploads:
- `resumeFrom?: number` option: 0-based chunk index to start from, skipping all prior chunks. Pass `uploadedChunks` from a previous failed response to resume without re-uploading.
- `concurrency?: number` option: upload up to N chunks in parallel (windowed `Promise.all` batching). Defaults to `1` (sequential, fully backward-compatible).
- `UploadResponse.uploadedChunks`: absolute count of chunks successfully sent; safe to use as `resumeFrom` on retry.
- `UploadResponse.totalChunks`: total chunk count for the file at the given `chunkSize`.
- Edge-case clamping: `chunkSize ≤ 0` → 5 MB default; `concurrency < 1` → 1; `resumeFrom < 0` → 0; `resumeFrom ≥ totalChunks` → immediate success with zero fetch calls.
- `onProgress` baseline pre-initialized from skipped chunks so `percentage` reflects total-file progress when resuming.
- 12 new tests (21 total for FileUploadHelper).
- **PromisePool** (`bytekit/async`): Concurrency-limited async task pool with configurable limits and timeout handling.
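
The clamping rules and resume arithmetic above can be sketched as a small planning helper (illustrative only; `planUpload` and its return shape are hypothetical, not the FileUploadHelper API):

```typescript
// Sketch of the documented edge-case clamping and chunk arithmetic.
const DEFAULT_CHUNK_SIZE = 5 * 1024 * 1024; // 5 MB fallback for chunkSize <= 0

function planUpload(
  fileSize: number,
  chunkSize: number,
  resumeFrom: number,
  concurrency: number
) {
  const size = chunkSize > 0 ? chunkSize : DEFAULT_CHUNK_SIZE;
  const totalChunks = Math.ceil(fileSize / size);
  const start = Math.max(0, resumeFrom); // resumeFrom < 0  -> 0
  const workers = Math.max(1, concurrency); // concurrency < 1 -> 1
  const remaining = Math.max(0, totalChunks - start); // start >= totalChunks -> nothing to send
  // onProgress baseline: skipped chunks already count toward percentage.
  const baselinePct =
    totalChunks === 0 ? 100 : (Math.min(start, totalChunks) / totalChunks) * 100;
  return { totalChunks, start, workers, remaining, baselinePct };
}
```

With this arithmetic, passing `uploadedChunks` from a failed response as `resumeFrom` naturally yields `remaining === 0` when the upload had in fact finished, matching the documented immediate-success behavior.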

### Fixed

- **ApiClient**: `post()`, `put()`, and `patch()` methods now accept `RequestBody` type for `bodyOrOptions`, resolving a TypeScript type narrowing regression.

## [2.1.3] - 2026-03-20

75 changes: 75 additions & 0 deletions specs/008-fix-ci-changelog/spec.md
@@ -0,0 +1,75 @@
# Feature Specification: Fix CI format:check failures and generate CHANGELOG v2.2.0→v2.2.3

**Feature Branch**: `008-fix-ci-changelog`
**Created**: 2026-03-29
**Status**: Draft
**Input**: Fix CI workflow failures (format:check) and write CHANGELOG entries from v2.2.0 to v2.2.3

## User Scenarios & Testing *(mandatory)*

### User Story 1 - Make CI green on every PR (Priority: P1)

As a developer, I want every PR to pass CI without needing to manually run prettier
before pushing, so I stop losing time investigating "red CI" that has nothing to do
with my actual code change.

**Why this priority**: CI is failing on every PR due to format:check reporting 19 files
with prettier violations. Nothing else in CI is broken (typecheck passes, lint is warnings
only, tests pass). This blocks contributors and wastes review time.

**Independent Test**: Run `pnpm run format:check` locally — must exit 0 with no warnings.
Run `pnpm run lint` — must exit 0. Run `pnpm test` — must pass. All three pass in CI.

**Acceptance Scenarios**:

1. **Given** the current codebase, **When** `pnpm run format:check` runs, **Then** it exits 0
with the message "All matched files use Prettier code style!"
2. **Given** a developer pushes a branch, **When** CI runs, **Then** the `Lint`,
`Format check`, `Type check`, `Build`, and `Test` steps all pass (green).

---

### User Story 2 - Readable CHANGELOG for v2.2.1, v2.2.2, v2.2.3 (Priority: P2)

As a library consumer or contributor, I want to read what changed in each release,
so I can decide whether to upgrade and understand what new APIs are available.

**Why this priority**: CHANGELOG.md currently has "No changes yet." for v2.2.1–v2.2.3
even though three significant features shipped. This makes it impossible to audit
what changed without reading raw git log.

**Independent Test**: Open CHANGELOG.md — each of v2.2.1, v2.2.2, v2.2.3 must have
at least one `### Added` or `### Changed` section with a non-placeholder entry.

**Acceptance Scenarios**:

1. **Given** CHANGELOG.md, **When** reading the v2.2.3 section, **Then** it documents
the Pipeline feature (pipe(), map(), filter(), reduce(), Pipeline class, bytekit/pipeline export).
2. **Given** CHANGELOG.md, **When** reading the v2.2.2 section, **Then** it documents
WebSocket advanced features (backoff strategies, schema validation, pong detection)
and RequestQueue/RequestBatcher (batching system, priority lanes, concurrency).
3. **Given** CHANGELOG.md, **When** reading the v2.2.1 section, **Then** it documents
FileUploadHelper resume support (resumeFrom, concurrency, uploadedChunks/totalChunks)
and the ApiClient RequestBody type fix.

---

## Technical Requirements

### CI Fix

- Run `pnpm run format` to auto-fix all 19 files with prettier violations
- Verify `pnpm run format:check` exits 0 after fix
- Note: ci.yml `build` job uses pnpm v8 throughout; `coverage` and `security` jobs
use v9. This version mismatch is a secondary CI risk — standardize to v9.

### CHANGELOG

- Format: Keep a Changelog (https://keepachangelog.com/en/1.0.0/)
- Sections per release: Added / Changed / Fixed (only non-empty sections)
- Content sourced from PR merge commit bodies (#15–#18)

## Success Metrics

- `pnpm run format:check` exits 0 locally and in CI
- CHANGELOG entries are accurate and non-empty for v2.2.1–v2.2.3
21 changes: 11 additions & 10 deletions src/cli/ddd-boilerplate.ts
@@ -93,10 +93,7 @@ function inferHttpVerb(action: string): HttpVerb {
return "GET";
}

function resolveActionConfig(
action: string,
pascal: string
): ActionConfig {
function resolveActionConfig(action: string, pascal: string): ActionConfig {
const verb = inferHttpVerb(action);
const isList = /all|many|list|search/.test(action.toLowerCase());

@@ -274,7 +271,10 @@ function generateHttpRepoSource(
const needsProps = configs.some((c) =>
c.params.some((p) => p.type.includes("Props"))
);
const entityImportParts = [`${pascal}Entity`, needsProps ? `${pascal}EntityProps` : ""]
const entityImportParts = [
`${pascal}Entity`,
needsProps ? `${pascal}EntityProps` : "",
]
.filter(Boolean)
.join(", ");

@@ -343,10 +343,7 @@ export async function generateDddBoilerplate(
);
}

const rootDir = path.resolve(
process.cwd(),
options.outDir?.trim() || slug
);
const rootDir = path.resolve(process.cwd(), options.outDir?.trim() || slug);

const contextPascal = pascalFromKebabSlug(slug);
const outboundSlug = slugifyDomain(portLabel);
@@ -434,7 +431,11 @@ export interface ${outboundPascal} {
"entities",
`${slug}.entity.ts`
);
await fs.writeFile(entityFile, generateEntitySource(slug, contextPascal), "utf8");
await fs.writeFile(
entityFile,
generateEntitySource(slug, contextPascal),
"utf8"
);

// 2. Repository interface
const repoInterfaceFile = path.join(
5 changes: 4 additions & 1 deletion src/cli/index.ts
@@ -46,7 +46,10 @@ function parseDddArgs(argv: string[]): DddCliArgs | null {
outDir = arg.slice("--out=".length);
} else if (arg.startsWith("--actions=")) {
const raw = arg.slice("--actions=".length);
actions = raw.split(",").map((a) => a.trim()).filter(Boolean);
actions = raw
.split(",")
.map((a) => a.trim())
.filter(Boolean);
}
}

5 changes: 1 addition & 4 deletions src/utils/async/index.ts
@@ -37,10 +37,7 @@ export { debounceAsync } from "./debounce.js";
export { throttleAsync } from "./throttle.js";
export { PromisePool, PoolTimeoutError } from "./promise-pool.js";
export type { PromisePoolOptions } from "./promise-pool.js";
export {
RequestQueue,
QueueAbortError,
} from "./request-queue.js";
export { RequestQueue, QueueAbortError } from "./request-queue.js";
export type {
QueuePriority,
RequestQueueOptions,
4 changes: 1 addition & 3 deletions src/utils/async/pipeline.ts
@@ -101,9 +101,7 @@ export class Pipeline<TIn, TOut> {
* );
* // Inferred: Pipeline<number[], number>
*/
export function pipe<T, A>(
op1: PipelineOp<T, A>
): Pipeline<T, A>;
export function pipe<T, A>(op1: PipelineOp<T, A>): Pipeline<T, A>;

export function pipe<T, A, B>(
op1: PipelineOp<T, A>,
16 changes: 13 additions & 3 deletions src/utils/async/promise-pool.ts
@@ -84,7 +84,9 @@ export class PromisePool {
}

const results: T[] = new Array(tasks.length);
await Promise.all(tasks.map((task, index) => this.addTask(task, index, results)));
await Promise.all(
tasks.map((task, index) => this.addTask(task, index, results))
);
return results;
}

@@ -105,15 +107,23 @@
}

private processQueue(results: unknown[]): void {
while (this.running < this.options.concurrency && this.queue.length > 0) {
while (
this.running < this.options.concurrency &&
this.queue.length > 0
) {
const item = this.queue.shift()!;
this.running++;
this.executeTask(item, results);
}
}

private async executeTask(
item: { task: () => Promise<unknown>; resolve: (v: unknown) => void; reject: (r: unknown) => void; index: number },
item: {
task: () => Promise<unknown>;
resolve: (v: unknown) => void;
reject: (r: unknown) => void;
index: number;
},
results: unknown[]
): Promise<void> {
try {
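
The `processQueue` loop above starts tasks while `running < concurrency` and refills as each task settles. A stand-alone sketch of that pattern (simplified; no timeout handling or `PoolTimeoutError`, unlike the real `PromisePool`):

```typescript
// Minimal concurrency-limited pool: results keep input order by index,
// and at most `concurrency` tasks are in flight at any moment.
async function runPool<T>(
  tasks: Array<() => Promise<T>>,
  concurrency: number
): Promise<T[]> {
  const results: T[] = new Array(tasks.length);
  let next = 0;

  async function worker(): Promise<void> {
    while (next < tasks.length) {
      const index = next++; // claim a slot synchronously, before any await
      results[index] = await tasks[index]();
    }
  }

  // Spawn at most `concurrency` workers; each drains the shared queue.
  const workers = Math.max(1, Math.min(concurrency, tasks.length));
  await Promise.all(Array.from({ length: workers }, worker));
  return results;
}
```

Claiming the index before awaiting is the essential detail: it prevents two workers from racing onto the same task between await points.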
30 changes: 20 additions & 10 deletions src/utils/core/ApiClient.ts
@@ -1,4 +1,3 @@

import { Logger } from "#core/Logger.js";
import { UrlHelper } from "#helpers/UrlHelper.js";
import { retry as retryFn } from "../async/retry.js";
@@ -330,23 +329,32 @@ export class ApiClient {
* headers: { "X-Custom": "value" }
* })
*/
async post<T>(path: string | URL, bodyOrOptions?: RequestOptions<T> | RequestBody) {
async post<T>(
path: string | URL,
bodyOrOptions?: RequestOptions<T> | RequestBody
) {
const options = this.normalizeBodyOrOptions<T>(bodyOrOptions);
return this.request<T>(path, { ...options, method: "POST" });
}

/**
* PUT request - Accepts a body directly or RequestOptions
*/
async put<T>(path: string | URL, bodyOrOptions?: RequestOptions<T> | RequestBody) {
async put<T>(
path: string | URL,
bodyOrOptions?: RequestOptions<T> | RequestBody
) {
const options = this.normalizeBodyOrOptions<T>(bodyOrOptions);
return this.request<T>(path, { ...options, method: "PUT" });
}

/**
* PATCH request - Accepts a body directly or RequestOptions
*/
async patch<T>(path: string | URL, bodyOrOptions?: RequestOptions<T> | RequestBody) {
async patch<T>(
path: string | URL,
bodyOrOptions?: RequestOptions<T> | RequestBody
) {
const options = this.normalizeBodyOrOptions<T>(bodyOrOptions);
return this.request<T>(path, { ...options, method: "PATCH" });
}
@@ -571,24 +579,26 @@
): Promise<T> {
// US4: RequestQueue — concurrency-limited, priority-aware queue
if (this._queue) {
return this._queue.add<T>((_signal) => this.executeRequest(path, options));
return this._queue.add<T>((_signal) =>
this.executeRequest(path, options)
);
}

// US4: RequestBatcher — time-window deduplication
if (this._batcher) {
const pathStr = String(path);
const method = (options.method ?? "GET").toUpperCase();
const proxyInit: RequestInit = { method };
return this._batcher.add<T>(
pathStr,
proxyInit,
(_url, _init) => this.executeRequest(path, options)
return this._batcher.add<T>(pathStr, proxyInit, (_url, _init) =>
this.executeRequest(path, options)
);
}

// T026: legacy pool support (003) — if a pool is configured, route through it
if (this.pool) {
const results = await this.pool.run<T>([() => this.executeRequest(path, options)]);
const results = await this.pool.run<T>([
() => this.executeRequest(path, options),
]);
return results[0];
}
return this.executeRequest(path, options);
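
The `request()` path above routes each call through exactly one mechanism: queue first, then batcher, then the legacy pool, then direct execution. A distilled sketch of that precedence (the `route` helper and its collaborator shapes are hypothetical stand-ins for the real queue, batcher, and pool objects):

```typescript
type Exec<T> = () => Promise<T>;

// Assumed minimal shapes; the real RequestQueue/RequestBatcher/PromisePool
// APIs are richer (priorities, windows, timeouts).
interface Routing {
  queue?: { add<T>(task: Exec<T>): Promise<T> };
  batcher?: { add<T>(key: string, task: Exec<T>): Promise<T> };
  pool?: { run<T>(tasks: Exec<T>[]): Promise<T[]> };
}

function route<T>(exec: Exec<T>, opts: Routing, key = "GET /"): Promise<T> {
  if (opts.queue) return opts.queue.add(exec); // concurrency + priority
  if (opts.batcher) return opts.batcher.add(key, exec); // time-window dedupe
  if (opts.pool) return opts.pool.run([exec]).then((r) => r[0]); // legacy pool
  return exec(); // default: direct execution
}
```

The first-match ordering is what keeps the `queue`, `batch`, and legacy `pool` options non-breaking: configuring none of them leaves the original direct path untouched.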