Merged
7 changes: 7 additions & 0 deletions .claude/settings.local.json
@@ -0,0 +1,7 @@
{
"permissions": {
"allow": ["Bash(tree:*)", "Bash(pnpm run:*)"],
"deny": [],
"ask": []
}
}
103 changes: 103 additions & 0 deletions CLAUDE.md
@@ -0,0 +1,103 @@
# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with
code in this repository.

## Commands

### Development Commands

- `pnpm lint` - Run ESLint on the source code
- `pnpm check-types` - Run TypeScript type checking without emitting files
- `pnpm check-format` - Check code formatting with Prettier
- `pnpm format` - Auto-format code with Prettier
- `pnpm build` - Build the library (runs tsup-node, tsc for declarations, and
tsc-alias)
- `pnpm clean` - Remove dist directory and build artifacts
- `pnpm prepare` - Full build cycle (clean, check-types, and build)

### Important Notes

- No test runner is currently configured (test script echoes "No test
specified")
- The project uses pnpm as the package manager
- Node.js version requirement: >=20

## Architecture Overview

This is a TypeScript library that provides typed abstractions for Firestore in
server environments (firebase-admin and firebase-functions). The library is
organized into three main areas:

### Core Modules

1. **Documents (`src/documents/`)** - Functions for single document operations

- CRUD operations: get, set, update, delete
- Transaction variants (suffixed with `Tx`)
- Support for both typed collections and specific document references
- Returns `FsMutableDocument<T>`, which combines the document data with typed
  update/delete methods

2. **Collections (`src/collections/`)** - Functions for collection queries and
batch processing

- Query functions: `getDocuments`, `getFirstDocument`
- Processing functions: `processDocuments`, `processDocumentsByChunk`
- Internal chunking for handling large collections (paginated fetching)
- Support for typed select statements that narrow both data and types

3. **Functions (`src/functions/`)** - Cloud Functions utilities
- Helpers to extract typed data from 2nd-gen Cloud Functions events
- Functions like `getDataOnWritten`, `getBeforeAndAfterOnUpdated`
- Exported separately as `@typed-firestore/server/functions` to make
firebase-admin optional
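The document shape described above can be illustrated with a small sketch. The
`FsMutableDocument<T>` interface below is a simplified stand-in based on the
fields listed in this file, not the library's actual declaration; the `ref`
field is omitted and `update` is stubbed so the sketch stays self-contained:

```typescript
/** Simplified stand-in for the library's FsMutableDocument<T>. */
interface FsMutableDocument<T> {
  id: string;
  data: T;
  update(partial: Partial<T>): Promise<void>;
  delete(): Promise<void>;
}

interface User {
  name: string;
  credits: number;
}

/** A stubbed document showing how the combined shape is consumed. */
const doc: FsMutableDocument<User> = {
  id: "user-1",
  data: { name: "Ada", credits: 3 },
  async update(partial) {
    /** Stub: a real document would write to Firestore here. */
    Object.assign(doc.data, partial);
  },
  async delete() {
    /** No-op in this sketch */
  },
};

/** The update body runs synchronously up to its first await. */
void doc.update({ credits: doc.data.credits + 1 });
console.log(doc.data.credits); // 4
```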

### Key Design Patterns

1. **Transaction Support**: All document operations have transaction variants
(suffix `Tx`) that work within Firebase transactions. Transaction functions
return the Transaction object for chaining.

2. **Type Narrowing with Select**: The library supports TypeScript type
narrowing when using select statements. Select must be defined separately
from the query to enable proper type inference.

3. **Chunked Processing**: Collection operations internally use pagination to
   handle unlimited documents with constant memory usage. A query limit of at
   most `MAX_QUERY_LIMIT` (1000) is executed as a single query; larger limits
   are fetched in chunks until the limit is reached.

4. **Mutable Documents**: The library returns `FsMutableDocument<T>` objects
that combine:

- `id`: Document ID
- `data`: Typed document data
- `ref`: Original Firestore reference
- `update()`/`updateWithPartial()`: Typed update methods
- `delete()`: Delete method

5. **Path Aliases**: The codebase uses `~` as a path alias for the src directory
(configured via tsc-alias).
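The chunked-processing pattern (3) can be sketched generically. `fetchPage`
below stands in for a Firestore query combined with `startAfter` and `limit`,
the cursor is simplified to a numeric offset, and the constants mirror the
names used in this repository (their values here are assumptions):

```typescript
/** Assumed values mirroring the repository's constants. */
const MAX_QUERY_LIMIT = 1000;
const DEFAULT_CHUNK_SIZE = 500;

/**
 * Fetches up to totalLimit items page by page, keeping at most one
 * chunk in memory per iteration. fetchPage stands in for a paginated
 * Firestore query.
 */
async function getAllChunked<T>(
  fetchPage: (offset: number, pageSize: number) => Promise<T[]>,
  totalLimit = Infinity,
  chunkSize = DEFAULT_CHUNK_SIZE
): Promise<T[]> {
  const results: T[] = [];
  for (;;) {
    const remaining = totalLimit - results.length;
    const pageSize = Math.min(remaining, chunkSize, MAX_QUERY_LIMIT);
    if (pageSize <= 0) break;
    const page = await fetchPage(results.length, pageSize);
    results.push(...page);
    /** A short page means the source is exhausted. */
    if (page.length < pageSize) break;
  }
  return results;
}
```

With a 1,200-item source and a limit of 1,100, this fetches three pages (500,
500, 100) and returns exactly 1,100 items; with no limit it drains the source.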

### Type System

The library provides strong typing throughout:

- Collection references are typed as `CollectionReference<T>`
- Documents are returned as `FsDocument<T>` or `FsMutableDocument<T>`
- Select statements narrow types to `Pick<T, K>`
- Transaction variants have different return types for proper chaining
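The `Pick<T, K>` narrowing can be shown in isolation. `pickFields` below is a
hypothetical stand-in for the library's select handling, not its actual
implementation; it only demonstrates how the narrowed return type removes
unselected fields from view:

```typescript
interface User {
  name: string;
  email: string;
  credits: number;
}

/** Hypothetical stand-in for select-based type narrowing. */
function pickFields<T, K extends keyof T>(data: T, keys: K[]): Pick<T, K> {
  const out = {} as Pick<T, K>;
  for (const key of keys) {
    out[key] = data[key];
  }
  return out;
}

const full: User = { name: "Ada", email: "ada@example.com", credits: 3 };
const narrowed = pickFields(full, ["name", "credits"]);

/** narrowed is Pick<User, "name" | "credits">; accessing .email is a type error. */
console.log(narrowed); // { name: "Ada", credits: 3 }
```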

### Build System

- Uses tsup for bundling JavaScript
- TypeScript compiler for declaration files
- tsc-alias for resolving path aliases in output
- Outputs both ESM modules and TypeScript declarations
- Separate exports for main library and functions submodule

### Code Conventions

- Comments are written in JSDoc style, unless they are inline comments with code
on the same line.
- Use `/** ... */` for both multi-line and single-line comments.
2 changes: 1 addition & 1 deletion package.json
@@ -1,6 +1,6 @@
{
"name": "@typed-firestore/server",
"version": "2.0.0",
"version": "2.1.0-0",
"description": "Elegant, typed abstractions for Firestore in server environments",
"repository": {
"type": "git",
15 changes: 14 additions & 1 deletion src/collections/get-documents.ts
@@ -9,7 +9,11 @@ import { makeMutableDocument, makeMutableDocumentTx } from "~/documents";
import type { FsMutableDocument, FsMutableDocumentTx } from "~/types";
import { invariant } from "~/utils";
import { MAX_QUERY_LIMIT } from "./constants";
import { buildQuery, getDocumentsChunked } from "./helpers";
import {
buildQuery,
getDocumentsChunked,
getDocumentsChunkedWithLimit,
} from "./helpers";
import type { QueryBuilder, SelectedDocument } from "./types";

export type GetDocumentsOptions<
@@ -35,6 +39,7 @@ export async function getDocuments<
);

if (disableChunking) {
/** For limits <= MAX_QUERY_LIMIT, use a single query (existing behavior) */
invariant(
limit && limit <= MAX_QUERY_LIMIT,
`Limit ${String(limit)} is greater than the maximum query limit of ${String(MAX_QUERY_LIMIT)}`
@@ -47,7 +52,15 @@
doc as QueryDocumentSnapshot<SelectedDocument<T, S>>
)
);
} else if (limit && limit > MAX_QUERY_LIMIT) {
/** For limits > MAX_QUERY_LIMIT, use chunking with the specified limit */
return getDocumentsChunkedWithLimit<SelectedDocument<T, S>, T>(
query,
limit,
options.chunkSize
);
} else {
/** No limit specified, get all documents using chunking */
return getDocumentsChunked<SelectedDocument<T, S>, T>(
query,
options.chunkSize
11 changes: 5 additions & 6 deletions src/collections/helpers/build-query.ts
@@ -17,17 +17,16 @@ export function buildQuery<T extends DocumentData>(
const queryInfo = queryFn ? getQueryInfo(queryFn(ref)) : {};
const { limit, select: querySelect } = queryInfo;

invariant(
!limit || limit <= MAX_QUERY_LIMIT,
`Limit ${String(limit)} is greater than the maximum query limit of ${String(MAX_QUERY_LIMIT)}`
);

invariant(
!querySelect,
"Select is not allowed to be set on the query. Use the options instead."
);

const disableChunking = isDefined(limit);
/**
* Disable chunking only for limits <= MAX_QUERY_LIMIT. For limits >
* MAX_QUERY_LIMIT, we'll use chunking
*/
const disableChunking = isDefined(limit) && limit <= MAX_QUERY_LIMIT;

const baseQuery = queryFn ? queryFn(ref) : ref;

53 changes: 53 additions & 0 deletions src/collections/helpers/get-documents-chunked-with-limit.ts
@@ -0,0 +1,53 @@
import type {
DocumentData,
Query,
QueryDocumentSnapshot,
} from "firebase-admin/firestore";
import type { FsMutableDocument } from "~/types";
import { verboseCount } from "~/utils";
import { DEFAULT_CHUNK_SIZE, MAX_QUERY_LIMIT } from "../constants";
import { getChunkOfDocuments } from "./get-chunk-of-documents";

/**
* Gets documents from a query with a specified limit, using pagination when the
* limit exceeds Firestore's maximum query limit of 1000.
*/
export async function getDocumentsChunkedWithLimit<
T extends DocumentData,
TFull extends DocumentData,
>(
query: Query,
totalLimit: number,
chunkSize = DEFAULT_CHUNK_SIZE
): Promise<FsMutableDocument<T, TFull>[]> {
const documents: FsMutableDocument<T, TFull>[] = [];
let startAfterSnapshot: QueryDocumentSnapshot<T> | undefined;
let remainingLimit = totalLimit;

do {
verboseCount("Fetching chunk");

const currentChunkSize = Math.min(
remainingLimit,
Math.min(MAX_QUERY_LIMIT, chunkSize)
);

const [chunk, lastSnapshot] = await getChunkOfDocuments<T, TFull>(
query,
startAfterSnapshot,
currentChunkSize
);

documents.push(...chunk);
remainingLimit -= chunk.length;
startAfterSnapshot = lastSnapshot;

/** Stop if we've reached the limit or there are no more documents */
if (remainingLimit <= 0 || !lastSnapshot) {
break;
}
} while (startAfterSnapshot);

/** Ensure we don't return more than the requested limit */
return documents.slice(0, totalLimit);
}
1 change: 1 addition & 0 deletions src/collections/helpers/index.ts
@@ -1,4 +1,5 @@
export * from "./build-query";
export * from "./get-all-documents-chunked";
export * from "./get-chunk-of-documents";
export * from "./get-documents-chunked-with-limit";
export * from "./get-query-info";
38 changes: 35 additions & 3 deletions src/collections/process-documents.ts
@@ -63,6 +63,7 @@ export async function processDocuments<
let errorCount = 0;

if (disableChunking) {
/** For limits <= MAX_QUERY_LIMIT, use a single query (existing behavior) */
invariant(
limit && limit <= MAX_QUERY_LIMIT,
`Limit ${String(limit)} is greater than the maximum query limit of ${String(MAX_QUERY_LIMIT)}`
@@ -85,18 +86,26 @@
{ throttleSeconds, chunkSize }
);
} else {
/** For limits > MAX_QUERY_LIMIT or no limit, use chunking */
let lastDocumentSnapshot:
| QueryDocumentSnapshot<SelectedDocument<T, S>>
| undefined;
let count = 0;
const hasLimit = limit && limit > MAX_QUERY_LIMIT;
const remainingLimit = hasLimit ? limit : undefined;

do {
verboseCount("Processing chunk");

/** Calculate effective chunk size based on remaining limit */
const effectiveChunkSize = remainingLimit
? Math.min(remainingLimit - count, Math.min(MAX_QUERY_LIMIT, chunkSize))
: Math.min(MAX_QUERY_LIMIT, chunkSize);

const [documents, _lastDocumentSnapshot] = await getChunkOfDocuments<
SelectedDocument<T, S>,
T
>(query, lastDocumentSnapshot, chunkSize);
>(query, lastDocumentSnapshot, effectiveChunkSize);

await processInChunks(
documents,
@@ -115,6 +124,11 @@

count += documents.length;
lastDocumentSnapshot = _lastDocumentSnapshot;

/** Stop if we've reached the limit */
if (remainingLimit && count >= remainingLimit) {
break;
}
} while (isDefined(lastDocumentSnapshot));

verboseLog(`Processed ${String(count)} documents`);
@@ -145,13 +159,18 @@ export async function processDocumentsByChunk<
) => Promise<unknown>,
options: ProcessDocumentsOptions<T, S> = {}
) {
const { query, disableChunking } = buildQuery(ref, queryFn, options.select);
const { query, disableChunking, limit } = buildQuery(
ref,
queryFn,
options.select
);

const { throttleSeconds = 0, chunkSize = DEFAULT_CHUNK_SIZE } = options;

const errors: string[] = [];

if (disableChunking) {
/** For limits <= MAX_QUERY_LIMIT, use a single query (existing behavior) */
const documents = await getDocuments(ref, queryFn, options);

try {
@@ -166,18 +185,26 @@
errors.push(getErrorMessage(err));
}
} else {
/** For limits > MAX_QUERY_LIMIT or no limit, use chunking */
let lastDocumentSnapshot:
| QueryDocumentSnapshot<SelectedDocument<T, S>>
| undefined;
let count = 0;
const hasLimit = limit && limit > MAX_QUERY_LIMIT;
const remainingLimit = hasLimit ? limit : undefined;

do {
verboseCount("Processing chunk");

/** Calculate effective chunk size based on remaining limit */
const effectiveChunkSize = remainingLimit
? Math.min(remainingLimit - count, Math.min(MAX_QUERY_LIMIT, chunkSize))
: Math.min(MAX_QUERY_LIMIT, chunkSize);

const [documents, _lastDocumentSnapshot] = await getChunkOfDocuments<
SelectedDocument<T, S>,
T
>(query, lastDocumentSnapshot, chunkSize);
>(query, lastDocumentSnapshot, effectiveChunkSize);

try {
await processInChunksByChunk(
@@ -193,6 +220,11 @@

count += documents.length;
lastDocumentSnapshot = _lastDocumentSnapshot;

/** Stop if we've reached the limit */
if (remainingLimit && count >= remainingLimit) {
break;
}
} while (isDefined(lastDocumentSnapshot));

verboseLog(`Processed ${String(count)} documents`);