Description
The main question of this issue is:
How can TanStack DB be used cleanly with Supabase?
I've been exploring a createCollection helper for Supabase to make it straightforward to build TanStack/db collections backed by Supabase tables. The goal is to replicate the Supabase table structure ergonomically and support filtering/sorting/limits, inserts/updates/deletes, and optionally simple joins. This could be a big win for Supabase users, but there are a lot of limitations right now, mostly around type inference and join handling. I'm opening this issue to discuss direction and alternatives.
Checklist
- I’ve validated against latest DB packages
What I built
- A generic wrapper function createSupabaseCollection that:
  - Adapts queryCollectionOptions to Supabase queries
  - Maps common filter operators (eq/neq/gt/gte/lt/lte/in/contains)
  - Applies sorts/limits
  - Supports transforms for insert/update payloads
  - Allows normalization of returned rows
  - Keeps the API close to TanStack/db's collection model
Usage example
import { createSupabaseCollection } from "./db/supabase-collection";
import { createClient } from "@/utils/supabase/client";
export const usersCollection = createSupabaseCollection("users", {
client: createClient,
});

Generic wrapper (draft): createSupabaseCollection
// file: db/supabase-collection.ts
import type { SupabaseClient } from "@supabase/supabase-js";
import {
parseLoadSubsetOptions,
queryCollectionOptions,
} from "@tanstack/query-db-collection";
import { type Collection, createCollection } from "@tanstack/react-db";
import { getBrowserQueryClient } from "@/lib/query-client-instance";
type GenericRelationship = {
foreignKeyName: string;
columns: string[];
isOneToOne?: boolean;
referencedRelation: string;
referencedColumns: string[];
};
type GenericTable = {
Row: Record<string, unknown>;
Insert: Record<string, unknown>;
Update: Record<string, unknown>;
Relationships: GenericRelationship[];
};
type GenericView = {
Row: Record<string, unknown>;
Insert?: Record<string, unknown>;
Update?: Record<string, unknown>;
Relationships: GenericRelationship[];
};
type GenericFunction = {
Args: Record<string, unknown>;
Returns: unknown;
};
type GenericSchema = {
Tables: Record<string, GenericTable>;
Views: Record<string, GenericView>;
Functions: Record<string, GenericFunction>;
};
type GenericDatabase = {
public: GenericSchema;
};
type LoadSubsetOptions = ReturnType<typeof parseLoadSubsetOptions>;
interface CreateSupabaseCollectionOptions<
TDB extends GenericDatabase,
TName extends keyof TDB["public"]["Tables"] & string,
TData,
TInsert,
TUpdate,
> {
client: () => SupabaseClient<TDB>;
select?: string;
idField?: keyof TData & string;
normalize?: (data: TDB["public"]["Tables"][TName]["Row"]) => TData;
transformInsert?: (data: TInsert) => TDB["public"]["Tables"][TName]["Insert"];
transformUpdate?: (data: TUpdate) => TDB["public"]["Tables"][TName]["Update"];
}
export const createSupabaseCollection = <
TDB extends GenericDatabase,
TName extends keyof TDB["public"]["Tables"] & string,
TData extends Record<string, unknown> = TDB["public"]["Tables"][TName]["Row"],
TInsert extends Record<string, unknown> = TDB["public"]["Tables"][TName]["Insert"],
TUpdate extends Record<string, unknown> = TDB["public"]["Tables"][TName]["Update"],
>(
tableName: TName,
options: CreateSupabaseCollectionOptions<TDB, TName, TData, TInsert, TUpdate>,
) => {
const {
client,
select = "*",
idField = "id" as keyof TData & string,
normalize = (d) => d as unknown as TData,
transformInsert = (d) =>
d as unknown as TDB["public"]["Tables"][TName]["Insert"],
transformUpdate = (d) =>
d as unknown as TDB["public"]["Tables"][TName]["Update"],
} = options;
return createCollection(
queryCollectionOptions<TData>({
queryKey: [tableName],
queryClient: getBrowserQueryClient(),
getKey: (item) => item[idField] as string | number,
syncMode: "on-demand",
queryFn: async (ctx) => {
const supabase = client();
const { filters, sorts, limit } = parseLoadSubsetOptions(
ctx.meta?.loadSubsetOptions as LoadSubsetOptions,
);
let query = supabase.from(tableName).select(select);
if (filters && filters.length > 0) {
filters.forEach((filter) => {
const { field, operator, value } = filter;
const fieldName = Array.isArray(field) ? field.join(".") : field;
switch (operator) {
case "eq":
query = query.eq(fieldName, value);
break;
case "neq":
query = query.neq(fieldName, value);
break;
case "gt":
query = query.gt(fieldName, value);
break;
case "gte":
query = query.gte(fieldName, value);
break;
case "lt":
query = query.lt(fieldName, value);
break;
case "lte":
query = query.lte(fieldName, value);
break;
case "in":
if (Array.isArray(value)) {
query = query.in(fieldName, value);
}
break;
case "contains":
query = query.ilike(fieldName, `%${value}%`);
break;
}
});
}
if (sorts && sorts.length > 0) {
sorts.forEach((sort) => {
const { field, direction } = sort;
const fieldName = Array.isArray(field) ? field.join(".") : field;
query = query.order(fieldName, { ascending: direction === "asc" });
});
}
if (limit) {
query = query.limit(limit);
}
const { data, error } = await query;
if (error) throw error;
return (data as unknown as TDB["public"]["Tables"][TName]["Row"][]).map(
normalize,
);
},
onInsert: async ({ transaction }) => {
const supabase = client();
let result: TData | undefined;
for (const mutation of transaction.mutations) {
const payload = transformInsert(
mutation.modified as unknown as TInsert,
);
const { data, error } = await supabase
.from(tableName)
.insert(payload as any)
.select(select)
.single();
if (error) throw error;
result = normalize(
data as unknown as TDB["public"]["Tables"][TName]["Row"],
);
}
return result;
},
onUpdate: async ({ transaction }) => {
const supabase = client();
let result: TData | undefined;
for (const mutation of transaction.mutations) {
const payload = transformUpdate(
mutation.changes as unknown as TUpdate,
);
if (Object.keys(payload as object).length === 0) continue;
const id =
(mutation as any).rowId || (mutation.modified as any)[idField];
if (!id) throw new Error(`Missing ${idField} for update`);
const { data, error } = await supabase
.from(tableName)
.update(payload as any)
.eq(idField, id)
.select(select)
.single();
if (error) throw error;
result = normalize(
data as unknown as TDB["public"]["Tables"][TName]["Row"],
);
}
return result;
},
onDelete: async ({ transaction }) => {
const supabase = client();
for (const mutation of transaction.mutations) {
const id =
(mutation as any).rowId || (mutation as any).original?.[idField];
if (id) {
const { error } = await supabase
.from(tableName)
.delete()
.eq(idField as string, id);
if (error) throw error;
}
}
},
}),
) as unknown as Collection<
TData,
string | number,
any,
any,
TInsert
>;
};

Advanced example (not sure of the approach): pages collection with joins and upsert behavior
// file: db/pages-collection.ts
import {
parseLoadSubsetOptions,
queryCollectionOptions,
} from "@tanstack/query-db-collection";
import { createCollection } from "@tanstack/react-db";
import { getBrowserQueryClient } from "@/lib/query-client-instance";
import { createClient } from "@/utils/supabase/client";
import type { Tables } from "@/utils/supabase/database.types";
type PageUser = Pick<Tables<"users">, "profile_slug">;
type PagePublic = Pick<Tables<"pages_public">, "id_page">;
export type Page = Partial<Tables<"pages">> & {
id: string;
created_by: string;
users?: PageUser | null;
pages_public?: PagePublic | null;
};
export const normalizePageRow = (page: any): Page => {
const users = Array.isArray(page.users) ? page.users[0] : page.users;
const pagesPublic = Array.isArray(page.pages_public)
? page.pages_public[0]
: page.pages_public;
return {
...page,
users: users ?? undefined,
pages_public: pagesPublic ?? undefined,
};
};
const performUpsert = async (input: Page) => {
const supabase = createClient();
const {
id,
created_by,
title,
content,
emoji,
deleted_at,
id_parent,
banner_url,
my_day_entry,
type,
} = input;
if (!id || !created_by) {
throw new Error("Missing id or created_by for page upsert");
}
const { data: existingPageById } = await supabase
.from("pages")
.select("id")
.eq("id", id)
.maybeSingle();
if (existingPageById) {
const updateData: any = {
updated_at: new Date().toISOString(),
};
if (title !== undefined) updateData.title = title;
if (content !== undefined) updateData.content = content;
if (emoji !== undefined) updateData.emoji = emoji;
if (deleted_at !== undefined) updateData.deleted_at = deleted_at;
if (id_parent !== undefined) updateData.id_parent = id_parent;
if (banner_url !== undefined) updateData.banner_url = banner_url;
if (my_day_entry !== undefined) updateData.my_day_entry = my_day_entry;
if (type !== undefined) updateData.type = type;
const { data, error } = await supabase
.from("pages")
.update(updateData)
.eq("id", id)
.select(
"*, users!pages_created_by_fkey(profile_slug), pages_public!left(id_page)",
)
.single();
if (error) throw error;
return data;
}
if (my_day_entry) {
const { data: existingMyDayPage } = await supabase
.from("pages")
.select("id")
.eq("my_day_entry", my_day_entry)
.eq("created_by", created_by)
.maybeSingle();
if (existingMyDayPage) {
const updateData: any = {
updated_at: new Date().toISOString(),
};
if (title !== undefined) updateData.title = title;
if (content !== undefined) updateData.content = content;
if (emoji !== undefined) updateData.emoji = emoji;
if (deleted_at !== undefined) updateData.deleted_at = deleted_at;
if (id_parent !== undefined) updateData.id_parent = id_parent;
if (banner_url !== undefined) updateData.banner_url = banner_url;
const { data, error } = await supabase
.from("pages")
.update(updateData)
.eq("id", existingMyDayPage.id)
.select(
"*, users!pages_created_by_fkey(profile_slug), pages_public!left(id_page)",
)
.single();
if (error) throw error;
return data;
}
}
const { data, error } = await supabase
.from("pages")
.insert({
id,
created_by,
created_at: new Date().toISOString(),
updated_at: new Date().toISOString(),
title: title || "New page",
content: content ?? null,
id_parent: id_parent ?? null,
type: type || "page",
banner_url: banner_url ?? null,
emoji: emoji ?? null,
my_day_entry: my_day_entry ?? null,
deleted_at: deleted_at ?? null,
})
.select(
"*, users!pages_created_by_fkey(profile_slug), pages_public!left(id_page)",
)
.single();
if (error) throw error;
return data;
};
export const pagesCollectionOptions = queryCollectionOptions<Page, unknown>({
queryKey: ["pages"],
queryClient: getBrowserQueryClient(),
getKey: (item) => item.id,
queryFn: async (ctx) => {
const supabase = createClient();
const { filters, sorts, limit } = parseLoadSubsetOptions(
ctx.meta?.loadSubsetOptions as any,
);
const isDetailQuery = filters?.some(
(f) => f.field.includes("id") && f.operator === "eq",
);
const isUserQuery = filters?.some((f) =>
f.field.some((part) => part === "users" || part === "profile_slug"),
);
const defaultSelect =
"id, title, emoji, updated_at, created_at, created_by, type, id_parent, deleted_at, my_day_entry, banner_url, pages_public(id_page)";
const detailSelect =
"*, users!pages_created_by_fkey(profile_slug), pages_public(id_page)";
const selectQuery =
isDetailQuery || isUserQuery ? detailSelect : defaultSelect;
let query = supabase.from("pages").select(selectQuery);
if (filters) {
filters.forEach((filter) => {
const { field, operator, value } = filter;
const rawFieldName = field.join(".");
let fieldName = rawFieldName.startsWith("pages.")
? rawFieldName.replace("pages.", "")
: rawFieldName;
if (fieldName.startsWith("users.")) {
fieldName = fieldName.replace(
"users.",
"users!pages_created_by_fkey.",
);
}
switch (operator) {
case "eq":
query = query.eq(fieldName, value);
break;
case "neq":
query = query.neq(fieldName, value);
break;
case "gt":
query = query.gt(fieldName, value);
break;
case "gte":
query = query.gte(fieldName, value);
break;
case "lt":
query = query.lt(fieldName, value);
break;
case "lte":
query = query.lte(fieldName, value);
break;
case "in":
if (Array.isArray(value)) {
query = query.in(fieldName, value);
}
break;
}
});
}
if (sorts && sorts.length > 0) {
sorts.forEach((sort) => {
const { field, direction } = sort;
const rawFieldName = field.join(".");
const sortField = rawFieldName.startsWith("pages.")
? rawFieldName.replace("pages.", "")
: rawFieldName;
query = query.order(sortField, {
ascending: direction === "asc",
});
});
} else {
query = query.order("updated_at", { ascending: false });
}
if (limit) {
query = query.limit(limit);
}
const { data, error } = await query;
if (error) {
console.error("Error fetching pages:", JSON.stringify(error, null, 2));
console.error("Query details:", {
filters,
sorts,
limit,
selectQuery,
});
throw error;
}
const rows = (data ?? []) as any[];
return rows.map((row) => normalizePageRow(row));
},
syncMode: "on-demand",
onInsert: async ({ transaction }) => {
let result: Tables<"pages"> | undefined;
const supabase = createClient();
for (const mutation of transaction.mutations) {
result = await performUpsert(mutation.modified);
if (mutation.modified.pages_public && result) {
await supabase
.from("pages_public")
.upsert({ id_page: result.id }, { onConflict: "id_page" });
(result as any).pages_public = mutation.modified.pages_public;
}
}
return result ? normalizePageRow(result) : undefined;
},
onUpdate: async ({ transaction }) => {
let result: Tables<"pages"> | undefined;
const supabase = createClient();
for (const mutation of transaction.mutations) {
if ("pages_public" in mutation.changes) {
const isPublic = !!mutation.modified.pages_public;
if (isPublic) {
await supabase
.from("pages_public")
.upsert(
{ id_page: mutation.modified.id },
{ onConflict: "id_page" },
);
} else {
await supabase
.from("pages_public")
.delete()
.eq("id_page", mutation.modified.id);
}
}
result = await performUpsert(mutation.modified);
}
return result ? normalizePageRow(result) : undefined;
},
onDelete: async ({ transaction }) => {
const supabase = createClient();
for (const mutation of transaction.mutations) {
const id = (mutation as any).rowId || (mutation as any).original?.id;
if (id) {
await supabase.from("pages").delete().eq("id", id);
}
}
},
});
export const pagesCollection = createCollection(pagesCollectionOptions);

Why this matters
- Supabase users could define collections per table rapidly, then layer TanStack/db’s live queries, pagination, and mutation transactions on top.
- If we solve types + joins well, it becomes a near drop-in for many Supabase-backed apps.
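To make the "define collections per table rapidly" claim concrete, here is a hedged sketch of how the wrapper's normalize/transformInsert hooks could expose camelCase app types over snake_case rows. The UserRow/User shapes and function names below are illustrative, not from the generated Supabase types:

```typescript
// Illustrative row/app shapes; a real project would use the Database types
// produced by `supabase gen types`.
interface UserRow { id: string; full_name: string | null; created_at: string }
interface User { id: string; fullName: string; createdAt: Date }

// Row -> app type, suitable for the wrapper's `normalize` option.
const normalizeUser = (row: UserRow): User => ({
  id: row.id,
  fullName: row.full_name ?? "",
  createdAt: new Date(row.created_at),
});

// App type -> insert payload, suitable for `transformInsert`.
const toUserInsert = (u: User): Pick<UserRow, "id" | "full_name"> => ({
  id: u.id,
  full_name: u.fullName || null,
});

// Wiring (not executed here; requires a Supabase client):
// export const usersCollection = createSupabaseCollection("users", {
//   client: createClient,
//   normalize: normalizeUser,
//   transformInsert: toUserInsert,
// });
```

Keeping these as plain pure functions also sidesteps some of the generic-inference problems described below, since each collection pins its own concrete types.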
Current limitations and pain points
- Type inference
  - Supabase's generated types (or hand-rolled generics) don't flow nicely through the collection wrapper. There's a lot of casting to any or unknown to bridge gaps.
  - Insert/Update payload types frequently diverge from Row types; mapping is manual via transformInsert/transformUpdate.
  - Relation typing is particularly awkward: joins often come back as arrays (due to 1:n) or single objects (1:1) and need normalization; typing this cleanly is tough without schema-aware helpers.
- Join queries
  - Supabase's select() with foreign key aliases (e.g. users!pages_created_by_fkey(profile_slug)) is powerful but brittle in generic code.
  - Mapping filter fields onto relation aliases (e.g. converting users.profile_slug -> users!pages_created_by_fkey.profile_slug) is custom per table and hard to generalize.
  - Left join + where semantics require care to avoid filtering out base rows when the right side is missing. Expressing this generically via the client API is non-trivial.
  - Progressive collections and live queries with joins make delete/ready semantics tricky.
- Operator coverage
  - Only basic operators are implemented. Adding is, isNull, overlaps, full-text search, etc. would require a richer operator-to-SQL mapping.
Environment
- Next.js + React 19 (+ Compiler)
- TanStack/db
- supabase-js
- TypeScript
- Tailwind CSS v4
- Biome
I’m very open to another approach so long as it lets us create table collections that replicate Supabase structure and handle joins robustly. Happy to iterate and contribute.
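One direction for generalizing the join handling: instead of hard-coding the alias rewrites per table (as pagesCollectionOptions does with its `users.` prefix replacement), each collection could declare its relation aliases once and share a single rewrite helper. This is a hypothetical sketch; RelationAliases and rewriteFilterField are illustrative names, not existing APIs:

```typescript
// Relation name -> foreign-key-qualified alias used in select()/filters.
type RelationAliases = Record<string, string>;

// Rewrites a dotted filter path from parseLoadSubsetOptions so that the
// first segment uses the declared FK alias, e.g.
// ["users", "profile_slug"] -> "users!pages_created_by_fkey.profile_slug".
const rewriteFilterField = (
  field: string[],
  baseTable: string,
  aliases: RelationAliases,
): string => {
  // Drop a redundant leading base-table segment ("pages.title" -> "title").
  const parts = field[0] === baseTable ? field.slice(1) : field;
  if (parts.length > 1 && aliases[parts[0]]) {
    return [aliases[parts[0]], ...parts.slice(1)].join(".");
  }
  return parts.join(".");
};
```

A collection would then pass something like `{ users: "users!pages_created_by_fkey" }` once, and the same map could drive both the select() string and the filter rewrites, removing one of the per-table custom pieces.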