refactor: migrate from Prisma to Drizzle ORM #62
Conversation
Migrating off Prisma resolves the following issues:
- `prisma generate` at postinstall is no longer required
- Conflicts with the Prisma version in the user's environment are eliminated
- The dependency on platform-specific native binaries is removed
- Package size shrinks from ~400KB+ to 23.8kB

Changes:
- Implement DrizzleMemberRepository / DrizzleEventRepository
- Define seven tables and their relations in schema.ts
- Implement a singleton pool in client.ts
- Remove all Prisma-related files and dependencies

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Use InferSelectModel for schema-derived types
- Add explicit type definitions with JSDoc comments
- Extract reusable query configurations
- Refactor methods for better organization

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Remove all `as` type casts from repositories
- Use $inferSelect types that match Drizzle's query results
- Let TypeScript infer types from the actual query structure

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Delete the prisma folder (schema and migrations)
- Delete docs/migration.md (Prisma-specific docs)
- Update README.md to use Drizzle commands
- Regenerate package-lock.json without Prisma

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Generate the schema from the production DB with `drizzle-kit introspect`
- Change column types to match production (e.g. uuid → text, TIMESTAMP(3))
- Rename discordAccounts id → discordId
- Add conversion logic, since the timestamp columns come back as strings
- Restore the Prisma migration history for reference

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Generate migration files that match the production DB schema.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Pull request overview
This pull request migrates the database layer from Prisma ORM to Drizzle ORM to address npm package distribution issues with Prisma's postinstall hook requirement.
Changes:
- Replaced Prisma Client with Drizzle ORM using node-postgres driver
- Introspected schema from production database to maintain type safety
- Updated repository implementations for Member and Event domain entities
- Modified package scripts and removed postinstall hook
- Updated documentation to reflect new migration commands
Reviewed changes
Copilot reviewed 19 out of 20 changed files in this pull request and generated 11 comments.
Show a summary per file
| File | Description |
|---|---|
| src/infrastructure/drizzle/schema.ts | New Drizzle schema introspected from production database with table definitions and relations |
| src/infrastructure/drizzle/client.ts | Database connection management with singleton Pool pattern |
| src/infrastructure/drizzle/DrizzleMemberRepository.ts | Member repository implementation using Drizzle query API |
| src/infrastructure/drizzle/DrizzleEventRepository.ts | Event repository implementation with complex nested entity management |
| src/infrastructure/drizzle/index.ts | Exports for new Drizzle infrastructure |
| src/executable/member.ts | Updated to use DrizzleMemberRepository |
| src/executable/event.ts | Updated to use DrizzleEventRepository |
| package.json | Replaced Prisma dependencies with Drizzle, updated scripts, removed postinstall hook |
| package-lock.json | Dependency lockfile updates reflecting ORM migration |
| drizzle.config.ts | Drizzle Kit configuration for migrations |
| drizzle/ | Migration files and metadata for Drizzle |
| README.md | Updated database setup instructions for Drizzle |
| prisma/schema.prisma | Removed Prisma schema file |
| src/infrastructure/prisma/ | Removed Prisma repository implementations |
| docs/migration.md | Removed Prisma-specific migration documentation |
```typescript
createdAt: timestamp({ precision: 3, mode: "string" })
  .default(sql`CURRENT_TIMESTAMP`)
  .notNull(),
updatedAt: timestamp({ precision: 3, mode: "string" }).notNull(),
```
The timestamp fields are using mode "string" which returns timestamps as ISO strings instead of JavaScript Date objects. This is inconsistent with how the repositories are handling dates (converting to Date in toDomain methods). Consider using mode "date" for better type consistency, or document this design decision if it's intentional.
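If mode "string" is kept, one way to contain the inconsistency is to centralize the string-to-Date conversion used by the toDomain mappers. The sketch below is illustrative only; `toDate` is a hypothetical helper name, not part of the PR.

```typescript
// Hypothetical helper: normalize Drizzle's mode "string" timestamps
// (ISO-8601 strings) into the Date objects the domain layer expects.
function toDate(value: string | Date): Date {
  return value instanceof Date ? value : new Date(value);
}

// A toDomain mapper would then convert at the boundary:
const createdAt = toDate("2024-01-01T12:00:00.000Z");
console.log(createdAt instanceof Date); // true
```

Switching the columns to mode "date" would remove the need for this conversion entirely, at the cost of regenerating the introspected schema.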
| "files": [ | ||
| "dist", | ||
| "prisma" | ||
| "dist" |
The "prisma" directory has been removed from the "files" array, but the migration approach means that Prisma migration history is still in the repository (in prisma/migrations). Consider whether this directory should still be included for reference, or if it should be completely removed from the repository as well.
| "dist" | |
| "dist", | |
| "prisma" |
```typescript
let pool: Pool | null = null;

function getPool(): Pool {
  if (!pool) {
    const connectionString = process.env.DATABASE_URL;
    if (!connectionString) {
      throw new Error("DATABASE_URL environment variable is not set");
    }
    pool = new Pool({ connectionString });
  }
  return pool;
}
```
The connection pool is initialized as a singleton but never properly cleaned up. In long-running applications, this is generally fine, but for library code distributed as an npm package, consider providing a cleanup function to allow consumers to properly close the connection pool when needed. This is especially important for testing scenarios or when the application needs to gracefully shut down.
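A cleanup hook could look like the sketch below. The pg `Pool` is replaced with a minimal stand-in so the pattern is self-contained; in the real client.ts the `end()` call would be the actual `pool.end()` from node-postgres, and `closePool` is a hypothetical export name.

```typescript
// Minimal stand-in for pg's Pool, which exposes end(): Promise<void>.
interface Closable {
  end(): Promise<void>;
}

let pool: Closable | null = null;

function getPool(): Closable {
  if (!pool) {
    // Stand-in for `new Pool({ connectionString })`.
    pool = { end: async () => {} };
  }
  return pool;
}

// Hypothetical cleanup export: consumers call this on graceful shutdown
// or between tests; the next getPool() creates a fresh pool.
async function closePool(): Promise<void> {
  if (pool) {
    await pool.end();
    pool = null;
  }
}
```

Exposing `closePool` from the package's public entry point would let consumers drain connections without reaching into internals.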
```typescript
export function getDb() {
  return drizzle(getPool(), { schema });
}
```
The getDb function creates a new Drizzle instance on every call, which may lead to inefficient behavior. Consider caching the Drizzle instance similar to how the Pool is cached, or document if this is intentional for specific reasons.
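The caching the comment suggests is a lazy memoizer. The sketch below uses a generic `lazy` helper and a stand-in object instead of the real `drizzle()` call so it runs standalone; names are illustrative, not from the PR.

```typescript
// Generic lazy memoizer: runs the factory once, then returns the cached value.
function lazy<T>(factory: () => T): () => T {
  let cached: T | undefined;
  return () => {
    if (cached === undefined) {
      cached = factory();
    }
    return cached;
  };
}

// In client.ts the factory would be `() => drizzle(getPool(), { schema })`.
let constructed = 0;
const getDb = lazy(() => {
  constructed += 1;
  return { name: "db" }; // stand-in for the Drizzle instance
});

getDb();
getDb();
console.log(constructed); // 1 — the instance is created once and reused
```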
```typescript
    // Upsert member
    const now = new Date().toISOString();
    await db
      .insert(members)
      .values({
        id: snapshot.id,
        name: snapshot.name,
        studentId: snapshot.studentId,
        department: snapshot.department.getValue(),
        email: snapshot.email.getValue(),
        personalEmail: snapshot.personalEmail?.getValue() ?? null,
        updatedAt: now,
      })
      .onConflictDoUpdate({
        target: members.id,
        set: {
          name: snapshot.name,
          studentId: snapshot.studentId,
          department: snapshot.department.getValue(),
          email: snapshot.email.getValue(),
          personalEmail: snapshot.personalEmail?.getValue() ?? null,
          updatedAt: now,
        },
      });

    // Upsert discord accounts
    for (const discordAccount of snapshot.discordAccounts) {
      await db
        .insert(discordAccounts)
        .values({
          discordId: discordAccount.id,
          nickName: discordAccount.nickName,
          memberId: discordAccount.memberId,
          updatedAt: now,
        })
        .onConflictDoUpdate({
          target: discordAccounts.discordId,
          set: {
            nickName: discordAccount.nickName,
            updatedAt: now,
          },
        });
    }
  }

  async delete(memberId: string): Promise<void> {
    const db = getDb();

    await db
      .delete(discordAccounts)
      .where(eq(discordAccounts.memberId, memberId));
    await db.delete(members).where(eq(members.id, memberId));
```
The save method performs multiple sequential database operations without transaction management. If any discord account upsert fails after the member upsert succeeds, the data will be in an inconsistent state. Consider wrapping these operations in a transaction to ensure atomicity.
Suggested change:

```typescript
    // Upsert member and related discord accounts atomically
    const now = new Date().toISOString();
    await db.transaction(async (tx) => {
      // Upsert member
      await tx
        .insert(members)
        .values({
          id: snapshot.id,
          name: snapshot.name,
          studentId: snapshot.studentId,
          department: snapshot.department.getValue(),
          email: snapshot.email.getValue(),
          personalEmail: snapshot.personalEmail?.getValue() ?? null,
          updatedAt: now,
        })
        .onConflictDoUpdate({
          target: members.id,
          set: {
            name: snapshot.name,
            studentId: snapshot.studentId,
            department: snapshot.department.getValue(),
            email: snapshot.email.getValue(),
            personalEmail: snapshot.personalEmail?.getValue() ?? null,
            updatedAt: now,
          },
        });

      // Upsert discord accounts
      for (const discordAccount of snapshot.discordAccounts) {
        await tx
          .insert(discordAccounts)
          .values({
            discordId: discordAccount.id,
            nickName: discordAccount.nickName,
            memberId: discordAccount.memberId,
            updatedAt: now,
          })
          .onConflictDoUpdate({
            target: discordAccounts.discordId,
            set: {
              nickName: discordAccount.nickName,
              updatedAt: now,
            },
          });
      }
    });
  }

  async delete(memberId: string): Promise<void> {
    const db = getDb();
    await db.transaction(async (tx) => {
      await tx
        .delete(discordAccounts)
        .where(eq(discordAccounts.memberId, memberId));
      await tx.delete(members).where(eq(members.id, memberId));
    });
```
```typescript
  private async syncMemberEvents(
    db: DrizzleDb,
    eventId: string,
    memberIds: string[],
  ): Promise<void> {
    await db.delete(memberEvents).where(eq(memberEvents.eventId, eventId));

    const now = new Date().toISOString();
    for (const memberId of memberIds) {
      await db
        .insert(memberEvents)
        .values({
          id: randomUUID(),
          memberId,
          eventId,
          updatedAt: now,
        })
        .onConflictDoNothing();
    }
  }

  private async syncMemberExhibits(
    db: DrizzleDb,
    exhibitId: string,
    memberIds: string[],
  ): Promise<void> {
    await db
      .delete(memberExhibits)
      .where(eq(memberExhibits.exhibitId, exhibitId));

    const now = new Date().toISOString();
    for (const memberId of memberIds) {
      await db
        .insert(memberExhibits)
        .values({
          id: randomUUID(),
          memberId,
          exhibitId,
          updatedAt: now,
        })
        .onConflictDoNothing();
    }
  }
```
The syncMemberEvents and syncMemberExhibits methods use a delete-then-insert pattern with individual inserts in a loop. This is inefficient and could lead to performance issues with many members. Consider using a single batch insert operation after the delete, or better yet, implement a more efficient differential sync that only inserts/deletes changed records.
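The batch-insert half of that suggestion amounts to building all rows up front and issuing a single statement. The sketch below shows the row-building step as self-contained code; `buildMemberEventRows` is a hypothetical helper, and the row shape mirrors the memberEvents columns in the PR.

```typescript
import { randomUUID } from "node:crypto";

interface MemberEventRow {
  id: string;
  memberId: string;
  eventId: string;
  updatedAt: string;
}

// Build one row per member id; each row gets a fresh UUID primary key.
function buildMemberEventRows(
  eventId: string,
  memberIds: string[],
  now: string,
): MemberEventRow[] {
  return memberIds.map((memberId) => ({
    id: randomUUID(),
    memberId,
    eventId,
    updatedAt: now,
  }));
}

// With Drizzle, the per-member loop would then collapse to one statement:
//   await db.insert(memberEvents).values(rows).onConflictDoNothing();
const rows = buildMemberEventRows("ev-1", ["m-1", "m-2"], new Date().toISOString());
console.log(rows.length); // 2
```

A differential sync would go further still, comparing the existing rows against the desired member ids and touching only the rows that changed.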
```typescript
    const exhibitRecords = await db
      .select({ id: exhibits.id })
      .from(exhibits)
      .where(eq(exhibits.eventId, eventId));
    const exhibitIds = exhibitRecords.map((ex) => ex.id);

    if (exhibitIds.length > 0) {
      await db
        .delete(lightningTalks)
        .where(inArray(lightningTalks.exhibitId, exhibitIds));
      await db
        .delete(memberExhibits)
        .where(inArray(memberExhibits.exhibitId, exhibitIds));
    }

    await db.delete(memberEvents).where(eq(memberEvents.eventId, eventId));
    await db.delete(exhibits).where(eq(exhibits.eventId, eventId));
    await db.delete(events).where(eq(events.id, eventId));
```
The delete method performs multiple sequential delete operations without transaction management. If any delete operation fails partway through, related data may be left in an inconsistent state. Consider wrapping all delete operations in a database transaction to ensure atomicity.
Suggested change:

```typescript
    await db.transaction(async (tx) => {
      const exhibitRecords = await tx
        .select({ id: exhibits.id })
        .from(exhibits)
        .where(eq(exhibits.eventId, eventId));
      const exhibitIds = exhibitRecords.map((ex) => ex.id);
      if (exhibitIds.length > 0) {
        await tx
          .delete(lightningTalks)
          .where(inArray(lightningTalks.exhibitId, exhibitIds));
        await tx
          .delete(memberExhibits)
          .where(inArray(memberExhibits.exhibitId, exhibitIds));
      }
      await tx
        .delete(memberEvents)
        .where(eq(memberEvents.eventId, eventId));
      await tx.delete(exhibits).where(eq(exhibits.eventId, eventId));
      await tx.delete(events).where(eq(events.id, eventId));
    });
```
```typescript
export default defineConfig({
  out: "./drizzle",
  schema: "./src/infrastructure/drizzle/schema.ts",
  dialect: "postgresql",
  dbCredentials: {
    url: process.env.DATABASE_URL!,
```
The configuration uses a non-null assertion operator on process.env.DATABASE_URL. If this environment variable is not set, the error will only occur at runtime when drizzle-kit is executed. Consider adding explicit validation with a helpful error message, or handle the case where DATABASE_URL might be undefined.
Suggested change:

```typescript
const databaseUrl = process.env.DATABASE_URL;
if (!databaseUrl) {
  throw new Error(
    "Environment variable DATABASE_URL is not set. Please set it before running drizzle-kit.",
  );
}

export default defineConfig({
  out: "./drizzle",
  schema: "./src/infrastructure/drizzle/schema.ts",
  dialect: "postgresql",
  dbCredentials: {
    url: databaseUrl,
```
```typescript
          updatedAt: now,
        },
      });
  }
```
The save method does not handle removal of discord accounts. If a member previously had discord accounts that are now removed from the snapshot, they will remain in the database. The old Prisma implementation had the same limitation, but this should be addressed in the new implementation.
Suggested change:

```typescript
    }

    // Remove discord accounts that are no longer present in the snapshot
    const existingDiscordAccounts = await db
      .select({
        discordId: discordAccounts.discordId,
        memberId: discordAccounts.memberId,
      })
      .from(discordAccounts)
      .where(eq(discordAccounts.memberId, snapshot.id));
    const snapshotDiscordIds = new Set(
      snapshot.discordAccounts.map((discordAccount) => discordAccount.id),
    );
    for (const existing of existingDiscordAccounts) {
      if (!snapshotDiscordIds.has(existing.discordId)) {
        await db
          .delete(discordAccounts)
          .where(eq(discordAccounts.discordId, existing.discordId));
      }
    }
  }
```
```typescript
    const snapshot = event.toSnapshot();
    const now = new Date().toISOString();
    const dateStr =
      snapshot.date instanceof Date
        ? snapshot.date.toISOString()
        : snapshot.date;

    // 1) Event upsert
    await db
      .insert(events)
      .values({
        id: snapshot.id,
        name: snapshot.name,
        date: dateStr,
        updatedAt: now,
      })
      .onConflictDoUpdate({
        target: events.id,
        set: {
          name: snapshot.name,
          date: dateStr,
          updatedAt: now,
        },
      });

    // 2) Find obsolete exhibits and clean up
    const snapshotExhibitIds = snapshot.exhibits.map((ex) => ex.id);
    await this.deleteObsoleteExhibits(db, snapshot.id, snapshotExhibitIds);

    // 3) Upsert exhibits
    for (const ex of snapshot.exhibits) {
      await this.upsertExhibit(db, snapshot.id, ex);
    }

    // 4) Sync member events
    await this.syncMemberEvents(db, snapshot.id, event.getMemberIds());

    // 5) Sync member exhibits
    for (const exhibitDomain of event.getExhibits()) {
      await this.syncMemberExhibits(
        db,
        exhibitDomain.id,
        exhibitDomain.getMemberIds(),
      );
    }
```
The persistEvent method performs multiple sequential database operations (event upsert, exhibit cleanup, exhibit upserts, member event sync, member exhibit sync) without transaction management. If any operation fails partway through, the event data will be in an inconsistent state. This is a critical issue that should be addressed by wrapping all operations in a database transaction to ensure atomicity.
Suggested change:

```typescript
    await db.transaction(async (tx) => {
      const snapshot = event.toSnapshot();
      const now = new Date().toISOString();
      const dateStr =
        snapshot.date instanceof Date
          ? snapshot.date.toISOString()
          : snapshot.date;

      // 1) Event upsert
      await tx
        .insert(events)
        .values({
          id: snapshot.id,
          name: snapshot.name,
          date: dateStr,
          updatedAt: now,
        })
        .onConflictDoUpdate({
          target: events.id,
          set: {
            name: snapshot.name,
            date: dateStr,
            updatedAt: now,
          },
        });

      // 2) Find obsolete exhibits and clean up
      const snapshotExhibitIds = snapshot.exhibits.map((ex) => ex.id);
      await this.deleteObsoleteExhibits(tx, snapshot.id, snapshotExhibitIds);

      // 3) Upsert exhibits
      for (const ex of snapshot.exhibits) {
        await this.upsertExhibit(tx, snapshot.id, ex);
      }

      // 4) Sync member events
      await this.syncMemberEvents(tx, snapshot.id, event.getMemberIds());

      // 5) Sync member exhibits
      for (const exhibitDomain of event.getExhibits()) {
        await this.syncMemberExhibits(
          tx,
          exhibitDomain.id,
          exhibitDomain.getMemberIds(),
        );
      }
    });
```
Prisma was replaced with Drizzle ORM in #62. These migration files are no longer needed. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
Why
Prisma is problematic to distribute as an npm package (it requires client generation at postinstall, among other issues).
What
🤖 Generated with Claude Code