Commit 64e2674

Fix integration test race condition with AI SDK dynamic imports
## Problem

Integration tests were failing in CI with: "Failed to create model: ReferenceError: You are trying to `import` a file outside of the scope of the test code." This only occurred when multiple tests ran concurrently in CI, not locally.

## Root Cause

AI SDK providers use dynamic imports for lazy loading (to optimize startup time from 6-13s → 3-6s). Under high concurrency in CI (8 workers × 11 test files × concurrent tests within files), Jest/Bun's module resolution has a race condition where multiple simultaneous dynamic imports of the same module can fail.

## Solution

Preload AI SDK providers once during test setup, similar to how we preload tokenizer modules. This ensures subsequent dynamic imports hit the module cache instead of racing.

- Added `preloadAISDKProviders()` function to aiService.ts
- Called during `setupWorkspace()` alongside `loadTokenizerModules()`
- Preserves lazy loading in production (startup optimization)
- Eliminates race condition in concurrent test environment

## Testing

- ✅ Tests pass locally with concurrent execution
- ✅ No impact on production startup time (preload only in tests)
- ✅ No changes to test behavior, only timing/reliability

Fixes the flaky integration test failures in PR #259.
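The fix relies on ES module caching: once a specifier has been loaded, every later `import()` of it resolves to the same cached module namespace object, so concurrent imports can no longer race. A minimal runnable sketch of that behavior, using the built-in `node:path` and `node:url` modules as stand-ins for the AI SDK providers:

```javascript
// Sketch of the module-cache behavior the preload relies on.
// "node:path" / "node:url" stand in for @ai-sdk/anthropic and @ai-sdk/openai.
async function preloadProviders() {
  // Eager load during setup, mirroring preloadAISDKProviders()
  await Promise.all([import("node:path"), import("node:url")]);
}

async function main() {
  await preloadProviders();
  // Later concurrent dynamic imports of the same specifier hit the cache
  // and resolve to the exact same namespace object, so they cannot race.
  const [a, b] = await Promise.all([import("node:path"), import("node:path")]);
  console.log(a === b); // → true
}

main();
```

This is the same principle the preload exploits: pay the import cost once in setup, so every later dynamic import in `createModel()` becomes a cheap cache hit.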
1 parent 60bf3f4 commit 64e2674

File tree

2 files changed

+22
-0
lines changed

src/services/aiService.ts

Lines changed: 17 additions & 0 deletions
```diff
@@ -90,6 +90,23 @@ if (typeof globalFetchWithExtras.certificate === "function") {
   defaultFetchWithExtras.certificate =
     globalFetchWithExtras.certificate.bind(globalFetchWithExtras);
 }
+
+/**
+ * Preload AI SDK provider modules to avoid race conditions in concurrent test environments.
+ * This function loads @ai-sdk/anthropic and @ai-sdk/openai eagerly so that subsequent
+ * dynamic imports in createModel() hit the module cache instead of racing.
+ *
+ * In production, providers are lazy-loaded on first use to optimize startup time.
+ * In tests, we preload them once during setup to ensure reliable concurrent execution.
+ */
+export async function preloadAISDKProviders(): Promise<void> {
+  await Promise.all([
+    import("@ai-sdk/anthropic"),
+    import("@ai-sdk/openai"),
+  ]);
+}
+
+
 export class AIService extends EventEmitter {
   private readonly METADATA_FILE = "metadata.json";
   private readonly streamManager: StreamManager;
```

tests/ipcMain/setup.ts

Lines changed: 5 additions & 0 deletions
```diff
@@ -10,6 +10,7 @@ import { IPC_CHANNELS } from "../../src/constants/ipc-constants";
 import { generateBranchName, createWorkspace } from "./helpers";
 import { shouldRunIntegrationTests, validateApiKeys, getApiKey } from "../testUtils";
 import { loadTokenizerModules } from "../../src/utils/main/tokenizer";
+import { preloadAISDKProviders } from "../../src/services/aiService";
 
 export interface TestEnvironment {
   config: Config;
@@ -154,6 +155,10 @@ export async function setupWorkspace(
   // Without this, tests would use /4 approximation which can cause API errors
   await loadTokenizerModules();
 
+  // Preload AI SDK providers to avoid race conditions with dynamic imports
+  // in concurrent test environments
+  await preloadAISDKProviders();
+
   // Create dedicated temp git repo for this test
   const tempGitRepo = await createTempGitRepo();
```
