1 change: 1 addition & 0 deletions docs/prompt-caching.md
@@ -189,3 +189,4 @@ See ecosystem-specific examples:

- **TypeScript + fetch**: [typescript/fetch/src/prompt-caching/](../typescript/fetch/src/prompt-caching/)
- **AI SDK v5** (Vercel): [typescript/ai-sdk-v5/src/prompt-caching/](../typescript/ai-sdk-v5/src/prompt-caching/)
- **Effect AI** (@effect/ai): [typescript/effect-ai/src/prompt-caching/](../typescript/effect-ai/src/prompt-caching/)
48 changes: 48 additions & 0 deletions typescript/effect-ai/README.md
@@ -0,0 +1,48 @@
# Effect-TS AI Examples

Examples using Effect-TS with @effect/ai and @effect/ai-openrouter for type-safe, composable AI operations.

## Prerequisites

- Bun runtime: `curl -fsSL https://bun.sh/install | bash`
- `OPENROUTER_API_KEY` environment variable

## Running Examples

```bash
# From the monorepo root (typescript/)
bun examples

# Or run only this workspace's examples
cd effect-ai
bun examples
```

## Features

- [prompt-caching](./src/prompt-caching/) - Anthropic caching examples with Effect patterns

### Key Configuration

**CRITICAL**: The Effect AI examples require the following in the model layer config:
```typescript
config: {
  stream_options: { include_usage: true },
}
```

Without this, `usage.cachedInputTokens` will be undefined in the response.
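
For a concrete picture, here is a minimal sketch of enabling the flag and reading the metric. It reuses the same APIs as the examples in this workspace; the model name and prompt are placeholders, not part of the examples themselves:

```typescript
import * as OpenRouterLanguageModel from '@effect/ai-openrouter/OpenRouterLanguageModel';
import * as LanguageModel from '@effect/ai/LanguageModel';
import * as Prompt from '@effect/ai/Prompt';
import { Console, Effect } from 'effect';

// Enable usage reporting so the provider includes cache metrics in `usage`
const OpenRouterModelLayer = OpenRouterLanguageModel.layer({
  model: 'anthropic/claude-3.5-sonnet',
  config: {
    stream_options: { include_usage: true },
  },
});

const checkCache = Effect.gen(function* () {
  const response = yield* LanguageModel.generateText({
    prompt: Prompt.make([{ role: 'user', content: 'Hello' }]),
  });
  // Falls back to 0 when the provider reports no cached tokens
  yield* Console.log(`cached_tokens=${response.usage.cachedInputTokens ?? 0}`);
});
// Provide OpenRouterModelLayer (plus an OpenRouterClient layer) before running checkCache
```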

### Effect Patterns Demonstrated

- `Effect.gen` for generator-based composition
- Layer-based dependency injection
- Type-safe error handling
- Evidence-based validation (asserting on the usage metrics the API actually returns; see the sketch below)
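
A self-contained sketch of the first three patterns, using a hypothetical `Greeter` service that is not part of these examples:

```typescript
import { Context, Data, Effect, Layer } from 'effect';

// Typed, tagged error carried in the Effect error channel
class GreetError extends Data.TaggedError('GreetError')<{ reason: string }> {}

// A service described by a Context.Tag...
class Greeter extends Context.Tag('Greeter')<
  Greeter,
  { readonly greet: (name: string) => Effect.Effect<string, GreetError> }
>() {}

// ...and implemented by a Layer (swappable for tests)
const GreeterLive = Layer.succeed(Greeter, {
  greet: (name) =>
    name.length > 0
      ? Effect.succeed(`Hello, ${name}!`)
      : Effect.fail(new GreetError({ reason: 'empty name' })),
});

// Generator-based composition: dependencies and errors are tracked in the types
const program = Effect.gen(function* () {
  const greeter = yield* Greeter;
  console.log(yield* greeter.greet('OpenRouter'));
});

await program.pipe(Effect.provide(GreeterLive), Effect.runPromise);
```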

## Dependencies

- `@openrouter-examples/shared` - Shared constants (LARGE_SYSTEM_PROMPT) and types
- `@effect/ai` - Effect AI abstractions
- `@effect/ai-openrouter` - OpenRouter provider for Effect AI
- `effect` - Effect-TS core library
21 changes: 21 additions & 0 deletions typescript/effect-ai/package.json
@@ -0,0 +1,21 @@
{
  "name": "@openrouter-examples/effect-ai",
  "version": "1.0.0",
  "private": true,
  "type": "module",
  "scripts": {
    "examples": "bun run run-examples.ts",
    "typecheck": "tsc --noEmit"
  },
  "dependencies": {
    "@openrouter-examples/shared": "workspace:*",
    "@effect/ai": "^0.32.1",
    "@effect/ai-openrouter": "^0.6.0",
    "@effect/platform": "^0.93.0",
    "@effect/platform-bun": "^0.83.0",
    "effect": "^3.19.3"
  },
  "devDependencies": {
    "@types/bun": "latest"
  }
}

Review comment on lines +12 to +16 (the dependency versions): freeze them deps
57 changes: 57 additions & 0 deletions typescript/effect-ai/run-examples.ts
@@ -0,0 +1,57 @@
#!/usr/bin/env bun
/**
 * Run all example files in the src/ directory.
 * Each example is run in a separate process to handle process.exit() calls.
 */

import { readdirSync, statSync } from 'fs';
import { join } from 'path';
import { $ } from 'bun';

const srcDir = join(import.meta.dir, 'src');

// Recursively find all .ts files in src/
function findExamples(dir: string): string[] {
  const entries = readdirSync(dir);
  const files: string[] = [];

  for (const entry of entries) {
    const fullPath = join(dir, entry);
    const stat = statSync(fullPath);

    if (stat.isDirectory()) {
      files.push(...findExamples(fullPath));
    } else if (entry.endsWith('.ts')) {
      files.push(fullPath);
    }
  }

  return files.sort();
}

const examples = findExamples(srcDir);
console.log(`Found ${examples.length} example(s)\n`);

let failed = 0;
for (const example of examples) {
  const relativePath = example.replace(import.meta.dir + '/', '');
  console.log(`\n${'='.repeat(80)}`);
  console.log(`Running: ${relativePath}`);
  console.log('='.repeat(80));

  try {
    // Bun's $ throws on a non-zero exit code; .quiet() suppresses child output
    await $`bun run ${example}`.quiet();
    console.log(`✅ ${relativePath} completed successfully`);
  } catch (error) {
    console.error(`❌ ${relativePath} failed`);
    failed++;
  }
}

console.log(`\n${'='.repeat(80)}`);
console.log(`Results: ${examples.length - failed}/${examples.length} passed`);
console.log('='.repeat(80));

if (failed > 0) {
  process.exit(1);
}
49 changes: 49 additions & 0 deletions typescript/effect-ai/src/prompt-caching/README.md
@@ -0,0 +1,49 @@
# Prompt Caching Examples (Effect AI)

Examples demonstrating prompt caching with @effect/ai and @effect/ai-openrouter.

## Documentation

For full prompt caching documentation including all providers, pricing, and configuration details, see:
- **[Prompt Caching Guide](../../../../docs/prompt-caching.md)**

## Examples in This Directory

See the TypeScript files in this directory for specific examples.

## Effect AI Usage

```typescript
import * as OpenRouterLanguageModel from '@effect/ai-openrouter/OpenRouterLanguageModel';
import * as LanguageModel from '@effect/ai/LanguageModel';
import * as Prompt from '@effect/ai/Prompt';
import { Effect } from 'effect';

const OpenRouterModelLayer = OpenRouterLanguageModel.layer({
  model: 'anthropic/claude-3.5-sonnet',
  config: {
    stream_options: { include_usage: true }, // Required for cache metrics
  },
});

const program = Effect.gen(function* () {
  const response = yield* LanguageModel.generateText({
    prompt: Prompt.make([
      {
        role: 'user',
        content: [
          {
            type: 'text',
            text: 'Large context...',
            options: {
              openrouter: { cacheControl: { type: 'ephemeral' } },
            },
          },
        ],
      },
    ]),
  });

  // Check cache metrics
  const cached = response.usage.cachedInputTokens ?? 0;
});
```

## Effect-Specific Notes

- Use layer-based dependency injection for client and model configuration (see the wiring sketch below)
- `stream_options.include_usage` must be set in the model layer config
- Cache metrics appear in `response.usage.cachedInputTokens`
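
A minimal wiring sketch of that layer composition, lifted from the example in this directory; `program` stands for any Effect that calls `LanguageModel.generateText`, as in the snippet above:

```typescript
import * as OpenRouterClient from '@effect/ai-openrouter/OpenRouterClient';
import * as OpenRouterLanguageModel from '@effect/ai-openrouter/OpenRouterLanguageModel';
import * as LanguageModel from '@effect/ai/LanguageModel';
import { FetchHttpClient } from '@effect/platform';
import { Effect, Layer, Redacted } from 'effect';

// `program` as defined in the usage snippet above
declare const program: Effect.Effect<void, unknown, LanguageModel.LanguageModel>;

// HTTP client -> OpenRouter client -> language model, composed as layers
const OpenRouterClientLayer = OpenRouterClient.layer({
  apiKey: Redacted.make(process.env.OPENROUTER_API_KEY!),
}).pipe(Layer.provide(FetchHttpClient.layer));

const OpenRouterModelLayer = OpenRouterLanguageModel.layer({
  model: 'anthropic/claude-3.5-sonnet',
  config: {
    stream_options: { include_usage: true }, // required for cache metrics
  },
}).pipe(Layer.provide(OpenRouterClientLayer));

// Provide the model layer to the program and run it
await program.pipe(Effect.provide(OpenRouterModelLayer), Effect.runPromise);
```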
@@ -0,0 +1,119 @@
/**
 * Example: Anthropic Prompt Caching - Multi-Message Conversation (Effect AI)
 *
 * This example demonstrates Anthropic prompt caching in a multi-message conversation
 * via OpenRouter using Effect AI.
 *
 * Pattern: User message cache in multi-turn conversation using Effect patterns
 */

import * as OpenRouterClient from '@effect/ai-openrouter/OpenRouterClient';
import * as OpenRouterLanguageModel from '@effect/ai-openrouter/OpenRouterLanguageModel';
import * as LanguageModel from '@effect/ai/LanguageModel';
import * as Prompt from '@effect/ai/Prompt';
import { FetchHttpClient } from '@effect/platform';
import * as BunContext from '@effect/platform-bun/BunContext';
import { LARGE_SYSTEM_PROMPT } from '@openrouter-examples/shared/constants';
import { Console, Effect, Layer, Redacted } from 'effect';

const program = Effect.gen(function* () {
  const testId = Date.now();
  const largeContext = `Test ${testId}: Context:\n\n${LARGE_SYSTEM_PROMPT}`;

  yield* Console.log(
    '╔════════════════════════════════════════════════════════════════════════════╗',
  );
  yield* Console.log(
    '║            Anthropic Prompt Caching - Multi-Message (Effect AI)             ║',
  );
  yield* Console.log(
    '╚════════════════════════════════════════════════════════════════════════════╝',
  );
  yield* Console.log('');
  yield* Console.log('Testing cache_control in multi-turn conversation');
  yield* Console.log('');

  const makePrompt = () =>
    Prompt.make([
      {
        role: 'user' as const,
        content: [
          {
            type: 'text' as const,
            text: largeContext,
            options: {
              openrouter: {
                cacheControl: { type: 'ephemeral' as const },
              },
            },
          },
          {
            type: 'text' as const,
            text: "Hello, what's your purpose?",
          },
        ],
      },
      {
        role: 'assistant' as const,
        content: "I'm an AI assistant designed to help with various tasks.",
      },
      {
        role: 'user' as const,
        content: 'What programming languages do you know?',
      },
    ]);

  yield* Console.log('First Call (Cache Miss Expected)');
  const response1 = yield* LanguageModel.generateText({
    prompt: makePrompt(),
  });
  const cached1 = response1.usage.cachedInputTokens ?? 0;
  yield* Console.log(`  Response: ${response1.text.substring(0, 80)}...`);
  yield* Console.log(`  cached_tokens=${cached1}`);

  yield* Effect.sleep('1 second');

  yield* Console.log('\nSecond Call (Cache Hit Expected)');
  const response2 = yield* LanguageModel.generateText({
    prompt: makePrompt(),
  });
  const cached2 = response2.usage.cachedInputTokens ?? 0;
  yield* Console.log(`  Response: ${response2.text.substring(0, 80)}...`);
  yield* Console.log(`  cached_tokens=${cached2}`);

  // Analysis
  yield* Console.log('\n' + '='.repeat(80));
  yield* Console.log('ANALYSIS');
  yield* Console.log('='.repeat(80));
  yield* Console.log(`First call: cached_tokens=${cached1} (expected: 0)`);
  yield* Console.log(`Second call: cached_tokens=${cached2} (expected: >0)`);

  const success = cached1 === 0 && cached2 > 0;

  if (success) {
    yield* Console.log('\n✓ SUCCESS - Multi-message caching is working correctly');
  } else {
    yield* Console.log('\n✗ FAILURE - Multi-message caching is not working as expected');
  }

  yield* Console.log('='.repeat(80));
});

const OpenRouterClientLayer = OpenRouterClient.layer({
  apiKey: Redacted.make(process.env.OPENROUTER_API_KEY!),
}).pipe(Layer.provide(FetchHttpClient.layer));

const OpenRouterModelLayer = OpenRouterLanguageModel.layer({
  model: 'anthropic/claude-3.5-sonnet',
  config: {
    stream_options: { include_usage: true },
  },
}).pipe(Layer.provide(OpenRouterClientLayer));

await program.pipe(
  Effect.provide(OpenRouterModelLayer),
  Effect.provide(BunContext.layer),
  Effect.runPromise,
);

console.log('\n✓ Program completed successfully');