9 changes: 4 additions & 5 deletions .github/workflows/quality-checks.yml
@@ -6,9 +6,10 @@ concurrency:
cancel-in-progress: true
on:
pull_request:
push:
jobs:
quality-checks:
runs-on: ubuntu-20.04
runs-on: ubuntu-latest
timeout-minutes: 10
steps:
- name: Checkout
@@ -19,11 +20,9 @@ jobs:
with:
node-version: 20.18.0
cache: 'npm'

- name: Install dependencies
run: npm install

- name: Run quality checks
run: npm run quality


165 changes: 147 additions & 18 deletions README.md
@@ -17,9 +17,9 @@ The AI21 API Client is a TypeScript library that provides a convenient interface
- [Examples](#examples-tldr) 🗂️
- [AI21 Official Documentation](#Documentation)
- [Chat](#Chat)
- [Maestro (Beta)](#Maestro)
- [Conversational RAG (Beta)](#Conversational-RAG)
- [Files](#Files)

- [Library](#Library)

## Environment Support

@@ -32,7 +32,7 @@ This client supports both Node.js and browser environments:
// Browser usage example
const client = new AI21({
apiKey: process.env.AI21_API_KEY, // or pass it in directly
dangerouslyAllowBrowser: true // Required for browser environments
dangerouslyAllowBrowser: true, // Required for browser environments
});
```

@@ -64,7 +64,6 @@ Feel free to dive in, experiment, and adapt these examples to suit your needs. W

The full documentation for the REST API can be found on [docs.ai21.com](https://docs.ai21.com/).


## Chat

To use the AI21 API Client, you'll need an API key, which you can obtain by signing up for an account on the AI21 website.
@@ -82,7 +81,9 @@ const client = new AI21({

const response = await client.chat.completions.create({
model: 'jamba-mini',
messages: [{ role: 'user', content: 'Hello, how are you? tell me a 100 line story about a cat named "Fluffy"' }],
messages: [
{ role: 'user', content: 'Hello, how are you? tell me a 100 line story about a cat named "Fluffy"' },
],
});

console.log(response);
@@ -105,38 +106,121 @@ for await (const chunk of streamResponse) {
console.log(chunk.choices[0]?.delta?.content || '');
}
```
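
The call that produces `streamResponse` is not shown in this hunk. As a minimal sketch, assuming `chat.completions.create` accepts a `stream: true` flag (an assumption, not confirmed by this diff):

```typescript
// Hedged sketch: the stream option is assumed and mirrors the non-streaming example above.
const streamResponse = await client.chat.completions.create({
  model: 'jamba-mini',
  messages: [{ role: 'user', content: 'Tell me a short story about the ocean' }],
  stream: true, // assumed flag that turns the response into an async iterable of chunks
});
```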

---
### Files

## Maestro (Beta)

The `AI21` class provides a `files` property that gives you access to the Files API. You can use it to upload, retrieve, update, list, and delete files.
AI21 Maestro is an advanced AI orchestration platform that can intelligently use tools and manage complex workflows. The `AI21` class provides a `beta.maestro` property that gives you access to the Maestro API.

### Basic Usage

```typescript
import { AI21 } from 'ai21';

const client = new AI21({
apiKey: process.env.AI21_API_KEY, // or pass it in directly
apiKey: process.env.AI21_API_KEY,
});

const fileUploadResponse = await client.files.create({
file: './articles/article1.pdf',
labels: ['science', 'biology'],
path: 'virtual-path/to/science-articles',
// Create a maestro run
const run = await client.beta.maestro.runs.create({
input: 'Analyze the latest market trends in renewable energy',
});

// Get the run status
const result = await client.beta.maestro.runs.get(run.id);
console.log(result);
```
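
If you prefer to poll manually instead of using `createAndPoll()` (shown next), a loop along these lines works. Note that the `status` values and the `result` field below are assumptions about the run object's shape, not confirmed by this diff:

```typescript
// Hedged sketch: manual polling. The 'completed'/'failed' status values and the
// result field are assumed; adjust them to the actual run object returned by the API.
let current = await client.beta.maestro.runs.get(run.id);
while (current.status !== 'completed' && current.status !== 'failed') {
  await new Promise((resolve) => setTimeout(resolve, 2000)); // wait 2 seconds between polls
  current = await client.beta.maestro.runs.get(run.id);
}
console.log(current.result);
```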

### Create and Poll (Recommended)

const file = await client.files.get(fileUploadResponse.fileId);
For convenience, you can use `createAndPoll()`, which automatically waits for the run to complete:

```typescript
const result = await client.beta.maestro.runs.createAndPoll(
{
input: 'Write a comprehensive report on AI trends in 2024',
tools: [{ type: 'web_search' }],
budget: 'medium',
},
{
timeout: 30000, // 30 seconds
interval: 2000, // Poll every 2 seconds
},
);

console.log(result.result);
```

### Advanced Features

Maestro supports various advanced configurations:

```typescript
const advancedRun = await client.beta.maestro.runs.createAndPoll({
input: 'Research sustainable energy solutions',

// Specify tools to use
tools: [
{
type: 'web_search',
urls: ['https://example.com', 'https://research.org'],
},
{
type: 'file_search',
file_ids: ['file-123', 'file-456'],
labels: ['research', 'energy'],
},
],

// Define specific requirements
requirements: [
{
name: 'Data Sources',
description: 'Include at least 3 credible sources',
isMandatory: true,
},
{
name: 'Length',
description: 'Report should be 1000-2000 words',
isMandatory: false,
},
],

// Specify models to use
models: ['jamba-large'],

// Control resource allocation
budget: 'high',

// Request additional fields in response
include: ['data_sources', 'requirements_result'],

// Set response language
response_language: 'english',
});
```

### Structured Input Messages

You can also provide structured input messages:

```typescript
const structuredRun = await client.beta.maestro.runs.create({
input: [
{ role: 'system', content: 'You are a research assistant specializing in technology trends.' },
{ role: 'user', content: 'What are the emerging AI technologies in healthcare?' },
],
tools: [{ type: 'web_search' }],
});
```

---
### Conversational-RAG

## Conversational RAG (Beta)

The `AI21` class provides a `conversationalRag` property that gives you access to the Conversational RAG API. You can use it to ask questions that are answered based on the files you uploaded.


```typescript
import { AI21 } from 'ai21';

@@ -145,11 +229,56 @@ const client = new AI21({
});

const convRagResponse = await client.conversationalRag.create({
messages: [{ role: 'user', content: 'This question presumes that the answer can be found within the uploaded files.' }],
});

messages: [
{
role: 'user',
content: 'This question presumes that the answer can be found within the uploaded files.',
},
],
});
```

---

## Library

The `AI21` class provides a `library` property that gives you access to the Library Files API. This is a separate file management system designed for library-specific operations.

### Library Files Management

```typescript
import { AI21 } from 'ai21';

const client = new AI21({
apiKey: process.env.AI21_API_KEY,
});

// Upload a file to the library
const libraryFile = await client.library.files.create({
file: './documents/research-paper.pdf',
labels: ['research', 'academic'],
path: 'library/research/papers',
});

// Get a library file
const file = await client.library.files.get(libraryFile.fileId);

// List library files with filters
const files = await client.library.files.list({
labels: ['research'],
path: 'library/research',
});

// Update a library file
await client.library.files.update({
fileId: libraryFile.fileId,
labels: ['research', 'academic', 'updated'],
path: 'library/research/updated-papers',
});

// Delete a library file
await client.library.files.delete(libraryFile.fileId);
```
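
Uploaded files may take a moment to process before they can be used. A minimal wait loop, mirroring the `waitForFileProcessing` helper in the `rag-engine.ts` example later in this PR (the `PROCESSING` status value comes from that example; a real helper should also enforce a timeout):

```typescript
// Poll until the file leaves the PROCESSING state (status value taken from
// the rag-engine.ts example in this PR).
let uploaded = await client.library.files.get(libraryFile.fileId);
while (uploaded.status === 'PROCESSING') {
  await new Promise((resolve) => setTimeout(resolve, 3000)); // wait 3 seconds between checks
  uploaded = await client.library.files.get(libraryFile.fileId);
}
console.log(`File status: ${uploaded.status}`);
```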

## Configuration

12 changes: 6 additions & 6 deletions examples/studio/conversational-rag/rag-engine.ts
@@ -15,7 +15,7 @@ async function waitForFileProcessing(
const startTime = Date.now();

while (Date.now() - startTime < timeout) {
const file: FileResponse = await client.files.get(fileId);
const file: FileResponse = await client.library.files.get(fileId);
if (file.status !== 'PROCESSING') {
return file;
}
@@ -29,7 +29,7 @@ async function uploadGetUpdateDelete(fileInput, path) {
const client = new AI21({ apiKey: process.env.AI21_API_KEY });
try {
console.log(`Starting upload for file:`, typeof fileInput);
const uploadFileResponse: UploadFileResponse = await client.files.create({
const uploadFileResponse: UploadFileResponse = await client.library.files.create({
file: fileInput,
path: path,
});
@@ -41,20 +41,20 @@ async function uploadGetUpdateDelete(fileInput, path) {

if (file.status === 'PROCESSED') {
console.log('Starting file update...');
await client.files.update({
await client.library.files.update({
fileId: uploadFileResponse.fileId,
labels: ['test99'],
publicUrl: 'https://www.miri.com',
});
file = await client.files.get(uploadFileResponse.fileId);
file = await client.library.files.get(uploadFileResponse.fileId);
console.log('✓ File update completed');
} else {
console.log(`⚠ File processing failed with status ${file.status}`);
return; // Exit early if processing failed
}

console.log('Starting file deletion...');
await client.files.delete(uploadFileResponse.fileId);
await client.library.files.delete(uploadFileResponse.fileId);
console.log('✓ File deletion completed');

// Add buffer time between operations
@@ -67,7 +67,7 @@ async function uploadGetUpdateDelete(fileInput, path) {

async function listFiles() {
const client = new AI21({ apiKey: process.env.AI21_API_KEY });
const files = await client.files.list({ limit: 4 });
const files = await client.library.files.list({ limit: 4 });
console.log(`Listed files: ${files}`);
}

32 changes: 32 additions & 0 deletions examples/studio/maestro/maestro-run-create-and-poll.ts
@@ -0,0 +1,32 @@
import { AI21 } from 'ai21';

const TIMEOUT = 20000;
const INTERVAL = 1500;

async function main() {
const client = new AI21({ apiKey: process.env.AI21_API_KEY });

const response = await client.beta.maestro.runs.createAndPoll(
{
input: 'Write a poem about the ocean',
requirements: [
{
name: 'length requirement',
description: 'The length of the poem should be less than 1000 characters',
},
{
name: 'rhyme requirement',
description: 'The poem should rhyme',
},
],
include: ['requirements_result'],
},
{
timeout: TIMEOUT,
interval: INTERVAL,
},
);
console.log(response);
}

main().catch(console.error);
15 changes: 15 additions & 0 deletions examples/studio/maestro/maestro-run.ts
@@ -0,0 +1,15 @@
import { AI21 } from 'ai21';

async function main() {
const client = new AI21({ apiKey: process.env.AI21_API_KEY });

const { id } = await client.beta.maestro.runs.create({
input: 'Hello, how are you? tell me a short story about a wizard',
});

const response = await client.beta.maestro.runs.get(id);

console.log(response);
}

main().catch(console.error);