Commit: 13 changed files with 624 additions and 88 deletions.
# AI.JSX — The AI Application Framework for JavaScript

[![Docs Site](https://img.shields.io/badge/Docs%20Site-docs.ai--jsx.com-orange)](https://docs.ai-jsx.com)
[![Discord Follow](https://dcbadge.vercel.app/api/server/MsKAeKF8kU?style=flat)](https://discord.gg/MsKAeKF8kU)
[![Twitter Follow](https://img.shields.io/twitter/follow/fixieai?style=social)](https://twitter.com/fixieai)

[<img src="../docs/static/img/loom.png" alt="AI.JSX intro video" width="50%">](https://www.loom.com/share/c13b0c73f8d34f9f962048b39a4794f6?sid=a693f220-ccb1-4e70-913a-c2eda86dd0ce)

AI.JSX is a framework for building AI applications using JavaScript and [JSX](https://react.dev/learn/writing-markup-with-jsx). While AI.JSX [is not React](https://docs.ai-jsx.com/is-it-react), it's designed to look and feel very similar while also integrating seamlessly with React-based projects. With AI.JSX, you don't just use JSX to describe what your UI should look like; you also use it to describe how **Large Language Models**, such as ChatGPT, should integrate into the rest of your application. The end result is a powerful combination where _intelligence_ can be deeply embedded into the application stack.

AI.JSX is designed to give you two important capabilities out of the box:

1. An intuitive mechanism for orchestrating multiple LLM calls, expressed as modular, reusable components.
1. The ability to seamlessly interweave UI components with your AI components. This means you can rely on the LLM to construct your UI dynamically from a set of provided React components.

AI.JSX can be used to create standalone LLM applications that can be deployed anywhere Node.js is supported, or it can be used as part of a larger React application. For an example of how to integrate AI.JSX into a React project, see the [NextJS demo package](/packages/nextjs-demo/) or [follow the tutorial](https://docs.ai-jsx.com/tutorial/part5). For an overview of all deployment architectures, see the [architecture overview](https://docs.ai-jsx.com/guides/architecture).

For more details on how AI.JSX works with React in general, see our [AI+UI guide](https://docs.ai-jsx.com/guides/ai-ui).

## Quickstart

1. Follow the [Getting Started Guide](https://docs.ai-jsx.com/getting-started)
1. Run through the [tutorial](https://docs.ai-jsx.com/category/tutorial)
1. Clone our [Hello World template](https://github.com/fixie-ai/ai-jsx-template) to start hacking
1. Check out the different examples in the [examples package](https://github.com/fixie-ai/ai-jsx/tree/main/packages/examples)
1. If you're new to AI, read the [Guide for AI Newcomers](https://docs.ai-jsx.com/guides/brand-new)

## Examples

You can play with live demos on our [live demo app](https://ai-jsx-nextjs-demo.vercel.app/) ([source](../nextjs-demo/)).

Here is a simple example using AI.JSX to generate an AI response to a prompt:
```tsx
import * as AI from 'ai-jsx';
import { ChatCompletion, UserMessage } from 'ai-jsx/core/completion';

const app = (
  <ChatCompletion>
    <UserMessage>Write a Shakespearean sonnet about AI models.</UserMessage>
  </ChatCompletion>
);
const renderContext = AI.createRenderContext();
const response = await renderContext.render(app);
console.log(response);
```

Here's a more complex example that uses the built-in `<Inline>` component to progressively generate multiple fields in a JSON object:

```tsx
// Import paths assume the standard ai-jsx package layout.
import * as AI from 'ai-jsx';
import { Node } from 'ai-jsx';
import { Completion } from 'ai-jsx/core/completion';
import { Inline } from 'ai-jsx/core/inline';

function CharacterGenerator() {
  const inlineCompletion = (prompt: Node) => (
    <Completion stop={['"']} temperature={1.0}>
      {prompt}
    </Completion>
  );

  return (
    <Inline>
      Generate a character profile for a fantasy role-playing game in JSON format.{'\n'}
      {'{'}
      {'\n '}"name": "{inlineCompletion}",
      {'\n '}"class": "{inlineCompletion}",
      {'\n '}"race": "{inlineCompletion}",
      {'\n '}"alignment": "{inlineCompletion}",
      {'\n '}"weapons": "{inlineCompletion}",
      {'\n '}"spells": "{inlineCompletion}"
      {'\n}'}
    </Inline>
  );
}
```

For a full set of examples, see [the examples package](https://github.com/fixie-ai/ai-jsx/tree/main/packages/examples).

## Features

- Prompt engineering through modular, reusable components
- The ability to easily switch between model providers and LLM configurations (e.g., temperature)
- Built-in support for Tools (the ReAct pattern), Document Question and Answering, Chain of Thought, and more
- The ability to directly interweave LLM calls with standard UI components, including having the LLM render the UI from a set of provided components
- Built-in streaming support
- First-class support for NextJS and Create React App (more coming soon)
- Full support for LangChainJS
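The streaming support noted above follows an append-only pattern: the model yields chunks, and the response so far is their in-order concatenation. Here is a minimal standalone sketch of that pattern using plain async generators — no AI.JSX APIs are used, and the names (`modelStream`, `renderToString`) are illustrative, not part of the framework:

```typescript
// Illustrative append-only streaming: each yielded chunk extends the
// response, and the final result is the in-order concatenation.
async function* modelStream(): AsyncGenerator<string> {
  for (const chunk of ['Shall ', 'I ', 'compare ', 'thee...']) {
    yield chunk;
  }
}

async function renderToString(stream: AsyncIterable<string>): Promise<string> {
  let result = '';
  for await (const chunk of stream) {
    result += chunk; // append-only: earlier output is never rewritten
  }
  return result;
}

const text = await renderToString(modelStream());
console.log(text); // 'Shall I compare thee...'
```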

## Contributing

We welcome contributions! See [Contributing](../docs/docs/contributing/index.md) for how to get started.
import AnthropicSDK from '@anthropic-ai/sdk';
import { getEnvVar } from './util.js';
import * as AI from '../index.js';
import { Node } from '../index.js';
import { ChatOrCompletionModelOrBoth } from './model.js';
import {
  AssistantMessage,
  ChatProvider,
  FunctionCall,
  FunctionResponse,
  ModelProps,
  SystemMessage,
  UserMessage,
  ModelPropsWithChildren,
} from '../core/completion.js';
import { AIJSXError, ErrorCode } from '../core/errors.js';

export const anthropicClientContext = AI.createContext<AnthropicSDK>(
  new AnthropicSDK({
    apiKey: getEnvVar('ANTHROPIC_API_KEY', false),
  })
);

type ValidCompletionModel = never;
/**
 * The set of valid Claude models.
 *
 * @see https://docs.anthropic.com/claude/reference/complete_post.
 */
type ValidChatModel =
  | 'claude-1'
  | 'claude-1-100k'
  | 'claude-instant-1'
  | 'claude-instant-1-100k'
  | 'claude-1.3'
  | 'claude-1.3-100k'
  | 'claude-1.2'
  | 'claude-1.0'
  | 'claude-instant-1.1'
  | 'claude-instant-1.1-100k'
  | 'claude-instant-1.0';

type AnthropicModelChoices = ChatOrCompletionModelOrBoth<ValidChatModel, ValidCompletionModel>;

/**
 * If you use an Anthropic model without specifying the max tokens for the completion, this value will be used as the default.
 */
export const defaultMaxTokens = 1000;

/**
 * An AI.JSX component that invokes an Anthropic Large Language Model.
 * @param children The children to render.
 * @param chatModel The chat model to use.
 * @param completionModel The completion model to use.
 * @param client The Anthropic client.
 */
export function Anthropic({
  children,
  chatModel,
  completionModel,
  client,
  ...defaults
}: { children: Node; client?: AnthropicSDK } & AnthropicModelChoices & ModelProps) {
  let result = children;

  if (client) {
    result = <anthropicClientContext.Provider value={client}>{children}</anthropicClientContext.Provider>;
  }

  if (chatModel) {
    result = (
      <ChatProvider component={AnthropicChatModel} {...defaults} model={chatModel}>
        {result}
      </ChatProvider>
    );
  }

  // TS is correct that this should never happen, but we'll check for it anyway.
  // eslint-disable-next-line @typescript-eslint/no-unnecessary-condition
  if (completionModel) {
    throw new AIJSXError(
      'Completion models are not supported by Anthropic',
      ErrorCode.AnthropicDoesNotSupportCompletionModels,
      'user'
    );
  }

  return result;
}

interface AnthropicChatModelProps extends ModelPropsWithChildren {
  model: ValidChatModel;
}
export async function* AnthropicChatModel(
  props: AnthropicChatModelProps,
  { render, getContext, logger }: AI.ComponentContext
): AI.RenderableStream {
  const messageElements = await render(props.children, {
    stop: (e) =>
      e.tag == SystemMessage ||
      e.tag == UserMessage ||
      e.tag == AssistantMessage ||
      e.tag == FunctionCall ||
      e.tag == FunctionResponse,
  });
  yield AI.AppendOnlyStream;
  const messages = await Promise.all(
    messageElements.filter(AI.isElement).map(async (message) => {
      switch (message.tag) {
        case UserMessage:
          return `${AnthropicSDK.HUMAN_PROMPT}: ${await render(message)}`;
        case AssistantMessage:
          return `${AnthropicSDK.AI_PROMPT}: ${await render(message)}`;
        case SystemMessage:
          throw new AIJSXError(
            'Anthropic models do not support SystemMessage. Change your user message to instruct the model what to do.',
            ErrorCode.AnthropicDoesNotSupportSystemMessage,
            'user'
          );
        case FunctionCall:
        case FunctionResponse:
          throw new AIJSXError(
            'Anthropic models do not support functions.',
            ErrorCode.AnthropicDoesNotSupportFunctions,
            'user'
          );
        default:
          throw new AIJSXError(
            `ChatCompletion's prompts must be UserMessage or AssistantMessage, but this child was ${message.tag.name}`,
            ErrorCode.ChatCompletionUnexpectedChild,
            'internal'
          );
      }
    })
  );

  if (!messages.length) {
    throw new AIJSXError(
      "ChatCompletion must have at least one child that's UserMessage or AssistantMessage, but no such children were found.",
      ErrorCode.ChatCompletionMissingChildren,
      'user'
    );
  }

  messages.push(AnthropicSDK.AI_PROMPT);

  const anthropic = getContext(anthropicClientContext);
  const anthropicCompletionRequest: AnthropicSDK.CompletionCreateParams = {
    prompt: messages.join('\n\n'),
    max_tokens_to_sample: props.maxTokens ?? defaultMaxTokens,
    temperature: props.temperature,
    model: props.model,
    stop_sequences: props.stop,
    stream: true,
  };

  logger.debug({ anthropicCompletionRequest }, 'Calling createCompletion');

  let response: Awaited<ReturnType<typeof anthropic.completions.create>>;
  try {
    response = await anthropic.completions.create(anthropicCompletionRequest);
  } catch (err) {
    if (err instanceof AnthropicSDK.APIError) {
      throw new AIJSXError(
        err.message,
        ErrorCode.AnthropicAPIError,
        'runtime',
        Object.fromEntries(Object.entries(err))
      );
    }
    throw err;
  }
  let resultSoFar = '';
  for await (const completion of response) {
    resultSoFar += completion.completion;
    logger.trace({ completion }, 'Got Anthropic stream event');
    yield completion.completion;
  }

  logger.debug({ completion: resultSoFar }, 'Anthropic completion finished');

  return AI.AppendOnlyStream;
}
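The message mapping and prompt assembly in `AnthropicChatModel` can be illustrated with a standalone sketch. The constants below mirror `AnthropicSDK.HUMAN_PROMPT` (`'\n\nHuman:'`) and `AnthropicSDK.AI_PROMPT` (`'\n\nAssistant:'`); `buildPrompt` is a hypothetical helper, not part of the file above:

```typescript
// Constants mirroring AnthropicSDK.HUMAN_PROMPT / AI_PROMPT.
const HUMAN_PROMPT = '\n\nHuman:';
const AI_PROMPT = '\n\nAssistant:';

interface ChatMessage {
  role: 'user' | 'assistant';
  text: string;
}

// Hypothetical helper reproducing the assembly above: map each message to a
// prefixed string, append a trailing AI_PROMPT to cue the assistant's next
// turn, and join with blank lines. Note the doubled colon ('Human:: ...')
// comes from the `${HUMAN_PROMPT}: ` interpolation in the source.
function buildPrompt(messages: ChatMessage[]): string {
  const parts = messages.map((m) =>
    m.role === 'user' ? `${HUMAN_PROMPT}: ${m.text}` : `${AI_PROMPT}: ${m.text}`
  );
  parts.push(AI_PROMPT);
  return parts.join('\n\n');
}

const prompt = buildPrompt([{ role: 'user', text: 'Hello!' }]);
// prompt === '\n\nHuman:: Hello!\n\n\n\nAssistant:'
```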
/**
 * Helper for model components. This type is used to create prop types that must include at least a chatModel or a completionModel.
 *
 * @hidden
 */
export type ChatOrCompletionModelOrBoth<ValidChatModel extends string, ValidCompletionModel extends string> =
  | { chatModel: ValidChatModel; completionModel?: ValidCompletionModel } 
  | { chatModel?: ValidChatModel; completionModel: ValidCompletionModel };
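As a usage sketch of this union (the model names and the `pickModel` helper below are hypothetical, for illustration only), callers may supply either field or both, but never neither:

```typescript
// Redeclared locally so the sketch is self-contained.
type ChatOrCompletionModelOrBoth<C extends string, K extends string> =
  | { chatModel: C; completionModel?: K }
  | { chatModel?: C; completionModel: K };

// Hypothetical provider with one chat model and one completion model.
type Choices = ChatOrCompletionModelOrBoth<'chat-a', 'comp-a'>;

const ok1: Choices = { chatModel: 'chat-a' }; // valid
const ok2: Choices = { completionModel: 'comp-a' }; // valid
const ok3: Choices = { chatModel: 'chat-a', completionModel: 'comp-a' }; // valid
// const bad: Choices = {}; // type error: at least one field is required

// Hypothetical helper: prefer the chat model when both are present.
function pickModel(choices: Choices): string {
  return choices.chatModel ?? choices.completionModel!;
}
console.log(pickModel(ok1)); // 'chat-a'
```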
92b6e0f — Successfully deployed to the following URLs:

ai-jsx-docs – ./packages/docs

- ai-jsx-docs-git-main-fixie-ai.vercel.app
- docs.ai-jsx.com
- ai-jsx-docs-fixie-ai.vercel.app
- ai-jsx-docs.vercel.app

ai-jsx-tutorial-nextjs – ./packages/tutorial-nextjs

- ai-jsx-tutorial-nextjs.vercel.app
- ai-jsx-tutorial-nextjs-fixie-ai.vercel.app
- ai-jsx-tutorial-nextjs-git-main-fixie-ai.vercel.app

ai-jsx-nextjs-demo – ./packages/nextjs-demo

- ai-jsx-nextjs-demo-git-main-fixie-ai.vercel.app
- ai-jsx-nextjs-demo.vercel.app
- ai-jsx-nextjs-demo-fixie-ai.vercel.app