Releases: axflow/axflow
Add support for Gemini
```ts
import { StreamToIterable } from '@axflow/models/shared';
import { GoogleGenerateContent } from '@axflow/models/google/generate-content';

const stream = await GoogleGenerateContent.stream(
  {
    model: 'gemini-pro',
    contents: [
      {
        parts: [
          {
            text: 'Write a two sentence story about a magic backpack',
          },
        ],
      },
    ],
  },
  {
    apiKey: process.env.GOOGLE_API_KEY,
  },
);

for await (const chunk of StreamToIterable(stream)) {
  console.log(chunk);
}
```
Add support for Together.ai inference endpoints
For example, we can run Llama 2 70B on Together's inference endpoint with:
```ts
import { StreamToIterable } from '@axflow/models/shared';
import { TogetherAIInference } from '@axflow/models/togetherai/inference';

const stream = await TogetherAIInference.stream(
  {
    model: 'togethercomputer/llama-2-70b-chat',
    prompt: '[INST] Using no more than 20 words, what is the Eiffel tower? [/INST] ',
    max_tokens: 250,
  },
  {
    apiKey: process.env.TOGETHERAI_API_KEY,
  },
);

for await (const chunk of StreamToIterable(stream)) {
  console.log(chunk);
}
```
@axflow/models 0.0.23
Support OpenAI tools:
- Multiple tools per API call
- Support streaming partial tools with the new `toolCallsAccessor` callback
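A request with multiple tools in a single call might look like the sketch below. The two tool definitions (`getWeather` and `getTime`) are illustrative assumptions, not part of the release; the request body follows OpenAI's chat completions schema.

```ts
// Illustrative sketch: two tools in one chat completion request.
// The getWeather/getTime tool definitions are hypothetical examples.
const request = {
  model: 'gpt-4-1106-preview',
  messages: [{ role: 'user', content: 'What is the weather and time in Paris?' }],
  tools: [
    {
      type: 'function',
      function: {
        name: 'getWeather',
        description: 'Look up current weather for a city',
        parameters: {
          type: 'object',
          properties: { city: { type: 'string' } },
          required: ['city'],
        },
      },
    },
    {
      type: 'function',
      function: {
        name: 'getTime',
        description: 'Look up the current local time for a city',
        parameters: {
          type: 'object',
          properties: { city: { type: 'string' } },
          required: ['city'],
        },
      },
    },
  ],
};

// With an API key available, the body could then be sent through the client, e.g.:
// const response = await OpenAIChat.run(request, { apiKey: process.env.OPENAI_API_KEY });
console.log(request.tools.length);
```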
Support new OpenAI types after dev day
OpenAI made some changes to their SDK during their 2023 dev day; Axflow now supports them:
- Renaming of `functions` to `tools`
- Support for `json_mode`
- Support for `seed` and deterministic calls
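A request exercising `json_mode` and `seed` together might look like the following sketch. The model name and seed value are placeholder assumptions; the body follows OpenAI's chat completions schema.

```ts
// Illustrative sketch: JSON mode plus a fixed seed for (more) deterministic output.
const request = {
  model: 'gpt-4-1106-preview',
  messages: [
    { role: 'system', content: 'Reply with a JSON object.' },
    { role: 'user', content: 'List three primary colors.' },
  ],
  // json_mode: constrain the model to emit valid JSON.
  response_format: { type: 'json_object' as const },
  // seed: identical requests with the same seed aim for identical outputs.
  seed: 42,
};

// const response = await OpenAIChat.run(request, { apiKey: process.env.OPENAI_API_KEY });
console.log(request.response_format.type, request.seed);
```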
Support Cohere v3 embedding models
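A v3 embedding request might look like the sketch below. Cohere's v3 models require an `input_type` field; the payload follows Cohere's embed schema, and the axflow call is shown in a comment following the module-path pattern used elsewhere in these notes (treat it as an assumption).

```ts
// Illustrative sketch: embedding text with a Cohere v3 model.
// v3 models require input_type (e.g. 'search_document' when indexing documents).
const request = {
  model: 'embed-english-v3.0',
  texts: ['Axflow supports Cohere v3 embedding models'],
  input_type: 'search_document',
};

// import { CohereEmbedding } from '@axflow/models/cohere/embedding';
// const response = await CohereEmbedding.run(request, { apiKey: process.env.COHERE_API_KEY });
console.log(request.model, request.texts.length);
```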
Add createMessage utility
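The release does not document `createMessage`'s signature, so the helper below is a hypothetical sketch of what such a utility typically does (wrap partial message fields with a generated id and timestamp); it is not the actual axflow implementation.

```ts
// Hypothetical sketch of a createMessage-style utility; NOT the axflow implementation.
type Message = {
  id: string;
  role: 'user' | 'assistant' | 'system';
  content: string;
  created: number;
};

function createMessageSketch(fields: Pick<Message, 'role' | 'content'>): Message {
  return {
    id: Math.random().toString(36).slice(2), // simple unique-ish id for the sketch
    created: Date.now(),                     // creation timestamp in milliseconds
    ...fields,
  };
}

const msg = createMessageSketch({ role: 'user', content: 'Hello!' });
console.log(msg.role, msg.content);
```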
@axflow/models@0.0.18
Support reloading system messages in useChat hook
@axflow/models@0.0.17
Support OpenAI functions in useChat hook
@axflow/models@0.0.16
Add reload functionality for useChat
@axflow/models@0.0.15
System message and callback for new messages
@axflow/models@0.0.14