Adds experimental Ollama functions wrapper #3251

Merged · 4 commits · Nov 14, 2023
1 change: 1 addition & 0 deletions docs/api_refs/typedoc.json
@@ -274,6 +274,7 @@
"../../langchain/src/experimental/multimodal_embeddings/googlevertexai.ts",
"../../langchain/src/experimental/chat_models/anthropic_functions.ts",
"../../langchain/src/experimental/chat_models/bittensor.ts",
"../../langchain/src/experimental/chat_models/ollama_functions.ts",
"../../langchain/src/experimental/llms/bittensor.ts",
"../../langchain/src/experimental/hubs/makersuite/googlemakersuitehub.ts",
"../../langchain/src/experimental/chains/violation_of_expectations/index.ts",
45 changes: 45 additions & 0 deletions docs/core_docs/docs/integrations/chat/ollama_functions.mdx
@@ -0,0 +1,45 @@
---
sidebar_label: Ollama Functions
---

# Ollama Functions

LangChain offers an experimental wrapper around open source models run locally via [Ollama](https://github.com/jmorganca/ollama)
that gives them the same API as OpenAI Functions.

Note that more powerful and capable models will perform better with complex schemas and/or multiple functions. The examples below
use [Mistral](https://ollama.ai/library/mistral).

## Setup

Follow [these instructions](https://github.com/jmorganca/ollama) to set up and run a local Ollama instance.

## Initialize model

You can initialize this wrapper the same way you'd initialize a standard `ChatOllama` instance:

```typescript
import { OllamaFunctions } from "langchain/experimental/chat_models/ollama_functions";

const model = new OllamaFunctions({
  temperature: 0.1,
  model: "mistral",
});
```
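
Since `OllamaFunctions` wraps `ChatOllama`, other `ChatOllama` constructor options should pass through as well. As a minimal sketch (assuming the standard `baseUrl` option, and a hypothetical server address), you can point the wrapper at a non-default Ollama instance:

```typescript
// Sketch: `baseUrl` is assumed to behave as it does for ChatOllama
// (the Ollama server defaults to http://localhost:11434).
const remoteModel = new OllamaFunctions({
  model: "mistral",
  baseUrl: "http://my-ollama-host:11434", // hypothetical address
});
```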

## Passing in functions

You can now pass in functions the same way as OpenAI:

import CodeBlock from "@theme/CodeBlock";
import OllamaFunctionsCalling from "@examples/models/chat/ollama_functions/function_calling.ts";

<CodeBlock language="typescript">{OllamaFunctionsCalling}</CodeBlock>

## Using for extraction

import OllamaFunctionsExtraction from "@examples/models/chat/ollama_functions/extraction.ts";

<CodeBlock language="typescript">{OllamaFunctionsExtraction}</CodeBlock>

You can see a LangSmith trace of what this looks like [here](https://smith.langchain.com/public/31457ea4-71ca-4e29-a1e0-aa80e6828883/r).
1 change: 1 addition & 0 deletions environment_tests/test-exports-bun/src/entrypoints.js
@@ -96,6 +96,7 @@ export * from "langchain/experimental/babyagi";
export * from "langchain/experimental/generative_agents";
export * from "langchain/experimental/plan_and_execute";
export * from "langchain/experimental/chat_models/bittensor";
export * from "langchain/experimental/chat_models/ollama_functions";
export * from "langchain/experimental/chains/violation_of_expectations";
export * from "langchain/evaluation";
export * from "langchain/runnables/remote";
1 change: 1 addition & 0 deletions environment_tests/test-exports-cf/src/entrypoints.js
@@ -96,6 +96,7 @@ export * from "langchain/experimental/babyagi";
export * from "langchain/experimental/generative_agents";
export * from "langchain/experimental/plan_and_execute";
export * from "langchain/experimental/chat_models/bittensor";
export * from "langchain/experimental/chat_models/ollama_functions";
export * from "langchain/experimental/chains/violation_of_expectations";
export * from "langchain/evaluation";
export * from "langchain/runnables/remote";
1 change: 1 addition & 0 deletions environment_tests/test-exports-cjs/src/entrypoints.js
@@ -96,6 +96,7 @@ const experimental_babyagi = require("langchain/experimental/babyagi");
const experimental_generative_agents = require("langchain/experimental/generative_agents");
const experimental_plan_and_execute = require("langchain/experimental/plan_and_execute");
const experimental_chat_models_bittensor = require("langchain/experimental/chat_models/bittensor");
const experimental_chat_models_ollama_functions = require("langchain/experimental/chat_models/ollama_functions");
const experimental_chains_violation_of_expectations = require("langchain/experimental/chains/violation_of_expectations");
const evaluation = require("langchain/evaluation");
const runnables_remote = require("langchain/runnables/remote");
1 change: 1 addition & 0 deletions environment_tests/test-exports-esbuild/src/entrypoints.js
@@ -96,6 +96,7 @@ import * as experimental_babyagi from "langchain/experimental/babyagi";
import * as experimental_generative_agents from "langchain/experimental/generative_agents";
import * as experimental_plan_and_execute from "langchain/experimental/plan_and_execute";
import * as experimental_chat_models_bittensor from "langchain/experimental/chat_models/bittensor";
import * as experimental_chat_models_ollama_functions from "langchain/experimental/chat_models/ollama_functions";
import * as experimental_chains_violation_of_expectations from "langchain/experimental/chains/violation_of_expectations";
import * as evaluation from "langchain/evaluation";
import * as runnables_remote from "langchain/runnables/remote";
1 change: 1 addition & 0 deletions environment_tests/test-exports-esm/src/entrypoints.js
@@ -96,6 +96,7 @@ import * as experimental_babyagi from "langchain/experimental/babyagi";
import * as experimental_generative_agents from "langchain/experimental/generative_agents";
import * as experimental_plan_and_execute from "langchain/experimental/plan_and_execute";
import * as experimental_chat_models_bittensor from "langchain/experimental/chat_models/bittensor";
import * as experimental_chat_models_ollama_functions from "langchain/experimental/chat_models/ollama_functions";
import * as experimental_chains_violation_of_expectations from "langchain/experimental/chains/violation_of_expectations";
import * as evaluation from "langchain/evaluation";
import * as runnables_remote from "langchain/runnables/remote";
1 change: 1 addition & 0 deletions environment_tests/test-exports-vercel/src/entrypoints.js
@@ -96,6 +96,7 @@ export * from "langchain/experimental/babyagi";
export * from "langchain/experimental/generative_agents";
export * from "langchain/experimental/plan_and_execute";
export * from "langchain/experimental/chat_models/bittensor";
export * from "langchain/experimental/chat_models/ollama_functions";
export * from "langchain/experimental/chains/violation_of_expectations";
export * from "langchain/evaluation";
export * from "langchain/runnables/remote";
1 change: 1 addition & 0 deletions environment_tests/test-exports-vite/src/entrypoints.js
@@ -96,6 +96,7 @@ export * from "langchain/experimental/babyagi";
export * from "langchain/experimental/generative_agents";
export * from "langchain/experimental/plan_and_execute";
export * from "langchain/experimental/chat_models/bittensor";
export * from "langchain/experimental/chat_models/ollama_functions";
export * from "langchain/experimental/chains/violation_of_expectations";
export * from "langchain/evaluation";
export * from "langchain/runnables/remote";
1 change: 1 addition & 0 deletions examples/src/models/chat/anthropic_functions/extraction.ts
@@ -42,6 +42,7 @@ const model = new AnthropicFunctions({
  },
});

// Use a JsonOutputFunctionsParser to get the parsed JSON response directly.
const chain = await prompt.pipe(model).pipe(new JsonOutputFunctionsParser());

const response = await chain.invoke({
63 changes: 63 additions & 0 deletions examples/src/models/chat/ollama_functions/extraction.ts
@@ -0,0 +1,63 @@
import { z } from "zod";
import { zodToJsonSchema } from "zod-to-json-schema";

import { OllamaFunctions } from "langchain/experimental/chat_models/ollama_functions";
import { PromptTemplate } from "langchain/prompts";
import { JsonOutputFunctionsParser } from "langchain/output_parsers";

const EXTRACTION_TEMPLATE = `Extract and save the relevant entities mentioned in the following passage together with their properties.

Passage:
{input}
`;

const prompt = PromptTemplate.fromTemplate(EXTRACTION_TEMPLATE);

// Use Zod for easier schema declaration
const schema = z.object({
  people: z.array(
    z.object({
      name: z.string().describe("The name of a person"),
      height: z.number().describe("The person's height"),
      hairColor: z.optional(z.string()).describe("The person's hair color"),
    })
  ),
});

const model = new OllamaFunctions({
  temperature: 0.1,
  model: "mistral",
}).bind({
  functions: [
    {
      name: "information_extraction",
      description: "Extracts the relevant information from the passage.",
      // zodToJsonSchema already produces a complete JSON schema object,
      // so pass it directly as the function's parameters.
      parameters: zodToJsonSchema(schema),
    },
  ],
  function_call: {
    name: "information_extraction",
  },
});

// Use a JsonOutputFunctionsParser to get the parsed JSON response directly.
const chain = await prompt.pipe(model).pipe(new JsonOutputFunctionsParser());

const response = await chain.invoke({
  input:
    "Alex is 5 feet tall. Claudia is 1 foot taller than Alex and jumps higher than him. Claudia has orange hair and Alex is blonde.",
});

console.log(response);

/*
  {
    people: [
      { name: 'Alex', height: 5, hairColor: 'blonde' },
      { name: 'Claudia', height: 6, hairColor: 'orange' }
    ]
  }
*/
49 changes: 49 additions & 0 deletions examples/src/models/chat/ollama_functions/function_calling.ts
@@ -0,0 +1,49 @@
import { OllamaFunctions } from "langchain/experimental/chat_models/ollama_functions";
import { HumanMessage } from "langchain/schema";

const model = new OllamaFunctions({
  temperature: 0.1,
  model: "mistral",
}).bind({
  functions: [
    {
      name: "get_current_weather",
      description: "Get the current weather in a given location",
      parameters: {
        type: "object",
        properties: {
          location: {
            type: "string",
            description: "The city and state, e.g. San Francisco, CA",
          },
          unit: { type: "string", enum: ["celsius", "fahrenheit"] },
        },
        required: ["location"],
      },
    },
  ],
  // You can set the `function_call` arg to force the model to use a function
  function_call: {
    name: "get_current_weather",
  },
});

const response = await model.invoke([
  new HumanMessage({
    content: "What's the weather in Boston?",
  }),
]);

console.log(response);

/*
  AIMessage {
    content: '',
    additional_kwargs: {
      function_call: {
        name: 'get_current_weather',
        arguments: '{"location":"Boston, MA","unit":"fahrenheit"}'
      }
    }
  }
*/
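
This example only surfaces the call; executing it is left to the caller. One minimal pattern (assumed here, not part of this PR) is to parse the arguments and dispatch to a local implementation:

```typescript
// Hypothetical dispatcher: route the model's requested call to local code.
const call = response.additional_kwargs.function_call;
if (call?.name === "get_current_weather") {
  const { location, unit } = JSON.parse(call.arguments);
  // A real implementation would query a weather API here.
  console.log(`Looking up weather for ${location} in ${unit ?? "fahrenheit"}`);
}
```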
3 changes: 3 additions & 0 deletions langchain/.gitignore
@@ -766,6 +766,9 @@ experimental/chat_models/anthropic_functions.d.ts
experimental/chat_models/bittensor.cjs
experimental/chat_models/bittensor.js
experimental/chat_models/bittensor.d.ts
experimental/chat_models/ollama_functions.cjs
experimental/chat_models/ollama_functions.js
experimental/chat_models/ollama_functions.d.ts
experimental/llms/bittensor.cjs
experimental/llms/bittensor.js
experimental/llms/bittensor.d.ts
1 change: 1 addition & 0 deletions langchain/experimental/tools/pyinterpreter.cjs
@@ -0,0 +1 @@
module.exports = require('../../dist/experimental/tools/pyinterpreter.cjs');
1 change: 1 addition & 0 deletions langchain/experimental/tools/pyinterpreter.d.ts
@@ -0,0 +1 @@
export * from '../../dist/experimental/tools/pyinterpreter.js'
1 change: 1 addition & 0 deletions langchain/experimental/tools/pyinterpreter.js
@@ -0,0 +1 @@
export * from '../../dist/experimental/tools/pyinterpreter.js'
8 changes: 8 additions & 0 deletions langchain/package.json
@@ -778,6 +778,9 @@
"experimental/chat_models/bittensor.cjs",
"experimental/chat_models/bittensor.js",
"experimental/chat_models/bittensor.d.ts",
"experimental/chat_models/ollama_functions.cjs",
"experimental/chat_models/ollama_functions.js",
"experimental/chat_models/ollama_functions.d.ts",
"experimental/llms/bittensor.cjs",
"experimental/llms/bittensor.js",
"experimental/llms/bittensor.d.ts",
@@ -2675,6 +2678,11 @@
"import": "./experimental/chat_models/bittensor.js",
"require": "./experimental/chat_models/bittensor.cjs"
},
"./experimental/chat_models/ollama_functions": {
"types": "./experimental/chat_models/ollama_functions.d.ts",
"import": "./experimental/chat_models/ollama_functions.js",
"require": "./experimental/chat_models/ollama_functions.cjs"
},
"./experimental/llms/bittensor": {
"types": "./experimental/llms/bittensor.d.ts",
"import": "./experimental/llms/bittensor.js",
2 changes: 2 additions & 0 deletions langchain/scripts/create-entrypoints.js
@@ -302,6 +302,8 @@ const entrypoints = {
"experimental/chat_models/anthropic_functions":
"experimental/chat_models/anthropic_functions",
"experimental/chat_models/bittensor": "experimental/chat_models/bittensor",
"experimental/chat_models/ollama_functions":
"experimental/chat_models/ollama_functions",
"experimental/llms/bittensor": "experimental/llms/bittensor",
"experimental/hubs/makersuite/googlemakersuitehub":
"experimental/hubs/makersuite/googlemakersuitehub",