Added documentation for few shot prompting #3122
Merged
Commits (10):
- 27dd9cf Added documentation for few shot prompting (bracesproul)
- cb8391d added non chat model docs (bracesproul)
- 4076f9e chore: lint files (bracesproul)
- eedb9fd docs nit (bracesproul)
- 018c516 add distinction between chat and non chat (bracesproul)
- fca3a9f Merge branch 'main' into brace/few-shot-prompt-docs (bracesproul)
- 5269349 use fromTemplate (bracesproul)
- 1abd42f updated example (bracesproul)
- c38599c Merge branch 'main' into brace/few-shot-prompt-docs (bracesproul)
- 42c08c8 fix docs location (bracesproul)
`docs/docs/modules/model_io/prompts/prompt_templates/few_shot.mdx` (280 additions, 0 deletions)

# Few Shot Prompt Templates

Few shot prompting is a technique that provides the Large Language Model (LLM) with a list of examples, then asks the LLM to generate text following the lead of the examples provided.

An example of this: say you want your LLM to respond in a specific format. You can few shot prompt the LLM with a list of question-answer pairs so it knows what format to respond in.

```txt
Respond to the user's question in the following format:

Question: What is your name?
Answer: My name is John.

Question: What is your age?
Answer: I am 25 years old.

Question: What is your favorite color?
Answer:
```

Here we left the last `Answer:` undefined so the LLM can fill it in. The LLM will then generate something like:

```txt
Answer: I don't have a favorite color; I don't have preferences.
```
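Before reaching for the template classes covered below, it can help to see that a few shot prompt is fundamentally just string assembly. Here is a minimal hand-rolled sketch in plain TypeScript (illustrative only; the function and type names are made up for this example, not a LangChain API):

```typescript
// Each example is a question-answer pair demonstrating the desired format.
type Example = { question: string; answer: string };

// Build a few shot prompt: instructions, the formatted examples, then the
// final question with its answer left blank for the LLM to complete.
function buildFewShotPrompt(
  instructions: string,
  examples: Example[],
  finalQuestion: string
): string {
  const formatted = examples
    .map((e) => `Question: ${e.question}\nAnswer: ${e.answer}`)
    .join("\n\n");
  return `${instructions}\n\n${formatted}\n\nQuestion: ${finalQuestion}\nAnswer:`;
}

const prompt = buildFewShotPrompt(
  "Respond to the user's question in the following format:",
  [
    { question: "What is your name?", answer: "My name is John." },
    { question: "What is your age?", answer: "I am 25 years old." },
  ],
  "What is your favorite color?"
);
console.log(prompt);
```

The template classes below do the same assembly, but with variable substitution, partials, and example selection layered on top.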

### Use Case

In the following example, we few shot the LLM to rephrase specific questions into more general queries.

We provide two example pairs, each containing a specific question and its rephrased, more general counterpart. The `FewShotChatMessagePromptTemplate` stores these examples, and when `.format` is called we'll see them formatted into messages we can pass to the LLM.

```typescript
import {
  ChatPromptTemplate,
  FewShotChatMessagePromptTemplate,
} from "langchain/prompts";
```

```typescript
const examples = [
  {
    input: "Could the members of The Police perform lawful arrests?",
    output: "what can the members of The Police do?",
  },
  {
    input: "Jan Sindel's was born in what country?",
    output: "what is Jan Sindel's personal history?",
  },
];
const examplePrompt = ChatPromptTemplate.fromTemplate(`Human: {input}
AI: {output}`);
const fewShotPrompt = new FewShotChatMessagePromptTemplate({
  examplePrompt,
  examples,
  inputVariables: [], // no input variables
});
```

```typescript
const formattedPrompt = await fewShotPrompt.format({});
console.log(formattedPrompt);
```

```typescript
[
  HumanMessage {
    lc_namespace: [ 'langchain', 'schema' ],
    content: 'Human: Could the members of The Police perform lawful arrests?\n' +
      'AI: what can the members of The Police do?',
    additional_kwargs: {}
  },
  HumanMessage {
    lc_namespace: [ 'langchain', 'schema' ],
    content: "Human: Jan Sindel's was born in what country?\n" +
      "AI: what is Jan Sindel's personal history?",
    additional_kwargs: {}
  }
]
```

Then, if we use this with another question, the LLM will rephrase the question how we want.

```typescript
import { ChatOpenAI } from "langchain/chat_models/openai";
```

```typescript
const model = new ChatOpenAI({});

const prompt =
  ChatPromptTemplate.fromTemplate(`Rephrase the user's query to be more general, using the following examples:

{few_shot_examples}

User query: {input}`);

const response = await prompt.pipe(model).invoke({
  input: "What's France's main city?",
  few_shot_examples: formattedPrompt,
});

console.log(response);
```

```typescript
AIMessage {
  lc_namespace: [ 'langchain', 'schema' ],
  content: 'What is the capital of France?',
  additional_kwargs: { function_call: undefined }
}
```

### Few Shotting With Functions

You can also partial a prompt template with a function. The use case for this is when you have a variable you always want to fetch in a common way. A prime example is the date or time: imagine a prompt that should always contain the current date. You can't hard code it into the template, and passing it along with the other input variables is tedious. In that case, it's very handy to partial the prompt with a function that always returns the current date.

```typescript
const getCurrentDate = () => {
  return new Date().toISOString();
};

const prompt = new PromptTemplate({
  template: "Tell me a {adjective} joke about the day {date}",
  inputVariables: ["adjective", "date"],
});

const partialPrompt = await prompt.partial({
  date: getCurrentDate,
});

const formattedPrompt = await partialPrompt.format({
  adjective: "funny",
});

console.log(formattedPrompt);

// Tell me a funny joke about the day 2023-07-13T00:54:59.287Z
```
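
The mechanics behind partialing with a function can be sketched without the library: a partial stores some values (or zero-argument functions) ahead of time, and function-valued partials are called at format time, so something like a date is fresh on every format. A dependency-free sketch of the idea (not LangChain's actual implementation; `renderTemplate` and `partial` are hypothetical helpers):

```typescript
type PartialValue = string | (() => string);

// Minimal template renderer: replaces {name} placeholders with values.
function renderTemplate(
  template: string,
  values: Record<string, string>
): string {
  return template.replace(/\{(\w+)\}/g, (_, name) => values[name] ?? `{${name}}`);
}

// "Partialing" stores some variables now and resolves the rest later.
// Function-valued partials are invoked on every format call, so a date
// partial always yields the current date.
function partial(
  template: string,
  partials: Record<string, PartialValue>
): (rest: Record<string, string>) => string {
  return (rest) => {
    const resolved: Record<string, string> = { ...rest };
    for (const [name, value] of Object.entries(partials)) {
      resolved[name] = typeof value === "function" ? value() : value;
    }
    return renderTemplate(template, resolved);
  };
}

const formatJoke = partial("Tell me a {adjective} joke about the day {date}", {
  date: () => new Date().toISOString(),
});
console.log(formatJoke({ adjective: "funny" }));
```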

### Few Shot vs Chat Few Shot

The chat and non chat few shot prompt templates behave in a similar way. The example below demonstrates both, along with the differences in their outputs.

```typescript
import {
  ChatPromptTemplate,
  PromptTemplate,
  FewShotPromptTemplate,
  FewShotChatMessagePromptTemplate,
} from "langchain/prompts";
```

```typescript
const examples = [
  {
    input: "Could the members of The Police perform lawful arrests?",
    output: "what can the members of The Police do?",
  },
  {
    input: "Jan Sindel's was born in what country?",
    output: "what is Jan Sindel's personal history?",
  },
];
const prompt = `Human: {input}
AI: {output}`;
const examplePromptTemplate = PromptTemplate.fromTemplate(prompt);
const exampleChatPromptTemplate = ChatPromptTemplate.fromTemplate(prompt);
const chatFewShotPrompt = new FewShotChatMessagePromptTemplate({
  examplePrompt: exampleChatPromptTemplate,
  examples,
  inputVariables: [], // no input variables
});
const fewShotPrompt = new FewShotPromptTemplate({
  examplePrompt: examplePromptTemplate,
  examples,
  inputVariables: [], // no input variables
});
```

```typescript
console.log("Chat Few Shot: ", await chatFewShotPrompt.formatMessages({}));
/**
Chat Few Shot: [
  HumanMessage {
    lc_namespace: [ 'langchain', 'schema' ],
    content: 'Human: Could the members of The Police perform lawful arrests?\n' +
      'AI: what can the members of The Police do?',
    additional_kwargs: {}
  },
  HumanMessage {
    lc_namespace: [ 'langchain', 'schema' ],
    content: "Human: Jan Sindel's was born in what country?\n" +
      "AI: what is Jan Sindel's personal history?",
    additional_kwargs: {}
  }
]
*/
```

```typescript
console.log("Few Shot: ", await fewShotPrompt.formatPromptValue({}));
/**
Few Shot:

Human: Could the members of The Police perform lawful arrests?
AI: what can the members of The Police do?

Human: Jan Sindel's was born in what country?
AI: what is Jan Sindel's personal history?
*/
```

Here we can see the main distinction between `FewShotChatMessagePromptTemplate` and `FewShotPromptTemplate`: their input and output types.

`FewShotChatMessagePromptTemplate` takes a `ChatPromptTemplate` for its examples, and its output is a list of `BaseMessage` instances.

`FewShotPromptTemplate`, on the other hand, takes a `PromptTemplate` for its examples, and its output is a string.

## With Non Chat Models

LangChain also provides a class for few shot prompt formatting with non chat models: `FewShotPromptTemplate`. The API is largely the same, but the output is formatted differently (a string rather than chat messages).

### Partials With Functions

```typescript
import { PromptTemplate, FewShotPromptTemplate } from "langchain/prompts";
```

```typescript
const examplePrompt = PromptTemplate.fromTemplate("{foo}{bar}");
const prompt = new FewShotPromptTemplate({
  prefix: "{foo}{bar}",
  examplePrompt,
  inputVariables: ["foo", "bar"],
});
const partialPrompt = await prompt.partial({
  foo: () => Promise.resolve("boo"),
});
const formatted = await partialPrompt.format({ bar: "baz" });
console.log(formatted);
```

```txt
boobaz\n
```

### With Functions and Example Selector

```typescript
import {
  PromptTemplate,
  FewShotPromptTemplate,
  LengthBasedExampleSelector,
} from "langchain/prompts";
```

```typescript
const examplePrompt = PromptTemplate.fromTemplate("An example about {x}");
const exampleSelector = await LengthBasedExampleSelector.fromExamples(
  [{ x: "foo" }, { x: "bar" }],
  { examplePrompt, maxLength: 200 }
);
const prompt = new FewShotPromptTemplate({
  prefix: "{foo}{bar}",
  exampleSelector,
  examplePrompt,
  inputVariables: ["foo", "bar"],
});
const partialPrompt = await prompt.partial({
  foo: () => Promise.resolve("boo"),
});
const formatted = await partialPrompt.format({ bar: "baz" });
console.log(formatted);
```

```txt
boobaz
An example about foo
An example about bar
```
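
The idea behind a length-based selector can be sketched in plain TypeScript: greedily keep examples, in order, while the cumulative length of their formatted text stays within a budget. This is an illustration of the selection strategy under the assumption of a simple word-count length measure, not the library's exact algorithm:

```typescript
type Ex = Record<string, string>;

// Approximate the "length" of a formatted example by its word count.
function wordCount(text: string): number {
  return text.split(/\s+/).filter(Boolean).length;
}

// Greedily select examples in order while the cumulative word count
// stays within maxLength; stop at the first example that would exceed it.
function selectByLength(
  examples: Ex[],
  formatExample: (e: Ex) => string,
  maxLength: number
): Ex[] {
  const selected: Ex[] = [];
  let used = 0;
  for (const example of examples) {
    const len = wordCount(formatExample(example));
    if (used + len > maxLength) break;
    selected.push(example);
    used += len;
  }
  return selected;
}

const candidates = [{ x: "foo" }, { x: "bar" }, { x: "baz" }];
const formatOne = (e: Ex) => `An example about ${e.x}`;
// Each formatted example is 4 words, so a budget of 8 fits exactly two.
console.log(selectByLength(candidates, formatOne, 8));
```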
Review comment: I thought the chat version of this returned human + AI message pairs?

Reply: Only if you input human & AI. If you just give it human, it'll only return human messages, and vice versa.