@@ -52,8 +52,6 @@ spec:
value: <REPLACE_WITH_YOUR_KEY>
- name: model
value: gpt-4-turbo
- name: cacheTTL
value: 10m
```

## Connect the conversation client
@@ -114,12 +112,12 @@ func main() {
}

input := dapr.ConversationInput{
Message: "Please write a witty haiku about the Dapr distributed programming framework at dapr.io",
// Role: nil, // Optional
// ScrubPII: nil, // Optional
Content: "Please write a witty haiku about the Dapr distributed programming framework at dapr.io",
// Role: "", // Optional
// ScrubPII: false, // Optional
}

fmt.Printf("conversation input: %s\n", input.Message)
fmt.Printf("conversation input: %s\n", input.Content)

var conversationComponent = "echo"

@@ -163,7 +161,7 @@ async fn main() -> Result<(), Box<dyn std::error::Error>> {
let request =
ConversationRequestBuilder::new(conversation_component, vec![input.clone()]).build();

println!("conversation input: {:?}", input.message);
println!("conversation input: {:?}", input.content);

let response = client.converse_alpha1(request).await?;

@@ -224,6 +222,16 @@ dapr run --app-id=conversation --resources-path ./config --dapr-grpc-port 3500 -

{{< /tabs >}}

## Advanced features

The conversation API supports the following features:

1. **Prompt caching:** Lets developers cache prompts in Dapr for much faster response times, reducing egress costs and the cost of inserting the prompt into the LLM provider's cache.

1. **PII scrubbing:** Obfuscates sensitive data going into and coming out of the LLM.

To learn how to enable these features, see the [conversation API reference guide]({{< ref conversation_api.md >}}).

## Related links

Try out the conversation API using the full examples provided in the supported SDK repos.
48 changes: 21 additions & 27 deletions daprdocs/content/en/reference/api/conversation_api.md
@@ -30,40 +30,34 @@ POST http://localhost:<daprPort>/v1.0-alpha1/conversation/<llm-name>/converse

| Field | Description |
| --------- | ----------- |
| `conversationContext` | The ID of an existing chat room (like in ChatGPT). |
| `inputs` | Inputs for the conversation. Multiple inputs at one time are supported. |
| `metadata` | [Metadata](#metadata) passed to conversation components. |
| `inputs` | Inputs for the conversation. Multiple inputs at one time are supported. Required. |
| `cacheTTL` | A time-to-live value after which the prompt cache expires. Uses Golang duration format. Optional. |
| `scrubPII` | A boolean value that enables obfuscation of sensitive information returned from the LLM. Optional. |
| `temperature` | A float value that controls the temperature of the model, trading off consistency (lower values) against creativity (higher values). Optional. |
| `metadata` | [Metadata](#metadata) passed to conversation components. Optional. |

#### Metadata
#### Input body

Metadata can be sent in the request’s URL. It must be prefixed with `metadata.`, as shown in the table below.

| Parameter | Description |
| Field | Description |
| --------- | ----------- |
| `metadata.key` | The API key for the component. `key` is not applicable to the [AWS Bedrock component]({{< ref "aws-bedrock.md#authenticating-aws" >}}). |
| `metadata.model` | The Large Language Model you're using. Value depends on which conversation component you're using. `model` is not applicable to the [DeepSeek component]({{< ref deepseek.md >}}). |
| `metadata.cacheTTL` | A time-to-live value for a prompt cache to expire. Uses Golang duration format. |

For example, to call for [Anthropic]({{< ref anthropic.md >}}):

```bash
curl POST http://localhost:3500/v1.0-alpha1/conversation/anthropic/converse?metadata.key=key1&metadata.model=claude-3-5-sonnet-20240620&metadata.cacheTTL=10m
```

{{% alert title="Note" color="primary" %}}
The metadata parameters available depend on the conversation component you use. [See all the supported components for the conversation API.]({{< ref supported-conversation >}})
{{% /alert %}}
| `content` | The message content to send to the LLM. Required. |
| `role` | The role for the LLM to assume. Possible values: `user`, `tool`, `assistant`. Optional. |
| `scrubPII` | A boolean value that enables obfuscation of sensitive information present in the content field. Optional. |

### Request content
### Request content example

```json
REQUEST = {
"inputs": ["what is Dapr", "Why use Dapr"],
"metadata": {
"model": "model-type-based-on-component-used",
"key": "authKey",
"cacheTTL": "10m",
}
  "inputs": [
    {
      "content": "What is Dapr?",
      "role": "user", // Optional
      "scrubPII": true // Optional. Obfuscates sensitive information found in the content field
    }
  ],
  "cacheTTL": "10m", // Optional
  "scrubPII": true, // Optional. Obfuscates sensitive information returned by the LLM
  "temperature": 0.5 // Optional. Optimizes for consistency (0) or creativity (1)
}
```
