Merge pull request #24 from microsoft/cassie-docs
update nav pages to include prompty spec, what is prompty and tutorials
sethjuarez committed Jun 20, 2024
2 parents 0e607a4 + 7e3c247 commit ac781c4
Showing 9 changed files with 273 additions and 2 deletions.
6 changes: 6 additions & 0 deletions package-lock.json


2 changes: 1 addition & 1 deletion web/docs/contributing/page.mdx
Original file line number Diff line number Diff line change
@@ -1,4 +1,4 @@
---
title: Contributing
index: 1
index: 4
---
2 changes: 1 addition & 1 deletion web/docs/getting-started/page.mdx
Original file line number Diff line number Diff line change
Expand Up @@ -8,7 +8,7 @@ date: 2024-06-10
tags:
- getting-started
- documentation
index: 0
index: 1
---

It is easy to get started with Prompty!
Expand Down
226 changes: 226 additions & 0 deletions web/docs/prompty-file-spec/page.mdx
Original file line number Diff line number Diff line change
@@ -0,0 +1,226 @@
---
title: Prompty File Spec
authors:
- sethjuarez
- wayliums
- cassiebreviu
date: 2024-06-10
tags:
- prompty-file-spec
- documentation
index: 3
---
The Prompty YAML file spec can be found [here](https://github.com/microsoft/prompty/blob/main/Prompty.yaml). Below you can find a brief description of each section and the attributes within it.

### Prompty description attributes:
```yaml
name:
type: string
description: Name of the Prompty
description:
type: string
description: Description of the Prompty
version:
type: string
description: Version of the Prompty
authors:
type: array
description: Authors of the Prompty
items:
type: string
tags:
type: array
description: Tags of the Prompty
items:
type: string
```
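For instance, these attributes might appear at the top of a `.prompty` file's frontmatter — a hypothetical sketch, with illustrative name, description, and author values:

```yaml
---
name: BasicChat
description: A prompt that answers customer questions
version: 1.0.0
authors:
  - jsmith
tags:
  - chat
  - customer-support
---
```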
### Sample, inputs, outputs and template attributes:
```yaml
sample:
oneOf:
- type: object
description: The sample to be used in the Prompty test execution
additionalProperties: true
- type: string
description: The file to be loaded to be used in the Prompty test execution


inputs:
type: object
description: The inputs of the Prompty

outputs:
type: object
description: The outputs of the Prompty

template:
type: string
description: The template engine to be used can be specified here. This is optional.
enum: [jinja2]
default: jinja2

```
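Putting these together, a frontmatter fragment might look like the following — a hypothetical sketch, with illustrative input names and sample values:

```yaml
sample:
  firstName: Jane
  question: What is the quickest way to get started?

inputs:
  firstName:
    type: string
  question:
    type: string

template: jinja2
```

Alternatively, per the spec above, `sample` can be a string naming a file (for example `sample: sample.json`) that is loaded for test execution.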
### Model attributes

```yaml
model: string
enum:
- chat
- completion
description: The API to use for the Prompty -- this has implications on how the template is processed and how the model is called.
default: chat

configuration:
oneOf:
- $ref: "#/definitions/azureOpenaiModel"
- $ref: "#/definitions/openaiModel"
- $ref: "#/definitions/maasModel"

parameters:
$ref: "#/definitions/parameters"

response:
type: string
description: This determines whether the full (raw) response or just the first response in the choice array is returned.
default: first
enum:
- first
- full

```
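In a `.prompty` file these attributes are grouped under a single `model` key, with the chat/completion choice given by an `api` attribute — a hypothetical sketch assuming an Azure OpenAI configuration (the endpoint and deployment names are illustrative):

```yaml
model:
  api: chat
  configuration:
    type: azure_openai
    azure_endpoint: https://my-resource.openai.azure.com
    azure_deployment: gpt-35-turbo
  parameters:
    max_tokens: 1500
  response: first
```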
### Parameters for the model attribute:
```yaml
parameters:
type: object
description: Parameters to be sent to the model
additionalProperties: true
properties:
response_format:
type: object
description: >
An object specifying the format that the model must output. Compatible with
`gpt-4-1106-preview` and `gpt-3.5-turbo-1106`.
Setting to `{ "type": "json_object" }` enables JSON mode, which guarantees the
message the model generates is valid JSON.
seed:
type: integer
description: >
This feature is in Beta. If specified, our system will make a best effort to
sample deterministically, such that repeated requests with the same `seed` and
parameters should return the same result. Determinism is not guaranteed, and you
should refer to the `system_fingerprint` response parameter to monitor changes
in the backend.
max_tokens:
type: integer
description: The maximum number of [tokens](/tokenizer) that can be generated in the chat completion.

temperature:
type: number
description: What sampling temperature to use, 0 means deterministic.

tools_choice:
oneOf:
- type: string
- type: object

description: >
Controls which (if any) function is called by the model. `none` means the model
will not call a function and instead generates a message. `auto` means the model
can pick between generating a message or calling a function. Specifying a
particular function via
`{"type": "function", "function": {"name": "my_function"}}` forces the model to
call that function.
`none` is the default when no functions are present. `auto` is the default if
functions are present.
tools:
type: array
items:
type: object

frequency_penalty:
type: number
description: What sampling frequency penalty to use. 0 means no penalty.

presence_penalty:
type: number
description: What sampling presence penalty to use. 0 means no penalty.

stop:
type: array
items:
type: string
description: >
One or more sequences where the model should stop generating tokens. The model
will stop generating tokens if it generates one of the sequences. If the model
generates a sequence that is a prefix of one of the sequences, it will continue
generating tokens.
top_p:
type: number
description: >
What nucleus sampling probability to use. 1 means no nucleus sampling. 0 means
no tokens are generated.
```
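As a concrete example, a `parameters` block in a `.prompty` file might look like this (all values are hypothetical):

```yaml
parameters:
  max_tokens: 3000
  temperature: 0.2
  top_p: 1.0
  response_format:
    type: json_object
  stop:
    - "###"
```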
### Definition of OpenAI models

```yaml

openaiModel:
type: object
description: Model used to generate text
properties:
type:
type: string
description: Type of the model
const: openai
name:
type: string
description: Name of the model
organization:
type: string
description: Name of the organization
additionalProperties: false
```
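A matching `configuration` block in a `.prompty` file might look like the following (the model and organization names are illustrative):

```yaml
configuration:
  type: openai
  name: gpt-3.5-turbo
  organization: my-org
```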

### Definition of Azure OpenAI models

```yaml
azureOpenaiModel:
type: object
description: Model used to generate text
properties:
type:
type: string
description: Type of the model
const: azure_openai
api_version:
type: string
description: Version of the model
azure_deployment:
type: string
description: Deployment of the model
azure_endpoint:
type: string
description: Endpoint of the model
additionalProperties: false
```
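A matching `configuration` block might look like the following (the API version, deployment, and endpoint are illustrative):

```yaml
configuration:
  type: azure_openai
  api_version: 2023-12-01-preview
  azure_deployment: gpt-35-turbo
  azure_endpoint: https://my-resource.openai.azure.com
```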
### Definition of MaaS models

```yaml
maasModel:
type: object
description: Model used to generate text
properties:
type:
type: string
description: Type of the model
const: azure_serverless
azure_endpoint:
type: string
description: Endpoint of the model
additionalProperties: false
```
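A matching `configuration` block might look like the following (the endpoint is illustrative):

```yaml
configuration:
  type: azure_serverless
  azure_endpoint: https://my-model.my-region.inference.ai.azure.com
```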
Binary file added web/docs/prompty-file-spec/prompty32x32.png
14 changes: 14 additions & 0 deletions web/docs/tutorials/page.mdx
Original file line number Diff line number Diff line change
@@ -0,0 +1,14 @@
---
title: Tutorials
authors:
- sethjuarez
- wayliums
- cassiebreviu
date: 2024-06-10
tags:
- tutorials
- documentation
index: 2
---

TODO
Binary file added web/docs/tutorials/prompty32x32.png
25 changes: 25 additions & 0 deletions web/docs/what-is-prompty/page.mdx
Original file line number Diff line number Diff line change
@@ -0,0 +1,25 @@
---
title: What is Prompty?
authors:
- sethjuarez
- wayliums
- cassiebreviu
date: 2024-06-10
tags:
- what-is-prompty
- documentation
index: 0
---

Prompty is a file format with a supporting toolchain (a VS Code extension and runtimes) that simplifies and accelerates your LLM application development.

## The Prompty File Format
Prompty is a language-agnostic prompt asset for creating prompts and engineering the responses. Learn more about the format [here](../prompty-file-spec/page.mdx).
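For illustration, a minimal `.prompty` file pairs YAML frontmatter with a templated prompt body — a hypothetical sketch, with illustrative name, deployment, and sample values:

```yaml
---
name: BasicChat
description: Answers questions in a friendly tone
model:
  api: chat
  configuration:
    type: azure_openai
    azure_deployment: gpt-35-turbo
sample:
  question: What is Prompty?
---
system:
You are a helpful assistant.

user:
{{question}}
```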

## The Prompty VS Code Extension
Run Prompty files directly in VS Code. Download the [VS Code extension here](https://marketplace.visualstudio.com/items?itemName=ms-toolsai.prompty).

## The Prompty Runtimes
To execute a Prompty file asset in code, you can use one of the supporting runtimes, such as [Prompt flow (Python)](https://microsoft.github.io/promptflow/), [LangChain (Python)](https://python.langchain.com/v0.1/docs/get_started/introduction), or [Semantic Kernel (C#)](https://learn.microsoft.com/semantic-kernel/). You can right-click any Prompty file in VS Code to generate the basic code implementation for each runtime.


Binary file added web/docs/what-is-prompty/prompty32x32.png
