Merged
6 changes: 3 additions & 3 deletions docs/README.md
Original file line number Diff line number Diff line change
@@ -233,7 +233,7 @@ defs:
parser: yaml
text:
- "\n${ CODE.source_code }\n"
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
input:
- |
Here is some info about the location of the function in the repo.
@@ -298,7 +298,7 @@ defs:
read: ./ground_truth.txt
text:
- "\n${ CODE.source_code }\n"
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
def: EXPLANATION
input: |
Here is some info about the location of the function in the repo.
@@ -380,7 +380,7 @@ defs:
TRUTH:
read: ./ground_truth.txt
text:
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
def: EXPLANATION
contribute: []
input:
6 changes: 3 additions & 3 deletions docs/tutorial.md
@@ -36,7 +36,7 @@ Hello, world!
--8<-- "./examples/tutorial/calling_llm.pdl"
```

In this program ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/calling_llm.pdl)), the `text` starts with the word `"Hello\n"`, and we call a model (`replicate/ibm-granite/granite-3.0-8b-instruct`) with this as input prompt.
In this program ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/calling_llm.pdl)), the `text` starts with the word `"Hello\n"`, and we call a model (`replicate/ibm-granite/granite-3.1-8b-instruct`) with this as input prompt.
The model is passed a parameter `stop_sequences`.

A PDL program computes 2 data structures. The first is a JSON corresponding to the result of the overall program, obtained by aggregating the results of each block. This is what is printed by default when we run the interpreter. The second is a conversational background context, which is a list of role/content pairs, where we implicitly keep track of roles and content for the purpose of communicating with models that support chat APIs. The contents in the latter correspond to the results of each block. The conversational background context is what is used to make calls to LLMs via LiteLLM.
@@ -522,7 +522,7 @@ text:
contribute: [context]
- repeat:
text:
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
role: assistant
- read:
def: eval
@@ -553,7 +553,7 @@ The prompt that is actually submitted to the first model call (with query `What
To change the template that is applied, you can specify it as a parameter of the model call:

```yaml
model: replicate/ibm-granite/granite-3.0-8b-instruct
model: replicate/ibm-granite/granite-3.1-8b-instruct
parameters:
roles:
system:
2 changes: 1 addition & 1 deletion examples/callback/repair_prompt.pdl
@@ -9,7 +9,7 @@ lastOf:
Please repair the code!

- def: raw_output
model: replicate/ibm-granite/granite-3.0-8b-instruct
model: replicate/ibm-granite/granite-3.1-8b-instruct
parameters:
#stop_sequences: "\n\n"
temperature: 0
2 changes: 1 addition & 1 deletion examples/chatbot/chatbot.pdl
@@ -6,7 +6,7 @@ text:
- repeat:
text:
# Send context to Granite model hosted at replicate.com
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
# Allow the user to type 'yes', 'no', or anything else, storing
# the input into a variable named `eval`. The input is also implicitly
# added to the context.
6 changes: 3 additions & 3 deletions examples/cldk/cldk-assistant.pdl
@@ -36,7 +36,7 @@ text:
- if: ${ query != 'quit'}
then:
text:
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
def: PDL
input: |
Question: What are all the classes?
@@ -109,7 +109,7 @@ text:
method = PDL_SESSION.cldk_state.get_method("org.ibm.App", "Foo(string)")
result = method
- "\n\nGenerate a summary of method Foo\n\n"
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
```

Question: Generate a different comment for method Foo(string) in class org.ibm.App?
@@ -121,7 +121,7 @@ text:
method = PDL_SESSION.cldk_state.get_method("org.ibm.App", "Foo(string)")
result = method
- "\n\nGenerate a different comment for method Foo(string)\n\n"
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
```

If the query contains something about a field be sure to call a model.
2 changes: 1 addition & 1 deletion examples/code/code-eval.pdl
@@ -12,7 +12,7 @@ text:
- "\n${ CODE.source_code }\n"
# Use replicate.com to invoke a Granite model with a prompt. Output AND
# set the variable `EXPLANATION` to the output.
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
def: EXPLANATION
input: |
Here is some info about the location of the function in the repo.
2 changes: 1 addition & 1 deletion examples/code/code-json.pdl
@@ -6,7 +6,7 @@ defs:
TRUTH:
read: ./ground_truth.txt
text:
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
def: EXPLANATION
contribute: []
input:
2 changes: 1 addition & 1 deletion examples/code/code.pdl
@@ -8,7 +8,7 @@ text:
# Output the `source_code:` of the YAML to the console
- "\n${ CODE.source_code }\n"
# Use replicate.com to invoke a Granite model with a prompt
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
input: |
Here is some info about the location of the function in the repo.
repo:
2 changes: 1 addition & 1 deletion examples/demo/1-gen-data.pdl
@@ -6,7 +6,7 @@ defs:
parser: yaml
spec: { questions: [str], answers: [obj] }
text:
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
def: model_output
spec: {name: str, age: int}
input:
2 changes: 1 addition & 1 deletion examples/demo/2-teacher.pdl
@@ -1,6 +1,6 @@
defs:
teacher_sys_prompt: You are a very knowledgeable AI Assistant that will faithfully assist the user with their task.
teacher_model: replicate/ibm-granite/granite-3.0-8b-instruct
teacher_model: replicate/ibm-granite/granite-3.1-8b-instruct
teacher_template:
function:
sys_prompt: str
4 changes: 2 additions & 2 deletions examples/demo/3-weather.pdl
@@ -2,7 +2,7 @@ description: Using a weather API and LLM to make a small weather app
text:
- def: QUERY
text: "What is the weather in Madrid?\n"
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
input: |
Extract the location from the question.
Question: What is the weather in London?
@@ -25,7 +25,7 @@ text:
def: WEATHER
parser: json
contribute: []
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
input: |
Explain the weather from the following JSON:
${ WEATHER }
4 changes: 2 additions & 2 deletions examples/demo/4-translator.pdl
@@ -1,7 +1,7 @@
description: PDL program
text:
- "What is APR?\n"
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
- repeat:
text:
- read:
@@ -11,5 +11,5 @@ text:
then:
text:
- "\n\nTranslate the above to ${ language }\n"
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
until: ${ language == 'stop' }
4 changes: 2 additions & 2 deletions examples/fibonacci/fib.pdl
@@ -6,7 +6,7 @@ text:
# Use IBM Granite to author a program that computes the Nth Fibonacci number,
# storing the generated program into the variable `CODE`.
- def: CODE
model: replicate/ibm-granite/granite-3.0-8b-instruct
model: replicate/ibm-granite/granite-3.1-8b-instruct
input: "Write a Python function to compute the Fibonacci sequence. Do not include a doc string.\n\n"
parameters:
# Request no randomness when generating code
@@ -42,5 +42,5 @@ text:

# Invoke the LLM again to explain the PDL context
- "\n\nExplain what the above code does and what the result means\n\n"
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct

2 changes: 1 addition & 1 deletion examples/granite/multi_round_chat.pdl
@@ -18,7 +18,7 @@ text:

${ prompt }
# Use replicate.com to run the Granite model on the context, outputting the result
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
parameters:
# Use no LLM model creativity (0 is the default)
temperature: 0
2 changes: 1 addition & 1 deletion examples/granite/single_round_chat.pdl
@@ -2,7 +2,7 @@ description: Granite Single-Round Chat
text:
# (Note that 'PROMPT' will be undefined if you don't invoke pdl with `-f prompt.json`)
- "${ PROMPT }\n"
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
parameters:
# Use no LLM model creativity (0 is the default)
temperature: 0
2 changes: 1 addition & 1 deletion examples/hello/hello-code-pdl.pdl
@@ -4,5 +4,5 @@ code: |
description: Hello world
text:
- "Hello\n"
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct

2 changes: 1 addition & 1 deletion examples/hello/hello-def-use.pdl
@@ -2,7 +2,7 @@ description: Hello world with variable use
text:
- "Hello\n"
# Define GEN to be the result of a Granite LLM using replicate.com
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
parameters:
# "greedy" sampling tells the LLM to use the most likely token at each step
decoding_method: greedy
4 changes: 2 additions & 2 deletions examples/hello/hello-model-chaining.pdl
@@ -1,15 +1,15 @@
description: Hello world showing model chaining
text:
- "Hello\n"
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
parameters:
# "greedy" sampling tells the LLM to use the most likely token at each step
decoding_method: greedy
# Tell the LLM to stop after generating an exclamation point.
stop_sequences: '!'
def: GEN
- "\nDid you say ${ GEN }?\n"
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
parameters:
decoding_method: greedy
stop_sequences: '.'
2 changes: 1 addition & 1 deletion examples/hello/hello-model-input.pdl
@@ -1,6 +1,6 @@
description: Hello world with model input
text:
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
input: "Hello,"
parameters:
# Tell the LLM to stop after generating an exclamation point.
2 changes: 1 addition & 1 deletion examples/hello/hello-parser-json.pdl
@@ -5,7 +5,7 @@ defs:
parser: yaml
spec: { questions: [str], answers: [obj] }
text:
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
def: model_output
spec: {name: str, age: int}
input:
2 changes: 1 addition & 1 deletion examples/hello/hello-parser-regex.pdl
@@ -1,6 +1,6 @@
description: Hello world with parser using regex
text:
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
input: "Hello,"
parameters:
# Tell the LLM to stop after generating an exclamation point.
2 changes: 1 addition & 1 deletion examples/hello/hello-roles-array.pdl
@@ -7,6 +7,6 @@ text:
- role: user
content: Write a Python function that implements merge sort.
contribute: []
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
input: ${ prompt }

2 changes: 1 addition & 1 deletion examples/hello/hello-type.pdl
@@ -11,7 +11,7 @@ text:
return:
lastOf:
- "\nTranslate the sentence '${ sentence }' to ${ language }.\n"
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
parameters:
stop_sequences: "\n"
- call: ${ translate }
2 changes: 1 addition & 1 deletion examples/hello/hello.pdl
@@ -1,4 +1,4 @@
description: Hello world
text:
- "Hello\n"
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
12 changes: 6 additions & 6 deletions examples/notebooks/demo.ipynb
@@ -162,7 +162,7 @@
"%%pdl --reset-context\n",
"text: \n",
"- \"What is the meaning of life?\\n\"\n",
"- model: \"replicate/ibm-granite/granite-3.0-8b-instruct\""
"- model: \"replicate/ibm-granite/granite-3.1-8b-instruct\""
]
},
{
@@ -213,9 +213,9 @@
"%%pdl\n",
"text:\n",
"- \"\\nSay it like a poem\\n\"\n",
"- model: \"replicate/ibm-granite/granite-3.0-8b-instruct\"\n",
"- model: \"replicate/ibm-granite/granite-3.1-8b-instruct\"\n",
"- \"\\n\\nTranslate it to French\\n\"\n",
"- model: \"replicate/ibm-granite/granite-3.0-8b-instruct\""
"- model: \"replicate/ibm-granite/granite-3.1-8b-instruct\""
]
},
{
@@ -305,7 +305,7 @@
" read: ./ground_truth.txt\n",
"text:\n",
"- \"\\n${ code.source_code }\\n\"\n",
"- model: \"replicate/ibm-granite/granite-3.0-8b-instruct\"\n",
"- model: \"replicate/ibm-granite/granite-3.1-8b-instruct\"\n",
" def: explanation\n",
" input: |\n",
" Here is some info about the location of the function in the repo.\n",
@@ -393,12 +393,12 @@
"- repeat:\n",
" text:\n",
" - def: thought\n",
" model: replicate/ibm-granite/granite-3.0-8b-instruct\n",
" model: replicate/ibm-granite/granite-3.1-8b-instruct\n",
" parameters:\n",
" stop_sequences: \"Act:\"\n",
" temperature: 0\n",
" - def: rawAction\n",
" model: replicate/ibm-granite/granite-3.0-8b-instruct\n",
" model: replicate/ibm-granite/granite-3.1-8b-instruct\n",
" parameters:\n",
" stop_sequences: \"\\n\"\n",
" temperature: 0\n",
4 changes: 2 additions & 2 deletions examples/notebooks/notebook.ipynb
@@ -30,7 +30,7 @@
"description: Model call\n",
"text: \n",
"- \"Hello\\n\"\n",
"- model: \"replicate/ibm-granite/granite-3.0-8b-instruct\"\n",
"- model: \"replicate/ibm-granite/granite-3.1-8b-instruct\"\n",
" parameters:\n",
" stop_sequences: \"!\"\n",
" "
@@ -89,7 +89,7 @@
" read: ./ground_truth.txt\n",
"text:\n",
"- \"\\n${ CODE.source_code }\\n\"\n",
"- model: \"replicate/ibm-granite/granite-3.0-8b-instruct\"\n",
"- model: \"replicate/ibm-granite/granite-3.1-8b-instruct\"\n",
" def: EXPLANATION\n",
" input: |\n",
" Here is some info about the location of the function in the repo.\n",
2 changes: 1 addition & 1 deletion examples/notebooks/notebook_debug.ipynb
@@ -164,7 +164,7 @@
"description: Model call\n",
"text: \n",
"- Hello,\n",
"- model: \"replicate/ibm-granite/granite-3.0-8b-instruct\"\n",
"- model: \"replicate/ibm-granite/granite-3.1-8b-instruct\"\n",
" parameters:\n",
" stop_sequences: \"!\""
]
2 changes: 1 addition & 1 deletion examples/rag/rag.pdl
@@ -42,4 +42,4 @@ text:

Q: ${ TEST_PROMPT }
A:
- model: replicate/ibm-granite/granite-3.0-8b-instruct
- model: replicate/ibm-granite/granite-3.1-8b-instruct
4 changes: 2 additions & 2 deletions examples/react/demo.pdl
@@ -52,12 +52,12 @@ text:
- repeat:
text:
- def: thought
model: replicate/ibm-granite/granite-3.0-8b-instruct
model: replicate/ibm-granite/granite-3.1-8b-instruct
parameters:
stop_sequences: "Act:"
- "Act:\n"
- def: action
model: replicate/ibm-granite/granite-3.0-8b-instruct
model: replicate/ibm-granite/granite-3.1-8b-instruct
parameters:
stop_sequences: "\n"
parser: json
2 changes: 1 addition & 1 deletion examples/react/react_call.pdl
@@ -4,6 +4,6 @@ text:
- call: ${ react }
args:
question: How many years ago was the discoverer of the Hudson River born? Keep in mind we are in 2024.
model: replicate/ibm-granite/granite-3.0-8b-instruct
model: replicate/ibm-granite/granite-3.1-8b-instruct

