It is not clear whether the teleprompter simply retrieves good examples from your training dataset, or whether it can also write a prompt from scratch, and if so, how.
gabgen changed the title from "Training set mandatory for compiling?" to "Does teleprompter need training dataset to optimize?" on Nov 9, 2023
gabgen changed the title from "Does teleprompter need training dataset to optimize?" to "Is this an 'Example Retriever' optimisation or can it also produce a prompt from scratch like 'LLM as Optimizers'?" on Nov 9, 2023
It's an example-generation optimization applied to a complete pipeline of steps. This is a lot richer than "LLM as Optimizers", which only optimizes a few words at the start of the prompt.
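For reference, here is a minimal sketch of what that compilation step looks like with the BootstrapFewShot teleprompter. The model name, trainset contents, and metric below are illustrative assumptions for this thread, not part of any particular recipe:

```python
import dspy
from dspy.teleprompt import BootstrapFewShot

# Illustrative LM setup; the model name is an assumption.
lm = dspy.OpenAI(model="gpt-3.5-turbo")
dspy.settings.configure(lm=lm)

# A small pipeline: one chain-of-thought step mapping a question to an answer.
class QA(dspy.Module):
    def __init__(self):
        super().__init__()
        self.generate = dspy.ChainOfThought("question -> answer")

    def forward(self, question):
        return self.generate(question=question)

# A handful of labeled examples. The teleprompter bootstraps demonstrations
# (including intermediate reasoning traces) by running the pipeline on these;
# it does not just retrieve them verbatim.
trainset = [
    dspy.Example(question="What is the capital of France?", answer="Paris").with_inputs("question"),
    dspy.Example(question="Who wrote Hamlet?", answer="William Shakespeare").with_inputs("question"),
]

# Hypothetical metric used to keep only bootstrapped traces that score well.
def exact_match(example, pred, trace=None):
    return example.answer.lower() in pred.answer.lower()

teleprompter = BootstrapFewShot(metric=exact_match)
compiled_qa = teleprompter.compile(QA(), trainset=trainset)
```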
@okhat thanks for the feedback. "LLM as Optimizers" can actually write a prompt from scratch (the number of words is tunable). This is particularly useful when you have very few examples to provide, so the problem is not retrieving a good example but writing a good prompt. In that case, would DSPy offer a way to generate a good prompt without relying on examples placed in the prompt? I saw you have a "finetune" teleprompter; is that something similar to what I am looking for?