
[RFE] Additional messages on prompt #74

@eloycoto

Type: feature-request

Description:

In complex scenarios, we need to teach the LLM how to generate the response.
This technique is few-shot prompting, where a few example messages are included
in the prompt before the human question.

Ollama implemented this using a messages array, which the end user defines:

https://github.com/ollama/ollama/blob/dc6fe820512d1046f3a342e384baa64b8ce1758c/docs/api.md?plain=1#L451-L457
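For reference, a few-shot style request to Ollama's /api/chat endpoint carries the conversation as a messages array, roughly like this (abridged from the linked docs; exact fields may differ):

```json
{
  "model": "llama3",
  "messages": [
    { "role": "user", "content": "why is the sky blue?" },
    { "role": "assistant", "content": "due to rayleigh scattering." },
    { "role": "user", "content": "how is that different than mie scattering?" }
  ]
}
```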

I think it would be good to support something similar here, which could be appended to:

service/docs/openapi.json

Lines 509 to 595 in 5360e36

"LLMRequest": {
"properties": {
"query": {
"type": "string",
"title": "Query"
},
"conversation_id": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"title": "Conversation Id"
},
"provider": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"title": "Provider"
},
"model": {
"anyOf": [
{
"type": "string"
},
{
"type": "null"
}
],
"title": "Model"
},
"attachments": {
"anyOf": [
{
"items": {
"$ref": "#/components/schemas/Attachment"
},
"type": "array"
},
{
"type": "null"
}
],
"title": "Attachments"
}
},
"additionalProperties": false,
"type": "object",
"required": [
"query"
],
"title": "LLMRequest",
"description": "Model representing a request for the LLM (Language Model) send into OLS service.\n\nAttributes:\n query: The query string.\n conversation_id: The optional conversation ID (UUID).\n provider: The optional provider.\n model: The optional model.\n attachments: The optional attachments.\n\nExample:\n ```python\n llm_request = LLMRequest(query=\"Tell me about Kubernetes\")\n ```",
"examples": [
{
"attachments": [
{
"attachment_type": "log",
"content": "this is attachment",
"content_type": "text/plain"
},
{
"attachment_type": "configuration",
"content": "kind: Pod\n metadata:\n name: private-reg",
"content_type": "application/yaml"
},
{
"attachment_type": "configuration",
"content": "foo: bar",
"content_type": "application/yaml"
}
],
"conversation_id": "123e4567-e89b-12d3-a456-426614174000",
"model": "gpt-3.5-turbo",
"provider": "openai",
"query": "write a deployment yaml for the mongodb image"
}
]
},
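As a rough sketch only (the property name "messages" and the message shape are my assumptions, not a settled design), the LLMRequest schema above could gain an optional array property alongside the existing ones:

```json
"messages": {
  "anyOf": [
    {
      "items": {
        "type": "object",
        "properties": {
          "role": { "type": "string", "title": "Role" },
          "content": { "type": "string", "title": "Content" }
        },
        "required": ["role", "content"]
      },
      "type": "array"
    },
    { "type": "null" }
  ],
  "title": "Messages"
}
```

This follows the same anyOf-with-null pattern the schema already uses for its other optional fields, so omitting it keeps requests valid.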

Another example can be found in the PDL (Prompt Declaration Language) project:
https://github.com/IBM/prompt-declaration-language/blob/572373a09e2d105cf6712859d4be5fb371ba1051/examples/tutorial/calling_llm_with_input_messages.pdl#L5-L10

As far as I know, this:

  • Will not break backward compatibility, since it is a new parameter.
  • Might override the system prompt given in olsconfig.yaml.
  • Can break the history message placeholder, depending on where it is located.
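For illustration, a request using such a hypothetical messages parameter might look like this (the field name and message shape are assumptions, not an implemented API):

```json
{
  "query": "write a deployment yaml for the mongodb image",
  "messages": [
    { "role": "user", "content": "write a deployment yaml for nginx" },
    { "role": "assistant", "content": "apiVersion: apps/v1\nkind: Deployment\nmetadata:\n  name: nginx" }
  ]
}
```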

Steps needed

  • Change the OpenAPI definition
  • Change the history placeholder and validate its effectiveness
  • Decide what happens with the system prompt: should it be overridden, or should that be an option?

Questions:

  • What should I do to get this merged?
  • Is this a valid approach?

Metadata

Assignees: No one assigned

Labels: enhancement (New feature or request)