Library to manage LLM prompts. Supports:
- Defining prompts using a lightweight DSL
- Evaluating prompts using multiple providers (OpenAI, Replicate, Azure OpenAI, etc.)
- Decoding responses as text, JSON, or custom structured data
Add this line to your application's Gemfile:

```ruby
gem "active_prompt"
```

And then execute:

```shell
bundle install
```

Or install it directly with RubyGems:

```shell
gem install active_prompt
```
Here's a simple example of a prompt that takes a question from the user and answers it:
```ruby
class AnswerQuestionPrompt < ActivePrompt::Prompt
  provider "openai"
  model "gpt-4"

  input :question

  system_message do
    <<~PROMPT
      The user will ask you a question. Your job is to answer truthfully. If you don't know the answer, you can say "I don't know".
    PROMPT
  end

  user_message do
    <<~PROMPT
      I have a question: #{inputs.question}
    PROMPT
  end
end
```
To get the output of a prompt, call `.generate_output!` on the prompt class:

```ruby
AnswerQuestionPrompt.generate_output!(question: "What color is the sky?")
# => "The sky is blue."
```
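If you're curious how a class-level DSL like the one above can work, here is a minimal, self-contained sketch in plain Ruby. This is an illustration only, not ActivePrompt's actual implementation: the `PromptSketch` class and its internals are invented for this example. The key ideas are that the DSL methods store configuration on the class, and the message blocks are evaluated later with `instance_exec` so that `inputs` resolves against the per-call values.

```ruby
# Illustrative sketch only — not the active_prompt gem's real implementation.
# Demonstrates how class-level DSL methods can store message templates that
# are rendered later against per-call inputs.
require "ostruct"

class PromptSketch
  class << self
    attr_reader :settings, :input_names, :messages

    def provider(name)
      (@settings ||= {})[:provider] = name
    end

    def model(name)
      (@settings ||= {})[:model] = name
    end

    def input(name)
      (@input_names ||= []) << name
    end

    def system_message(&block)
      (@messages ||= {})[:system] = block
    end

    def user_message(&block)
      (@messages ||= {})[:user] = block
    end
  end

  attr_reader :inputs

  def initialize(**inputs)
    missing = (self.class.input_names || []) - inputs.keys
    raise ArgumentError, "missing inputs: #{missing}" unless missing.empty?
    @inputs = OpenStruct.new(inputs)
  end

  # Evaluate each stored block in instance context so `inputs` resolves.
  def rendered_messages
    self.class.messages.transform_values { |blk| instance_exec(&blk).strip }
  end
end

class AnswerQuestionSketch < PromptSketch
  provider "openai"
  model "gpt-4"
  input :question

  system_message { %(Answer truthfully. Say "I don't know" if unsure.) }
  user_message { "I have a question: #{inputs.question}" }
end

msgs = AnswerQuestionSketch.new(question: "What color is the sky?").rendered_messages
puts msgs[:user] # => "I have a question: What color is the sky?"
```

Note that the interpolation in `user_message` happens when the block runs, not when the class is defined; a real implementation would then hand the rendered messages to the configured provider.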
TODO