Posthog-LLM Examples

This repository contains examples of how to upload data to PostHog-LLM using the PostHog Python client library.

Quick Start

  1. Clone this repository:
git clone https://github.com/postlang/posthog-llm-examples
cd posthog-llm-examples
  2. Install the PostHog Python client library:
pip install posthog
  3. Run the example script:

Export your PostHog instance URL and your PostHog API key as environment variables (here assuming you are running PostHog locally):

export POSTHOG_HOST='http://localhost'
export POSTHOG_API_KEY='phc_oAJEP3WDzP0vNeK41CJAOMBeH3eFlozYrgvFFWF1bT0'

and run one of the example scripts:

python3 examples/upload_dataset.py 
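The example scripts can pick these settings up from the environment; a minimal sketch of that step (the fallback value and variable names here are illustrative, and the client construction is shown in a comment so the snippet stands alone without the `posthog` package installed):

```python
import os

# Read the connection settings exported in the previous step.
# The fallback mirrors a local PostHog instance.
host = os.environ.get("POSTHOG_HOST", "http://localhost")
api_key = os.environ.get("POSTHOG_API_KEY", "")

# The client would then be configured roughly like this
# (requires `pip install posthog`):
# from posthog import Posthog
# posthog = Posthog(project_api_key=api_key, host=host)
```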

Capturing LLM interactions

Events are captured using the standard PostHog client. For LLM applications, a typical event captures the LLM's response as follows:

import uuid

output_generated = llm_model.generate(prompt_input, ...)

posthog.capture(user_id,
                event="llm-task",
                properties={"$llm_input": prompt_input,
                            "$llm_output": output_generated,
                            "$session_id": uuid.uuid4()})

The above code represents a single task: one interaction with the LLM model. The $session_id is used to link tasks together, so to capture a conversation, all tasks in that conversation should share the same $session_id. This is the first method of linking tasks together.
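As a sketch of this first method (the helper function and user id are illustrative, and the actual `posthog.capture` call is left as a comment so the snippet runs without a configured client):

```python
import uuid

# One session id shared by every task in the conversation.
session_id = str(uuid.uuid4())

def capture_turn(user_id, prompt, output):
    # Each turn is its own "llm-task" event, linked to the others
    # through the shared $session_id property.
    properties = {"$llm_input": prompt,
                  "$llm_output": output,
                  "$session_id": session_id}
    # posthog.capture(user_id, event="llm-task", properties=properties)
    return properties

first = capture_turn("user-1", "What is PostHog?", "A product analytics platform.")
second = capture_turn("user-1", "Is it open source?", "Yes.")
# Both turns carry the same $session_id, so PostHog-LLM can group
# them into one conversation.
```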

Alternatively, you can incorporate conversation history in ChatML format within the $llm_input property for stateless tasks. Here's an example:

user_chatml = [
  {"role": "user", "content": "first user message"},
  {"role": "assistant", "content": "first agent message"},
  {"role": "user", "content": "second user message"},
]

output_generated = llm_model.generate(....)

posthog.capture(user_id,
                event="llm-task",
                properties={"$llm_input": user_chatml,
                            "$llm_output": output_generated,
                            "model": "gpt-3.5-turbo"})

To send LLM events use the capture method. The required inputs are:

  • user_id: The user id of the user interacting with the LLM model.
  • event: The event name for the LLM task event. The name must start with "llm-" if you want to use any of the new features provided by PostHog-LLM, such as plugins and dialog visualization, when doing product analysis.
  • $llm_input: the input to the LLM, either the text of the input prompt or a list of dictionaries with chat turns in ChatML format, in the properties dictionary.
  • $llm_output: the output generated by the LLM model, in the properties dictionary.

Additional, optional properties can be added to the properties dictionary: for example, the model used to generate the output, the generation time, sentiment, etc.
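As an illustration, a properties dictionary carrying a few such optional fields might look like the sketch below (the extra property names and the stand-in model call are assumptions, not fields required by PostHog-LLM):

```python
import time

# Time a hypothetical generation call; a real script would call
# llm_model.generate(...) here instead of the placeholder string.
start = time.time()
output_generated = "example output"
generation_time = time.time() - start

properties = {
    "$llm_input": "example prompt",
    "$llm_output": output_generated,
    "model": "gpt-3.5-turbo",            # which model produced the output
    "generation_time": generation_time,  # seconds spent generating
    "sentiment": "neutral",              # any extra metadata you track
}
# posthog.capture("user-1", event="llm-task", properties=properties)
```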