
phospho: Text Analytics Platform for LLM Apps

(badges: phospho npm package · phospho Python package on PyPI · Y Combinator W24)

Phospho is the text analytics platform for LLM apps. Detect issues and extract insights from the text messages exchanged between your users and your app. Gather user feedback and measure success. Iterate on your app to create the best conversational experience for your users.

Ship your LLM app in production with confidence, and iterate on it with insights from your users.

Learn more in the full documentation.

Key Features

  • Flexible logging
  • Automatic evaluation
  • Insights extraction
  • Data visualization
  • Collaboration
phospho diagram

Demo

demo.mp4

Quickstart: Discover the phospho lab in pure python

The phospho lab is the core analytics component of phospho. It helps you run batched evaluations and event detections on your messages.

pip install "phospho[lab]"

Follow the quickstart here.

Self deploy

This repository contains the implementation of the platform frontend, the API backend, and the insights extraction pipeline.

  • phospho-python: Python client with analytics engine
  • extractor: FastAPI analytics service wrapping the analytics engine
  • backend: FastAPI backend
  • platform: NextJS frontend
  • internal-tools: Platform management tools

Prerequisites

Ensure you have the following installed:

  • Docker
  • Docker Compose

60-second deploy

  1. Clone the repo:
git clone git@github.com:phospho-app/phospho.git && cd phospho
  2. Register to the core external services:
  • OpenAI (if you do not want to use Ollama)
  • Cohere (optional; the free developer tier is enough for testing purposes)
  • Propelauth (the free tier is enough for testing purposes)
  3. Create a copy of the .env.example file as .env.docker
cp .env.example .env.docker
  4. Complete the .env.docker file with the secret variables
nano .env.docker # or emacs or vscode or vim

By default, the phospho analytics pipeline uses OpenAI as its main LLM provider.

To use Ollama instead, set OVERRIDE_WITH_OLLAMA_MODEL=mistral (or any other model) in .env.docker. In this case, the OPENAI_API_KEY variable is not used, but you do need an Ollama instance set up and running.
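For illustration, a minimal .env.docker might look like the sketch below. Only OPENAI_API_KEY and OVERRIDE_WITH_OLLAMA_MODEL are named in this README; the other variable names are placeholders — check .env.example for the exact keys your deployment expects.

```shell
# LLM provider: either set an OpenAI key...
OPENAI_API_KEY=sk-...

# ...or route the analytics pipeline through a local Ollama instance instead.
# When this is set, OPENAI_API_KEY is ignored:
# OVERRIDE_WITH_OLLAMA_MODEL=mistral

# Placeholder names for the other services — see .env.example for the real keys:
# COHERE_API_KEY=...
# PROPELAUTH_API_KEY=...
```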

  5. Launch the project
docker-compose up
  6. Start using phospho

Go to the platform at http://localhost:3000 to grab your project id and API key, then log your first message:

export PHOSPHO_PROJECT_ID="your_project_id"
export PHOSPHO_API_KEY="your_api_key"
curl -X POST "http://localhost:80/v2/log/$PHOSPHO_PROJECT_ID" -H "Authorization: Bearer $PHOSPHO_API_KEY" -H "Content-Type: application/json" -d '{"batched_log_events": [{"input": "Hi, I just logged my first task to phospho!","output": "Congrats! Keep pushing!"}]}'

Don't forget to specify your backend URL when you use the client libraries in your app. By default, it is http://localhost:80.
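The curl call above can also be made from Python with only the standard library. A minimal sketch — the endpoint path and payload shape mirror the curl command; the helper function name and structure are illustrative, not part of the phospho client:

```python
import json
import os
import urllib.request


def build_log_request(base_url: str, project_id: str, api_key: str,
                      input_text: str, output_text: str) -> urllib.request.Request:
    """Build a POST request for the /v2/log/{project_id} endpoint."""
    payload = {
        "batched_log_events": [
            {"input": input_text, "output": output_text},
        ]
    }
    return urllib.request.Request(
        url=f"{base_url}/v2/log/{project_id}",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


if __name__ == "__main__":
    # Assumes PHOSPHO_PROJECT_ID and PHOSPHO_API_KEY are exported, as above.
    req = build_log_request(
        base_url="http://localhost:80",
        project_id=os.environ["PHOSPHO_PROJECT_ID"],
        api_key=os.environ["PHOSPHO_API_KEY"],
        input_text="Hi, I just logged my first task to phospho!",
        output_text="Congrats! Keep pushing!",
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status)
```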

Access the hosted version

To manage phospho lab evaluations on a collaborative platform, the easiest way is to sign up for the hosted version.

  1. Create a phospho account
  2. Install a phospho client: pip install phospho or npm i phospho
  3. Create environment variables for PHOSPHO_API_KEY and PHOSPHO_PROJECT_ID
  4. Initialize phospho: phospho.init()
  5. Log to phospho with phospho.log(input="question", output="answer")

Follow this guide to get started.

License

This project is licensed under the Apache 2.0 License. See the LICENSE file for details.

Related projects