Release 1.3.0 (#85)
* Pagination bug

* Bug fix

* chore: add docker cmd

* Compatibility fixes for SDK version 2.0.0 (#69)

* Pagination bug

* Bug fix

* Fix for schema changes

* Render tool calling

* Support for Langgraph, Qdrant & Groq (#73)

* Pagination bug

* Bug fix

* Add langgraph support

* Qdrant support

* Add Groq support

* update README

* update README

* feat: optimise docker image for self host setup

* adding api access to traces endpoint

* clean up

* refactor

* feat: add clickhouse db create on app start (#79)

* docs: add railway deploy, fix sdk badges (#81)

* Playground and Prompt Management (#83)

* Pagination bug

* Bug fix

* Playground - basic implementation

* Playground - streaming and nonstreaming

* Move playground inside project

* API key flow

* Api key

* Playground refactor

* Add chat hookup

* anthropic streaming support

* Bug fixes to openai playground

* Anthropic bugfixes

* Anthropic bugfix

* Cohere first iteration

* Cohere role fixes

* Cohere api fix

* Parallel running

* Playground cost calculation non streaming

* playground - streaming token calculation

* latency and cost

* Support for Groq

* Add model name

* Prompt management views

* Remove current promptset flow

* Prompt management API hooks

* Prompt registry final

* Playground bugfixes

* Bug fix playground

* Rearrange project nav

* Fix playground

* Fix prompts

* Bugfixes

* Minor fix

* Prompt versioning bugfix

* Bugfix

* fix: clickhouse table find queries (#82)

* Fix to surface multiple LLM requests inside LLM View (#84)

* Pagination bug

* Bug fix

* Fix for surfacing multiple LLM requests in LLMView

---------

Co-authored-by: Darshit Suratwala <darshit@scale3labs.com>
Co-authored-by: darshit-s3 <119623510+darshit-s3@users.noreply.github.com>
Co-authored-by: dylan <dylan@scale3labs.com>
Co-authored-by: dylanzuber-scale3 <116033320+dylanzuber-scale3@users.noreply.github.com>
5 people committed May 7, 2024
1 parent c46453c commit 8325b6d
Showing 56 changed files with 5,864 additions and 1,104 deletions.
14 changes: 10 additions & 4 deletions README.md
@@ -2,15 +2,18 @@

## Open Source & Open Telemetry (OTEL) Observability for LLM applications

![Static Badge](https://img.shields.io/badge/License-AGPL--3.0-blue) ![Static Badge](https://img.shields.io/badge/npm_@langtrase/typescript--sdk-1.2.9-green) ![Static Badge](https://img.shields.io/badge/pip_langtrace--python--sdk-1.2.8-green) ![Static Badge](https://img.shields.io/badge/Development_status-Active-green)
![Static Badge](https://img.shields.io/badge/License-AGPL--3.0-blue)
![NPM Version](https://img.shields.io/npm/v/%40langtrase%2Ftypescript-sdk?style=flat&logo=npm&label=%40langtrase%2Ftypescript-sdk&color=green&link=https%3A%2F%2Fgithub.com%2FScale3-Labs%2Flangtrace-typescript-sdk)
![PyPI - Version](https://img.shields.io/pypi/v/langtrace-python-sdk?style=flat&logo=python&label=langtrace-python-sdk&color=green&link=https%3A%2F%2Fgithub.com%2FScale3-Labs%2Flangtrace-python-sdk)
![Static Badge](https://img.shields.io/badge/Development_status-Active-green)
[![Deploy on Railway](https://railway.app/button.svg)](https://railway.app/template/yZGbfC?referralCode=MA2S9H)

---

Langtrace is open-source observability software that lets you capture, debug, and analyze traces and metrics from all your applications that leverage LLM APIs, vector databases, and LLM-based frameworks.

![image](https://github.com/Scale3-Labs/langtrace/assets/105607645/6825158c-39bb-4270-b1f9-446c36c066ee)


## Open Telemetry Support

The traces generated by Langtrace adhere to [OpenTelemetry standards (OTEL)](https://opentelemetry.io/docs/concepts/signals/traces/). We are developing [semantic conventions](https://opentelemetry.io/docs/concepts/semantic-conventions/) for the traces generated by this project. You can check out the current definitions in [this repository](https://github.com/Scale3-Labs/langtrace-trace-attributes/tree/main/schemas). Note: this is under active development, and we encourage you to get involved and share your feedback.
@@ -73,6 +76,9 @@ To run Langtrace locally, you have to run three services:
- Postgres database
- Clickhouse database

> [!IMPORTANT]
> Check out the [documentation](https://docs.langtrace.ai/hosting/overview) for the various deployment options and configurations.

Requirements:

- Docker
@@ -94,7 +100,7 @@ The application will be available at `http://localhost:3000`.
> If you wish to build the docker image locally and use it, run the `docker compose up` command with the `--build` flag.
> [!TIP]
> to manually pull the docker image from docker hub, run the following command:
> to manually pull the docker image from [docker hub](https://hub.docker.com/r/scale3labs/langtrace-client/tags), run the following command:
>
> ```bash
> docker pull scale3labs/langtrace-client:latest
> ```
@@ -193,6 +199,7 @@ Either you **update the docker compose version** OR **remove the depends_on property**
If the ClickHouse server is not starting, it is likely that port 8123 is already in use. You can change the port in the docker-compose file.
</details>
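If you need to keep ClickHouse on a different host port, one way is a compose override file that remaps the HTTP port. This is a sketch only: the service name `clickhouse` is an assumption — match it to the service name used in your actual docker-compose file.

```yaml
# docker-compose.override.yml (sketch; "clickhouse" is a hypothetical service name)
# Maps host port 8124 to ClickHouse's default HTTP port 8123 inside the container.
services:
  clickhouse:
    ports:
      - "8124:8123"
```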

<br/>
Install the Langtrace SDK in your application by following the instructions under the Langtrace Cloud section above to send traces to your self-hosted setup.

---
@@ -228,7 +235,6 @@ Langtrace automatically captures traces from the following vendors:

![image](https://github.com/Scale3-Labs/langtrace/assets/105607645/eae180dd-ebf7-4792-b076-23f75d3734a8)


---

## Feature Requests and Issues
4 changes: 2 additions & 2 deletions app/(protected)/project/[project_id]/datasets/page.tsx
@@ -1,4 +1,4 @@
import Parent from "@/components/project/dataset/parent";
import DataSet from "@/components/project/dataset/data-set";
import { authOptions } from "@/lib/auth/options";
import { Metadata } from "next";
import { getServerSession } from "next-auth";
@@ -18,7 +18,7 @@ export default async function Page() {

  return (
    <>
      <Parent email={user.email as string} />
      <DataSet email={user.email as string} />
    </>
  );
}

This file was deleted.

101 changes: 101 additions & 0 deletions app/(protected)/project/[project_id]/playground/page.tsx
@@ -0,0 +1,101 @@
"use client";

import { AddLLMChat } from "@/components/playground/common";
import LLMChat from "@/components/playground/llmchat";
import {
  AnthropicModel,
  AnthropicSettings,
  ChatInterface,
  CohereSettings,
  GroqSettings,
  OpenAIChatInterface,
  OpenAIModel,
  OpenAISettings,
} from "@/lib/types/playground_types";
import Link from "next/link";
import { useState } from "react";
import { v4 as uuidv4 } from "uuid";

export default function Page() {
  const [llms, setLLMs] = useState<ChatInterface[]>([]);

  const handleRemove = (id: string) => {
    setLLMs((currentLLMs) => currentLLMs.filter((llm) => llm.id !== id));
  };

  const handleAdd = (vendor: string) => {
    if (vendor === "openai") {
      const settings: OpenAISettings = {
        messages: [],
        model: "gpt-3.5-turbo" as OpenAIModel,
      };
      const openaiChat: OpenAIChatInterface = {
        id: uuidv4(),
        vendor: "openai",
        settings: settings,
      };
      setLLMs((currentLLMs) => [...currentLLMs, openaiChat]);
    } else if (vendor === "anthropic") {
      const settings: AnthropicSettings = {
        messages: [],
        model: "claude-3-opus-20240229" as AnthropicModel,
        maxTokens: 100,
      };
      const anthropicChat: ChatInterface = {
        id: uuidv4(),
        vendor: "anthropic",
        settings: settings,
      };
      setLLMs((currentLLMs) => [...currentLLMs, anthropicChat]);
    } else if (vendor === "cohere") {
      const settings: CohereSettings = {
        messages: [],
        model: "command-r-plus",
      };
      const cohereChat: ChatInterface = {
        id: uuidv4(),
        vendor: "cohere",
        settings: settings,
      };
      setLLMs((currentLLMs) => [...currentLLMs, cohereChat]);
    } else if (vendor === "groq") {
      const settings: GroqSettings = {
        messages: [],
        model: "llama3-8b-8192",
      };
      const groqChat: ChatInterface = {
        id: uuidv4(),
        vendor: "groq",
        settings: settings,
      };
      setLLMs((currentLLMs) => [...currentLLMs, groqChat]);
    }
  };

  return (
    <div className="px-12 py-6 flex flex-col gap-8">
      <span className="text-sm font-semibold">
        Note: Don't forget to add your LLM provider API keys in the{" "}
        <Link href="/settings/keys" className="underline text-blue-400">
          settings page.
        </Link>
      </span>
      <div className="flex flex-row flex-wrap lg:grid lg:grid-cols-3 gap-8 w-full">
        {llms.map((llm: ChatInterface) => (
          <LLMChat
            key={llm.id}
            llm={llm}
            setLLM={(updatedLLM: ChatInterface) => {
              const newLLMs = llms.map((l) =>
                l.id === llm.id ? updatedLLM : l
              );
              setLLMs(newLLMs);
            }}
            onRemove={() => handleRemove(llm.id)}
          />
        ))}
        <AddLLMChat onAdd={(vendor: string) => handleAdd(vendor)} />
      </div>
    </div>
  );
}
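The `handleAdd` branches above all build the same shape of object and differ only in the per-vendor default settings. As a sketch of how that mapping could be made table-driven — using simplified stand-in types (`Vendor`, `ChatSettings`, `Chat`) and a stand-in id generator, since the real `ChatInterface` definitions and `uuidv4()` come from the app's own modules — the defaults can live in one record:

```typescript
// Simplified stand-ins for the playground types; the real ones are richer.
type Vendor = "openai" | "anthropic" | "cohere" | "groq";

interface ChatSettings {
  messages: unknown[];
  model: string;
  maxTokens?: number;
}

interface Chat {
  id: string;
  vendor: Vendor;
  settings: ChatSettings;
}

// Default settings per vendor, mirroring the branches in handleAdd.
const defaultSettings: Record<Vendor, ChatSettings> = {
  openai: { messages: [], model: "gpt-3.5-turbo" },
  anthropic: { messages: [], model: "claude-3-opus-20240229", maxTokens: 100 },
  cohere: { messages: [], model: "command-r-plus" },
  groq: { messages: [], model: "llama3-8b-8192" },
};

let counter = 0;
// Stand-in id generator; the page itself uses uuidv4().
function nextId(): string {
  return `chat-${++counter}`;
}

// Build a new chat from the vendor's defaults (copied, so chats don't share state).
function createChat(vendor: Vendor): Chat {
  return { id: nextId(), vendor, settings: { ...defaultSettings[vendor] } };
}

const chat = createChat("groq");
console.log(chat.settings.model); // "llama3-8b-8192"
```

With this shape, adding a new vendor means adding one entry to `defaultSettings` rather than another `else if` branch, and the `Record<Vendor, ...>` type makes the compiler flag a missing entry.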
