# opensmith
The open-source, local-first alternative to LangSmith.
opensmith is to LangSmith what Ollama is to OpenAI — the local-first, privacy-first alternative.
| | LangSmith | opensmith |
|---|---|---|
| Setup | Cloud account required | pip install opensmith |
| Data privacy | Sends traces to cloud | 100% local, SQLite only |
| Framework | Best with LangChain | Works with any Python code |
| Cost | Free tier then paid | Free forever, open source |
| Offline | No | Yes |
| Docker required | No | No |
| Dashboard | Hosted | localhost:7823 |
LangSmith is powerful, but it is built around cloud-hosted tracing and is most natural inside the LangChain ecosystem. opensmith is a local-first alternative: install it with pip, use it with any Python LLM pipeline, and inspect traces on your machine without accounts, hosted services, Docker, or configuration. No trace data leaves your machine.
```bash
pip install opensmith
```

Decorate any function with `@trace` to capture its calls as traces:

```python
import openai

from opensmith import trace


@trace
def call_llm(prompt: str):
    return openai.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )


@trace
def my_pipeline(question: str):
    # search_docs is your own retrieval function
    docs = search_docs(question)
    return call_llm(docs + question)
```

Async functions are supported:
```python
from openai import AsyncOpenAI

from opensmith import trace

# use the async client so the completion call can be awaited
client = AsyncOpenAI()


@trace(tags=["production", "rag"])
async def call_llm(prompt: str):
    return await client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
```

You can also trace an arbitrary block of code with the context-manager form and log intermediate values:

```python
import openai

from opensmith import trace

query = "What does opensmith do?"

with trace("my_pipeline", tags=["debug"]) as t:
    t.log("query", query)
    response = openai.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": query}],
    )
    t.log("response", response)
```

Alternatively, autopatch supported client libraries so their calls are traced without decorators:

```python
from opensmith import autopatch

autopatch()
```

Patch only selected backends:
```python
from opensmith import autopatch

autopatch(only=["openai"])
```

Patch everything except selected backends:
```python
from opensmith import autopatch

autopatch(exclude=["chromadb"])
```
Print trace results to the terminal as they complete:

```python
from opensmith import set_console_mode, trace

set_console_mode(True)


@trace
def my_func():
    return "ok"
```

opensmith reads opensmith.json from the current working directory on import:
```json
{
  "db_path": "./my_traces.db",
  "console_mode": false,
  "autopatch": ["openai", "qdrant"]
}
```

Start the local dashboard:

```bash
opensmith ui
```

Open http://localhost:7823.
| Command | Description |
|---|---|
| `opensmith ui` | Start the local dashboard at localhost:7823. |
| `opensmith traces` | List recent traces in the terminal. |
| `opensmith stats` | Show aggregate trace, step, token, and cost statistics. |
| `opensmith clear` | Delete all locally stored traces after confirmation. |
| Backend | Package | Status |
|---|---|---|
| openai | openai | ✅ |
| anthropic | anthropic | ✅ |
| litellm | litellm | ✅ |
| qdrant | qdrant-client | ✅ |
| chromadb | chromadb | ✅ |
| pinecone | pinecone-client | ✅ |
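As an illustrative sketch of combining a patched vector store with a traced pipeline (the assumption here is that patched calls made inside a traced function show up as steps of that trace; the chromadb and openai calls are standard usage of those libraries):

```python
import chromadb
import openai

from opensmith import autopatch, trace

# Assumption: with both backends patched, the vector-store query and the
# completion call below are recorded as steps of the ask() trace.
autopatch(only=["openai", "chromadb"])

client = chromadb.Client()
collection = client.get_or_create_collection("docs")
collection.add(ids=["1"], documents=["opensmith stores traces locally in SQLite."])


@trace
def ask(question: str):
    hits = collection.query(query_texts=[question], n_results=1)
    context = hits["documents"][0][0]
    return openai.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"{context}\n\n{question}"}],
    )
```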
Traces are stored locally at ~/.opensmith/traces.db unless overridden with opensmith.json or set_default_db_path().
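For a code-level override, a minimal sketch using set_default_db_path (the exact signature is an assumption; a path-like argument seems likely) to keep traces next to a project:

```python
from pathlib import Path

from opensmith import set_default_db_path

# Assumed signature: accepts a path-like value and applies to traces
# recorded after the call. Stores traces in the project directory instead
# of ~/.opensmith/traces.db.
set_default_db_path(Path("./traces/opensmith.db"))
```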
MIT
