bitovi/n8n-nodes-ollama-langfuse

Langfuse & Ollama Custom Node for n8n

This project enables Langfuse tracing for LLMs running via Ollama, integrated into n8n as a custom node. It logs all prompt/response interactions to Langfuse for observability and debugging.


Prerequisites

  • Node.js >=18
  • pnpm
  • Docker + Docker Compose
  • Git
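The prerequisites can be checked up front. The snippet below is a sketch: `node_major` is a hypothetical helper that extracts the major version from a `v18.19.0`-style string so it can be compared against the minimum.

```shell
# Quick sanity check for the Node.js prerequisite (a sketch).
# node_major extracts the major version from a "v18.19.0"-style string.
node_major() {
  printf '%s\n' "$1" | sed -E 's/^v?([0-9]+).*/\1/'
}

required=18
current="$(node --version 2>/dev/null || echo v0)"
if [ "$(node_major "$current")" -ge "$required" ]; then
  echo "Node.js $current satisfies >=$required"
else
  echo "Node.js >=$required required (found: $current)" >&2
fi
```

Running `pnpm --version`, `docker compose version`, and `git --version` the same way confirms the remaining tools are on your PATH.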

Project Structure

n8n-nodes-ollama-langfuse/
├── docker-compose.yml        # Main app (n8n + custom nodes)
├── Dockerfile                # Optional build file for n8n
├── Langfuse/docker-compose.yml  # Langfuse tracing backend
├── n8n-custom/custom/        # Custom nodes code
│   ├── nodes/                # Custom node logic (Ollama/LmChatOllama, Ollama/handler)
│   ├── credentials/          # Custom credential definitions
│   ├── dist/                 # Compiled JS and flattened icons
│   ├── gulpfile.js           # Icon processor
│   └── package.json          # Node definitions
└── n8n-data/                 # n8n runtime data (volume)

Step-by-Step Setup

1. Install pnpm

npm install -g pnpm

2. Install Dependencies

cd n8n-custom/custom
pnpm install

3. Build the Custom Nodes

pnpm build

This compiles your code and copies icons into dist/.
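A quick way to confirm the build produced output is to check that dist/ actually contains compiled JavaScript. This is a sketch: `build_ok` is a hypothetical helper, and the `dist` path follows the tree in "Project Structure" above.

```shell
# build_ok succeeds if the given directory contains at least one compiled
# .js file -- a cheap way to confirm `pnpm build` produced output.
build_ok() {
  find "$1" -name '*.js' 2>/dev/null | grep -q .
}

if build_ok dist; then
  echo "dist/ contains compiled nodes"
else
  echo "dist/ is empty or missing -- did pnpm build succeed?" >&2
fi
```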


4. Start Langfuse

cd Langfuse
docker compose up -d

Access Langfuse UI at: http://localhost:3001
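Before pointing credentials at it, you can poll Langfuse until it answers. This is a sketch: `wait_for` is a hypothetical helper, and `/api/public/health` is assumed to be the health endpoint, which may vary across Langfuse versions.

```shell
# wait_for polls a URL until it responds with success or the attempt
# budget runs out; returns non-zero on timeout.
wait_for() {
  url="$1"
  tries="${2:-30}"
  while [ "$tries" -gt 0 ]; do
    if curl -sf --max-time 2 "$url" >/dev/null 2>&1; then
      return 0
    fi
    tries=$((tries - 1))
    sleep 1
  done
  return 1
}

# Increase the attempt budget if Langfuse is slow to start on your machine.
wait_for http://localhost:3001/api/public/health 5 \
  && echo "Langfuse is up" \
  || echo "Langfuse did not come up on :3001" >&2
```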


5. Start n8n with Custom Nodes

From the repository root:

docker compose up --build

If you update the node logic, re-run pnpm build and restart the containers.
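n8n discovers extra nodes through its custom-extensions directory. The fragment below is a minimal sketch of what the compose service might look like; the service name, image, port, and mount paths are illustrative and are not copied from this repository's docker-compose.yml.

```yaml
# Hypothetical compose service -- illustrative, not this repo's actual file.
services:
  n8n:
    image: n8nio/n8n
    ports:
      - "5678:5678"
    environment:
      # n8n loads extra nodes from this directory at startup
      - N8N_CUSTOM_EXTENSIONS=/home/node/.n8n/custom
    volumes:
      - ./n8n-data:/home/node/.n8n              # runtime data (see tree above)
      - ./n8n-custom/custom:/home/node/.n8n/custom
```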


Using the Custom Node in n8n

  1. Add Langfuse and Ollama credentials in n8n
  2. Use Prompt Template → LLM Chain → Ollama LLM
  3. Run the workflow
  4. View logs in Langfuse UI
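The steps above assume an Ollama server is already reachable. A quick check, as a sketch: `ollama_reachable` is a hypothetical helper, 11434 is Ollama's default port, and GET /api/tags lists the locally pulled models.

```shell
# ollama_reachable succeeds if an Ollama server answers at the given base
# URL; /api/tags returns the list of locally available models.
ollama_reachable() {
  curl -sf --max-time 2 "$1/api/tags" >/dev/null 2>&1
}

if ollama_reachable http://localhost:11434; then
  echo "Ollama is reachable"
else
  echo "No Ollama server responding on :11434" >&2
fi
```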

Rebuild Tips

pnpm build
docker compose down
docker compose up --build

License

MIT
