Awesome Langflow

Core Langflow Resources

These are the essential, official links that every Langflow developer and contributor should bookmark. They are the most authoritative sources for information, downloads, and updates.

Community & Social Channels

Engage with the Langflow community for support, to share your projects, and to stay up-to-date. The community is most active on Discord and GitHub.

Installation & Deployment

Langflow offers multiple installation and deployment paths, from a simple desktop app to scalable Kubernetes deployments.

| Method | Description | Key Details & Command |
| --- | --- | --- |
| Desktop | Standalone, easy-to-upgrade application for getting started quickly. | For macOS (13+) and Windows. Download from https://www.langflow.org/desktop |
| Python package (uv/pip) | Install the open-source package for command-line use and environment management. | Requires Python 3.10–3.13 and uv. `uv pip install langflow` |
| Docker | Run Langflow in an isolated container for consistent, reproducible deployments. | `docker run -p 7860:7860 langflowai/langflow:latest` |
| Kubernetes (Helm) | Official Helm charts for scalable, robust deployments on a Kubernetes cluster. | `helm repo add langflow https://langflow-ai.github.io/langflow-helm-charts` |
| Google Cloud Platform | Official step-by-step guide for deploying Langflow on GCP infrastructure. | https://docs.langflow.org/deployment-gcp |
| Render | Official guide for one-click deployment and hosting on Render. | https://docs.langflow.org/deployment-render |
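For longer-lived container setups, the single `docker run` command above is often captured in a Compose file. This minimal sketch uses only the image tag and port shown in the table; volumes, environment variables, and a database are deployment-specific additions not covered here.

```yaml
# Minimal Compose sketch equivalent to:
#   docker run -p 7860:7860 langflowai/langflow:latest
services:
  langflow:
    image: langflowai/langflow:latest
    ports:
      - "7860:7860"   # Langflow UI and API on http://localhost:7860
    restart: unless-stopped
```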

Learning Pathways

Whether you're a beginner or an advanced user, these resources provide structured paths to master Langflow.

New? Start Here (Beginner Resources)

This curated list is the best starting point for developers new to Langflow.

Courses and Deeper Dives

Advanced Guides

Explore complex topics like multi-agent systems, function calling, and local LLMs.

RAG-Specific Resources

Guides and articles focused on building and optimizing Retrieval-Augmented Generation (RAG) pipelines.

Integrations

Langflow connects to a wide array of LLMs, vector databases, and data sources.

LLM Providers

  • OpenAI: Official bundle for OpenAI models, supporting text generation, embeddings, and agentic functions. Requires an OpenAI API key. Supports models like GPT-4. https://docs.langflow.org/bundles-openai
  • Azure OpenAI: Seamless integration for using Azure-hosted OpenAI models in RAG and agent applications. Requires the Azure endpoint URL, deployment name, and an API key. Video walkthrough: https://www.youtube.com/watch?v=gAYZP-LUbwc
  • Anthropic: Dedicated bundle for using Anthropic's Claude models, including Claude 3.5 Haiku. Requires an Anthropic API key. Also supports MCP. https://docs.langflow.org/bundles-anthropic
  • AWS Bedrock: Official bundle for accessing various foundation models hosted on Amazon Bedrock. Requires specifying AWS region and using standard AWS authentication. https://docs.langflow.org/bundles-amazon
  • Google Vertex AI: Integration for using Google's models like text-bison for generation and embeddings. Auth via service account JSON or GOOGLE_APPLICATION_CREDENTIALS env var. https://docs.langflow.org/bundles-google
  • MistralAI: Dedicated bundle for accessing MistralAI models like open-mixtral-8x7b. Requires a MistralAI API key. The base URL is configurable. https://docs.langflow.org/bundles-mistralai
  • Groq: Build blazing-fast LLM applications using Groq's high-speed LPU Inference Engine. Integration is built-in and requires a Groq API key. https://docs.langflow.org/bundles-groq
  • Ollama (Local): Run local LLMs like Llama 3 and other open-source models directly with Langflow. No API key needed. Set the local host URL of the Ollama instance. https://docs.langflow.org/bundles-ollama
  • NVIDIA: Official bundle for NVIDIA NIMs, G-Assist, and leveraging local RTX GPU acceleration. Requires an NVIDIA API key. Includes components for generation, embeddings, and reranking. https://docs.langflow.org/bundles-nvidia
  • Meta (Llama): Use Meta's Llama models within Langflow by running them locally via Ollama. Integration is handled through the Ollama component by specifying the local Llama model. https://docs.langflow.org/bundles-meta
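Whichever provider backs a flow, the flow itself is typically invoked over Langflow's REST API once the server is running. The sketch below assumes a local server on the default port 7860 and a flow ID copied from the Langflow UI; the `/api/v1/run` endpoint and the `input_value`/`input_type`/`output_type` payload fields follow Langflow's documented run interface, but verify the exact shape (and any API-key header) against your installed version. The flow ID and message are placeholders.

```python
"""Sketch: invoke a Langflow flow over its REST API (stdlib only)."""
import json
from urllib import request

BASE_URL = "http://localhost:7860"  # default Langflow port


def build_run_request(flow_id: str, message: str) -> tuple[str, dict]:
    """Return the URL and JSON payload for a chat-style flow run."""
    url = f"{BASE_URL}/api/v1/run/{flow_id}"
    payload = {
        "input_value": message,  # the user message fed into the flow
        "input_type": "chat",
        "output_type": "chat",
    }
    return url, payload


def run_flow(flow_id: str, message: str) -> dict:
    """POST the run request and return the decoded JSON response."""
    url, payload = build_run_request(flow_id, message)
    req = request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Only builds the request; sending it requires a running server.
    print(build_run_request("my-flow-id", "Hello, Langflow!"))
```

The same pattern works regardless of which LLM bundle the flow uses internally, since the provider is configured inside the flow rather than in the API call.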

Vector Databases

Data Sources & Tools

Observability & Evaluation

Integrate these tools to trace, monitor, and evaluate your Langflow applications.

⚠️ Security Resources & Advisories

Langflow is under active development, and security is critical. Use these resources to stay informed about vulnerabilities and best practices.

Performance & Scalability

Resources for deploying Langflow in production and ensuring it can handle enterprise-level loads.

Comparisons & Ecosystem

Understand how Langflow fits into the broader ecosystem of LLM application development tools.

Case Studies & Announcements

See how Langflow is being used in the real world and keep up with major project milestones.
