Open-WebUI-Functions is a collection of Python-based functions designed to extend the capabilities of Open WebUI with additional pipelines, filters, and integrations. These functions allow users to interact with various AI models, process data efficiently, and customize the Open WebUI experience.
- Custom Pipelines: Extend Open WebUI with AI processing pipelines, including model inference and data transformations.
- Filters for Data Processing: Apply custom filtering logic to refine, manipulate, or preprocess input and output data.
- Azure AI Support: Seamlessly connect Open WebUI with Azure OpenAI and other Azure AI models.
- N8N Workflow Integration: Enable interactions with N8N for automation.
- Flexible Configuration: Use environment variables to adjust function settings dynamically.
- Streaming and Non-Streaming Support: Handle both real-time and batch processing efficiently.
- Secure API Key Management: Automatic encryption of sensitive information such as API keys.
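In practice, streaming and non-streaming support means a pipeline's entry point returns either a complete string or a generator, depending on the request's `stream` flag. A minimal sketch of that branching (the canned chunks stand in for a real backend call, and the `pipe` signature is simplified from Open WebUI's actual interface):

```python
from typing import Generator, Union

def pipe(body: dict) -> Union[str, Generator[str, None, None]]:
    # Canned chunks stand in for a real model/backend response.
    chunks = ["Hello", ", ", "world"]

    if body.get("stream", False):
        # Streaming: yield chunks so Open WebUI can render them incrementally.
        def stream() -> Generator[str, None, None]:
            for chunk in chunks:
                yield chunk
        return stream()

    # Non-streaming: return the complete response at once.
    return "".join(chunks)
```

The same function body serves both modes; only the return shape differs.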
> [!IMPORTANT]
> To use these functions, ensure the following requirements are met:
>
> - An active Open WebUI instance: you must have Open WebUI installed and running.
> - Required AI services (if applicable): some pipelines require external AI services, such as Azure AI.
> - Admin access: to install functions in Open WebUI, you must have administrator privileges.
> [!TIP]
> Follow these steps to install and configure functions in Open WebUI:

1. Ensure admin access:
   > [!NOTE]
   > You must be an admin in Open WebUI to install functions.
2. Access Admin Settings:
   - Navigate to the Admin Settings section in Open WebUI.
3. Go to the Functions tab:
   - Open the Functions tab in the admin panel.
4. Create a new function:
   - Click Add New Function.
   - Copy the function code from this repository and paste it into the function editor.
5. Set environment variables (if required):
   - Some functions require API keys or specific configuration via environment variables.
   > [!IMPORTANT]
   > Set `WEBUI_SECRET_KEY` for secure encryption of sensitive API keys. This is required for the encryption features to work properly.
6. Save and activate:
   - Save the function, and it will be available for use within Open WebUI.
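As a sketch of the configuration pattern, a function can read its settings from the environment at load time. `AZURE_AI_MODEL` and `AZURE_AI_PIPELINE_PREFIX` are variables documented elsewhere in this README; `AZURE_AI_API_KEY` and `REQUEST_TIMEOUT` are illustrative names, not necessarily the actual valve names:

```python
import os

# Illustrative only: reading function settings from environment variables.
# AZURE_AI_MODEL and AZURE_AI_PIPELINE_PREFIX are documented variables;
# AZURE_AI_API_KEY and REQUEST_TIMEOUT are hypothetical examples.
AZURE_AI_API_KEY = os.getenv("AZURE_AI_API_KEY", "")
AZURE_AI_MODEL = os.getenv("AZURE_AI_MODEL", "gpt-4o")
AZURE_AI_PIPELINE_PREFIX = os.getenv("AZURE_AI_PIPELINE_PREFIX", "Azure AI")
REQUEST_TIMEOUT = float(os.getenv("REQUEST_TIMEOUT", "60"))
```

Because the values are read from the environment, they can be changed without editing the function code.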
> [!WARNING]
> API Key Security: Always use encryption for sensitive information like API keys!
The functions include a built-in encryption mechanism for sensitive information:
- Automatic Encryption: API keys and other sensitive data are automatically encrypted when stored.
- Encrypted Storage: Values are stored with an "encrypted:" prefix followed by the encrypted data.
- Transparent Usage: The encryption/decryption happens automatically when values are accessed.
- No Configuration Required: Works out-of-the-box when WEBUI_SECRET_KEY is set.
> [!IMPORTANT]
> To enable encryption, set the `WEBUI_SECRET_KEY` environment variable:

```shell
# Set this in your Open WebUI environment or .env file
WEBUI_SECRET_KEY="your-secure-random-string"
```
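The `encrypted:` prefix convention can be pictured with a toy round-trip like the one below. This is an illustration of the storage format only, using a hypothetical XOR keystream for brevity; it is not the cipher the functions actually use, so do not reuse it for real secrets:

```python
import base64
import hashlib

# Illustration of the "encrypted:" storage-prefix convention only.
# The XOR keystream here is a toy cipher, NOT the actual scheme.
PREFIX = "encrypted:"

def _keystream(key: bytes, length: int) -> bytes:
    # Derive a repeatable keystream from the secret (toy construction).
    stream = b""
    counter = 0
    while len(stream) < length:
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return stream[:length]

def encrypt_value(value: str, secret: str) -> str:
    data = value.encode()
    ks = _keystream(secret.encode(), len(data))
    cipher = bytes(a ^ b for a, b in zip(data, ks))
    return PREFIX + base64.b64encode(cipher).decode()

def decrypt_value(stored: str, secret: str) -> str:
    if not stored.startswith(PREFIX):
        return stored  # plaintext values pass through unchanged
    cipher = base64.b64decode(stored[len(PREFIX):])
    ks = _keystream(secret.encode(), len(cipher))
    return bytes(a ^ b for a, b in zip(cipher, ks)).decode()
```

The prefix check is what makes the usage transparent: values without the prefix are returned as-is, so plaintext and encrypted values can coexist.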
> [!NOTE]
> Pipelines are processing functions that extend Open WebUI with custom AI models, external integrations, and data manipulation logic.
1. Azure AI Pipeline

> [!TIP]
> Azure OpenAI Integration Made Easy: this pipeline provides seamless integration with Azure OpenAI and other Azure AI models, with advanced features such as Azure Search integration and multi-model support.
- Enables interaction with Azure OpenAI and other Azure AI models.
- Supports Azure Search integration for enhanced document retrieval.
- Supports selecting multiple Azure AI models via the `AZURE_AI_MODEL` environment variable (e.g. `gpt-4o;gpt-4o-mini`).
- Customizable pipeline display name with a configurable prefix via `AZURE_AI_PIPELINE_PREFIX`.
- Azure AI Search / RAG integration with enhanced collapsible citation display (Azure OpenAI only).
- Filters valid parameters to ensure clean requests.
- Handles both streaming and non-streaming responses.
- Provides configurable error handling and timeouts.
- Predefined models for easy access.
- Supports encryption of sensitive information like API keys.
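Semicolon-separated model lists like the `AZURE_AI_MODEL` example above can be split in the obvious way. A small sketch of that parsing (the helper name is ours, not the pipeline's):

```python
import os

def parse_model_list(raw: str) -> list:
    # Split a semicolon-separated model string, dropping empty entries.
    return [m.strip() for m in raw.split(";") if m.strip()]

# e.g. AZURE_AI_MODEL="gpt-4o;gpt-4o-mini"
models = parse_model_list(os.getenv("AZURE_AI_MODEL", "gpt-4o;gpt-4o-mini"))
```

Each parsed entry can then be surfaced as its own selectable model in Open WebUI.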
Azure AI Pipeline in Open WebUI
Learn More About Azure AI
2. N8N Pipeline
- Integrates Open WebUI with N8N, an automation and workflow platform.
- Streaming support for real-time data processing.
- Sends messages from Open WebUI to an N8N webhook.
- Supports real-time message processing with dynamic field handling.
- Enables automation of AI-generated responses within an N8N workflow.
- Supports encryption of sensitive information like API keys.
- An example N8N workflow for this pipeline is provided.
N8N Pipeline in Open WebUI
Learn More About N8N
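Conceptually, the N8N pipeline posts a JSON payload to the configured webhook and reads the workflow's JSON reply. A sketch of that exchange (the `chatInput`/`sessionId` field names are illustrative; the actual field names are configurable in the function):

```python
import json
import urllib.request

def build_n8n_payload(message: str, session_id: str) -> dict:
    # Field names are illustrative; the pipeline lets you configure them.
    return {"chatInput": message, "sessionId": session_id}

def send_to_n8n(webhook_url: str, message: str, session_id: str) -> dict:
    # POST the payload to the N8N webhook and return the parsed JSON reply.
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(build_n8n_payload(message, session_id)).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read().decode())
```

The session identifier lets the N8N workflow correlate messages from the same chat.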
3. Infomaniak
- Integrates Open WebUI with Infomaniak, a Swiss web hosting and cloud services provider.
- Sends messages from Open WebUI to an Infomaniak AI Tool.
- Supports encryption of sensitive information like API keys.
Infomaniak Pipeline in Open WebUI
Learn More About Infomaniak
4. Google Gemini Pipeline

- Integrates Open WebUI with Google Gemini, a family of generative AI models by Google.
- Integration with Google Generative AI or Vertex AI API for content generation.
- Sends messages from Open WebUI to Google Gemini.
- Supports encryption of sensitive information like API keys.
- Supports both streaming and non-streaming responses (streaming automatically disabled for image generation models).
- Supports thinking and reasoning capabilities.
- Provides configurable error handling and timeouts.
- Advanced Image Processing: Optimized image handling with configurable compression, resizing, and quality settings.
- Configurable Parameters: Environment variables for image optimization (quality, max dimensions, format conversion).
- Grounding with Google Search via the `google_search_tool.py` filter.
- Native tool calling support.
- Configurable API version support.
Google Gemini Pipeline in Open WebUI
Learn More About Google Gemini
> [!NOTE]
> For LiteLLM users: to use Google Gemini models through LiteLLM, configure LiteLLM directly in Open WebUI's Admin Panel → Settings → Connections → OpenAI section instead of using this pipeline. For more information about LiteLLM, visit the official LiteLLM GitHub repository.
> [!NOTE]
> Filters allow for preprocessing and postprocessing of data within Open WebUI.
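A filter is typically a class exposing an `inlet` hook (preprocessing, before the request reaches the model) and an `outlet` hook (postprocessing, on the model's response). A minimal skeleton, simplified from Open WebUI's actual filter interface (which also supports valves and additional parameters):

```python
class Filter:
    # Minimal filter skeleton following the inlet/outlet convention;
    # check the current Open WebUI docs for the full interface.
    def inlet(self, body: dict) -> dict:
        # Preprocess: runs before the request is sent to the model.
        body.setdefault("messages", [])
        return body

    def outlet(self, body: dict) -> dict:
        # Postprocess: runs on the model's response before display.
        return body
```

Both hooks receive and return the request/response body, so a filter can inspect or rewrite it in place.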
> [!NOTE]
> Performance Monitoring for AI Interactions: track response times, token usage, and optionally send analytics to Azure Log Analytics for comprehensive monitoring.
- Measures response time and token usage for AI interactions.
- Supports tracking of total token usage and per-message token counts.
- Can calculate token usage for all messages or only a subset.
- Uses OpenAI's `tiktoken` library for token counting (only accurate for OpenAI models).
- Optional: can send logs to an Azure Log Analytics Workspace.
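Token counting with `tiktoken` follows the pattern below. The character-based fallback is our illustrative addition for environments without `tiktoken` installed, not part of the actual filter:

```python
def count_tokens(text: str, encoding_name: str = "cl100k_base") -> int:
    try:
        import tiktoken  # OpenAI's tokenizer library
        enc = tiktoken.get_encoding(encoding_name)
        return len(enc.encode(text))
    except ImportError:
        # Rough illustrative fallback: ~4 characters per token for English.
        return max(1, len(text) // 4)
```

Because `cl100k_base` is an OpenAI encoding, counts for non-OpenAI models are approximations at best, as noted above.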
Time Token Tracker in Open WebUI
How to Set Up Azure Log Analytics
Look here for Azure AI Integration.
Look here for N8N Integration.
Look here for Infomaniak Integration.
Look here for Google Gemini Integration.
> [!TIP]
> We welcome contributions of all kinds! You don't need to write code to contribute.
For detailed instructions on how to get started with our project, see about contributing to Open-WebUI-Functions.
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
> [!NOTE]
> If you have any questions, suggestions, or need assistance, please open an issue to connect with us!

Created by owndev - Let's make Open WebUI even more amazing together!