
I N F E R F L O W

The Time-Aware AI Pipeline Engine



InferFlow is a powerful, declarative orchestration engine that introduces computational memory to your AI workflows. By treating time as a first-class citizen, it moves beyond simple, stateless execution to enable true, real-time meta-analysis of your data and your models' performance.

Define complex, multi-modal pipelines in simple YAML, and let InferFlow's distributed architecture handle the rest.

InferFlow Dashboard Screenshot


> Key Concepts

  • Declarative Pipelines: Define complex logic with multiple steps, conditional execution, and multi-modal inputs (text, images) in a human-readable YAML format. No more hard-coded scripts.
  • Chrono-Compute Engine: Every step's result is automatically recorded in a time-series database (InfluxDB). This gives your pipelines a "memory," allowing you to perform powerful temporal analysis (a rough sketch of this recording idea follows the list).
  • Human-in-the-Loop: Don't trust a model's low-confidence result? Pipelines can automatically pause and request human verification via a dedicated review UI, creating a perfect human-AI partnership.
  • Distributed & Scalable: Built on a message queue (RabbitMQ) and a containerized worker architecture, InferFlow is designed to scale horizontally from the ground up.
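
The Chrono-Compute "memory" above boils down to writing each step's output as a time-series point that later steps can query over a window. As a minimal sketch, assuming the official influxdb-client package and illustrative bucket, measurement, tag, and field names (not InferFlow's actual internals), a worker could record a result like this:

    # A rough sketch of the Chrono-Compute idea: persist each step's output as a
    # time-series point so later steps can query its history.
    # NOTE: bucket, measurement, tag, and field names are illustrative assumptions.
    from datetime import datetime, timezone

    from influxdb_client import InfluxDBClient, Point
    from influxdb_client.client.write_api import SYNCHRONOUS

    def record_step_result(job_id: str, step_name: str, label: str, score: float) -> None:
        """Write one pipeline step's result to InfluxDB."""
        with InfluxDBClient(url="http://localhost:8086", token="dev-token", org="inferflow") as client:
            write_api = client.write_api(write_options=SYNCHRONOUS)
            point = (
                Point("pipeline_step")              # measurement
                .tag("job_id", job_id)
                .tag("step", step_name)
                .field("label", label)
                .field("score", score)
                .time(datetime.now(timezone.utc))
            )
            write_api.write(bucket="chrono_compute", record=point)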

> How It Works

The system is a symphony of modern backend technologies orchestrated to provide a seamless developer experience.


+--------------------------+      +-------------------------+      +------------------------+
|   User (Defines YAML)    |----->|   NGINX Reverse Proxy   |----->|  FastAPI Orchestrator  |
+--------------------------+      +-------------------------+      +------------------------+
            ^                                                                  |
            | (WebSocket UI Updates)                                           v
            |                                                     +-------------------------+
            |                                                     | RabbitMQ Message Queue  |
            |                                                     +-------------------------+
            |                                                                  |
            |                           +------------------+------------------+------------------+
            |                           |                  |                  |                  |
            v                           v                  v                  v                  v
+------------------------+      +----------------+ +----------------+ +----------------+ +----------------+
|      Frontend UI       |<-----|  Sentiment AI  | |    Image AI    | | Trend Detector | |  (Your Worker) |
| (Reacting to events)   |      |     Worker     | |     Worker     | |   Meta-Model   | |      ...       |
+------------------------+      +----------------+ +----------------+ +----------------+ +----------------+
                                        |                  |                  |                  |
                                        +------------------+--------+---------+------------------+
                                                                     |
                                                                     v
                                                      +-------------------------+
                                                      | InfluxDB (Time-Series)  |
                                                      | Redis (Job State)       |
                                                      +-------------------------+
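
To make the hand-off in the middle of the diagram concrete, here is a minimal sketch, assuming the pika client plus an illustrative queue name and message shape (not InferFlow's actual wire format), of how the orchestrator could enqueue a step for a worker:

    # Illustrative hand-off from the FastAPI orchestrator to a worker via RabbitMQ.
    # NOTE: the queue name and message schema are assumptions for this sketch.
    import json

    import pika

    def enqueue_step(job_id: str, step: dict) -> None:
        """Publish one pipeline step onto the work queue."""
        connection = pika.BlockingConnection(pika.ConnectionParameters(host="rabbitmq"))
        channel = connection.channel()
        channel.queue_declare(queue="inferflow.steps", durable=True)
        channel.basic_publish(
            exchange="",
            routing_key="inferflow.steps",
            body=json.dumps({"job_id": job_id, "step": step}),
            properties=pika.BasicProperties(delivery_mode=2),  # persist the message
        )
        connection.close()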


> Tech Stack

Component        Technology                           Purpose
Orchestration    Python, FastAPI                      Core API, pipeline execution, and WebSocket control.
Web & Proxy      NGINX                                High-performance reverse proxy and static file server.
Messaging        RabbitMQ                             Asynchronous, reliable task queue for workers.
Memory           InfluxDB, Redis                      Time-series data (Chrono-Compute) & job state.
Workers          Docker, Hugging Face Transformers    Containerized, scalable AI/ML models.
Frontend         HTML5, CSS3, JavaScript              A beautiful, responsive UI for a great DX.
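
On the worker side, the Workers row above pairs a RabbitMQ consumer with a Hugging Face Transformers pipeline. A rough sketch, reusing the assumed queue name and message shape from the previous example (again, not InferFlow's actual code):

    # Illustrative worker loop: consume a step from RabbitMQ and run a
    # Hugging Face Transformers pipeline on its input.
    # NOTE: queue name and message shape mirror the sketch above and are assumptions.
    import json

    import pika
    from transformers import pipeline

    sentiment = pipeline("sentiment-analysis")  # downloads a default model on first use

    def handle_message(channel, method, properties, body):
        task = json.loads(body)
        result = sentiment(task["step"]["input"])[0]  # e.g. {"label": "POSITIVE", "score": 0.99}
        print(f"job {task['job_id']}: {result}")
        channel.basic_ack(delivery_tag=method.delivery_tag)

    connection = pika.BlockingConnection(pika.ConnectionParameters(host="rabbitmq"))
    channel = connection.channel()
    channel.queue_declare(queue="inferflow.steps", durable=True)
    channel.basic_consume(queue="inferflow.steps", on_message_callback=handle_message)
    channel.start_consuming()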

> Getting Started

Prerequisites

  • Docker
  • Docker Compose

Installation & Launch

  1. Clone the repository:

    git clone https://github.com/coderstale/inferflow.git
    cd inferflow
  2. Build and run the services: This single command builds all the container images and starts the entire distributed system.

    docker compose up --build
  3. Access the application:

    • Homepage: Open your browser to http://localhost
    • Dashboard: Navigate to http://localhost/app.html

> Example Pipeline

Here is an example of a multi-modal pipeline that uses conditional logic. It classifies an image, runs sentiment analysis on the classification label, and only proceeds if the sentiment is positive.

input: "[https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/cats.png](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/cats.png)"
pipeline:
  - name: "classify_image"
    model: "image_classifier"
    input: "{$ROOT.input}"

  - name: "get_sentiment_of_classification"
    model: "sentiment"
    input: "The image was classified as a {$classify_image.0.label}"

  - name: "conditional_trend_analysis"
    model: "trend_detector"
    input: "The sentiment history is {history('get_sentiment_of_classification', '5m')}"
    run_if: "$get_sentiment_of_classification.label == 'POSITIVE'"

> Credits & Contact

This project was created and developed by Satya Sai Nischal.


Copyright © 2025 Satya Sai Nischal. All Rights Reserved.
