refactor(ai) improve introduction wording (#570)
* Update verbiage in introduction

* docs(ai): mention AI Video subnet

---------

Co-authored-by: Elite Encoder <john@eliteencoder.net>
rickstaa and eliteprox committed May 20, 2024
1 parent 5a96ebe commit 32774e2
Showing 1 changed file with 27 additions and 29 deletions.
ai/introduction.mdx: 56 changes (27 additions & 29 deletions)
@@ -12,16 +12,15 @@ iconType: regular
is invaluable for enhancing the AI Subnet. Thank you for your contributions!
</Warning>

-The new **AI Video Subnet**, henceforth referred to as the **AI Subnet**, is the
-first step in bringing powerful AI video capabilities into the Livepeer network.
-It enables video developers to add a rapidly growing suite of **generative AI
+The **AI Video Subnet**, also known as the **AI Subnet**, is the first step
+toward bringing powerful AI video capabilities into the Livepeer network. It
+enables video developers to add a rapidly growing suite of **generative AI
features** such as [text-to-image](/ai/pipelines/text-to-image),
[image-to-image](/ai/pipelines/image-to-image) and
-[image-to-video](/ai/pipelines/image-to-video) conversions to their
-applications, and allows node operators to **earn revenue by deploying their GPU
-resources** for AI processing tasks. For an in-depth understanding of the AI
-Subnet and its current capabilities, continue reading. Ready to dive in? Choose
-one of the cards below to kickstart your journey with the AI Subnet.
+[image-to-video](/ai/pipelines/image-to-video) to their applications. Livepeer
+Node operators are able to **earn revenue by deploying their GPU resources** for
+AI processing tasks. Ready to dive in? Choose one of the cards below to
+kickstart your journey with the AI Subnet.

## Kickstart Your Journey

@@ -86,20 +85,19 @@ The AI Subnet, initially proposed in
[this SPE treasury proposal](https://explorer.livepeer.org/treasury/82843445347363563575858115586375001878287509193479217286690041153234635982713),
represents a significant evolution within the Livepeer ecosystem. This
**decentralized**, **open-source** framework seamlessly integrates a variety of
-**generative AI inference** tasks, such as image and video generation and
-upscaling, into the existing network infrastructure. These enhancements not only
-strengthen the Livepeer Mainnet's transcoding services, celebrated for their
-**low cost** and **high reliability**, but also pave the way for groundbreaking
-applications across both emerging Web3 environments and established Web2
-sectors.
+**generative image and video AI inference** tasks into the existing Livepeer
+network infrastructure. These enhancements strengthen Livepeer's transcoding
+services, celebrated for their **low cost** and **high reliability**, while also
+paving the way for groundbreaking applications across both emerging Web3
+environments and established Web2 sectors.

Designed to revolutionize creative processes, the AI Subnet reflects Livepeer's
-commitment to creating a **globally accessible open video infrastructure**. This
-infrastructure aims to empower users by allowing them to leverage any form of
-video computing they need. By equipping video applications with advanced AI
-tools and diminishing reliance on centralized computing resources, it extends
-**cutting-edge AI capabilities** to a broader audience, fostering a more
-equitable digital landscape.
+commitment to creating a **globally accessible open video infrastructure**.
+Livepeer's AI subnet aims to empower users to leverage any form of image and
+video computing with AI. By equipping video applications with advanced AI tools
+and reducing reliance on centralized computing resources, we are dedicated to
+extending **cutting-edge AI capabilities** to a broader audience, fostering a
+more equitable digital landscape.

### Advantages of Livepeer's AI Subnet

@@ -114,9 +112,9 @@ equitable digital landscape.

### How It Works

-The AI Subnet, built on the established Livepeer network, leverages its
-**decentralized payment infrastructure** for efficient AI inference task
-execution. The network consists of two primary actors:
+The AI Subnet, built on the established Livepeer network, leverages a
+**decentralized payment infrastructure** for compensating AI orchestrator nodes
+for performing AI inference tasks. The network consists of two primary actors:

- **AI Gateway Nodes**: These nodes manage the flow of AI tasks from
applications and users, directing them to the appropriate Orchestrator nodes
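
To make the Gateway's role concrete, here is a minimal sketch of an application
submitting a text-to-image task to an AI Gateway node over HTTP. The Gateway
address, endpoint path, and payload fields are illustrative assumptions rather
than a confirmed API; the Gateway then forwards the task to a suitable
Orchestrator as described above.

```python
# Hypothetical example: submitting a text-to-image task to an AI Gateway node.
# The host, path, and payload fields are illustrative assumptions, not a
# confirmed Gateway API surface.
import requests

GATEWAY_URL = "https://<your-ai-gateway>:8935"  # placeholder Gateway address

payload = {
    "model_id": "stabilityai/sd-turbo",  # example model; availability depends on Orchestrators
    "prompt": "A timelapse of a city skyline at night",
}

# The Gateway routes the task to an AI Orchestrator and relays the result back.
resp = requests.post(f"{GATEWAY_URL}/text-to-image", json=payload, timeout=60)
resp.raise_for_status()
print(resp.json())
```
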
@@ -142,8 +140,8 @@ scalability of new pipelines. Ongoing developments aim to enhance
increasingly complex AI models and custom user-defined pipelines.

Below is a simplified diagram illustrating the **complete AI inference
-pipeline** within the Livepeer AI Subnet. Only one AI Orchestrator
-with one GPU on the same machine is shown for clarity. In reality, the AI Gateway is
+pipeline** within the Livepeer AI Subnet. Only one AI Orchestrator with one GPU
+on the same machine is shown for clarity. In reality, the AI Gateway is
connected to multiple Orchestrators, each of which can have multiple worker
nodes with various GPUs attached.

@@ -162,10 +160,10 @@ graph TD

This flow starts at the AI Gateway nodes, which route tasks from applications or
users to an appropriate AI Orchestrator node. The selection of the AI
-Orchestrator node is based on factors such as the speed of previous
-inference requests, orchestrator stake, advertised price, and more. The AI
-Orchestrator node then executes the task in the ai-runner Docker Container. In
-this container, the AI Orchestrator can either:
+Orchestrator node is based on factors such as the speed of previous inference
+requests, orchestrator stake, advertised price, and more. The AI Orchestrator
+node then executes the task in the ai-runner Docker Container. In this
+container, the AI Orchestrator can either:

- **Pre-load a model**: Orchestrators keep frequently used models
[warm](ai/pipelines/overview#warm-models) on GPUs, speeding up task
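
As a conceptual illustration of the selection factors listed above (previous
inference speed, orchestrator stake, and advertised price), the toy sketch below
scores candidate Orchestrators. The fields and weights are invented for the
example and do not reflect go-livepeer's actual selection logic.

```python
# Toy illustration of weighing Orchestrator selection factors.
# Not Livepeer's real selection algorithm; fields and weights are invented.
from dataclasses import dataclass


@dataclass
class Orchestrator:
    name: str
    avg_latency_s: float   # speed of previous inference requests (lower is better)
    stake: float           # orchestrator stake (higher is better)
    price_per_unit: float  # advertised price (lower is better)


def score(o: Orchestrator) -> float:
    # Arbitrary example weights favouring fast, well-staked, cheap Orchestrators.
    return (1.0 / (1.0 + o.avg_latency_s)) * 0.5 \
        + (o.stake / 1_000_000) * 0.3 \
        - o.price_per_unit * 0.2


candidates = [
    Orchestrator("orch-a", avg_latency_s=2.1, stake=500_000, price_per_unit=0.8),
    Orchestrator("orch-b", avg_latency_s=4.0, stake=900_000, price_per_unit=0.5),
]

best = max(candidates, key=score)
print(f"Selected {best.name}")
```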
