Merged
56 changes: 56 additions & 0 deletions docs/libraries/workflow/GLOSSARY.md
@@ -0,0 +1,56 @@
TODO

# Glossary

## Worker

A process that's running workflows.

There are usually multiple workers running at the same time.

## Workflow

A series of activities to be run together.

The code defining a workflow only specifies which activities to run. No complex logic (e.g. database queries) runs within the workflow itself.

Workflow code can be re-run multiple times to replay a workflow.
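
Sketched below is what such code might look like. The `Workflow` type and `activity` method are invented for illustration and are not the real chirp-workflow API:

```rust
// Illustrative sketch: workflow code is nothing but an ordered sequence of
// activity calls, with no other logic. A real workflow would execute each
// activity durably instead of just recording its name.
struct Workflow {
    steps: Vec<&'static str>,
}

impl Workflow {
    fn new() -> Self {
        Self { steps: Vec::new() }
    }

    // Records an activity to run as the next step of the workflow.
    fn activity(&mut self, name: &'static str) -> &mut Self {
        self.steps.push(name);
        self
    }
}

fn main() {
    let mut wf = Workflow::new();
    wf.activity("create_server").activity("install_runtime");
    // The workflow is just the ordered list of activities, nothing else.
    assert_eq!(wf.steps, vec!["create_server", "install_runtime"]);
}
```

Because the code contains no side effects of its own, running it again produces the same list of steps, which is what makes replay possible.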

## Workflow State

Persisted data about a workflow.

## Workflow Run

An instance of a node running a workflow. When a workflow is re-run, it replays its events.

## Workflow Event

An action that gets executed in a workflow. An event can be a:

- Activity

Events store the output from activities and are used to ensure activities are run only once.

## Workflow Event History

A list of the events that have executed in this workflow. Replays use this history to verify that the workflow has not changed to an invalid state.

## Workflow Replay

After the first run of a workflow, all subsequent runs will replay the activities and compare them against the event history. If an activity has already run successfully, it is skipped during the replay and its output from the previous run is used instead.
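
A minimal sketch of this skip-and-reuse behavior, using a plain `HashMap` as a stand-in for the persisted event history (all names here are illustrative, not the real chirp-workflow API):

```rust
use std::collections::HashMap;

// If an event with this activity's output is already in the history, return
// the stored output instead of re-running the activity.
fn run_activity(
    history: &mut HashMap<String, String>,
    name: &str,
    run: impl FnOnce() -> String,
) -> String {
    if let Some(output) = history.get(name) {
        // Replay: skip execution, reuse the stored output.
        return output.clone();
    }
    let output = run();
    history.insert(name.to_string(), output.clone());
    output
}

fn main() {
    let mut history = HashMap::new();
    // The first run executes the closure...
    let a = run_activity(&mut history, "create_server", || "server-123".into());
    // ...the replay returns the stored output without executing anything.
    let b = run_activity(&mut history, "create_server", || unreachable!());
    assert_eq!(a, b);
}
```

The `unreachable!()` in the second call demonstrates the guarantee: on replay the activity body is never entered.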

## Workflow Wake Condition

If a workflow is not currently running an activity, wake conditions define when the workflow should be run again.

The available conditions are:

- **Immediately:** Run immediately on the first available node.
- **Deadline:** Run at a given timestamp.
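
The two conditions above could be modeled roughly like this (an illustrative enum, not the actual chirp-workflow type):

```rust
use std::time::SystemTime;

// Sketch of the two wake conditions described above.
enum WakeCondition {
    // Run immediately on the first available node.
    Immediately,
    // Run once the given timestamp has passed.
    Deadline(SystemTime),
}

// Whether a worker polling at `now` should pick this workflow up again.
fn is_ready(cond: &WakeCondition, now: SystemTime) -> bool {
    match cond {
        WakeCondition::Immediately => true,
        WakeCondition::Deadline(ts) => now >= *ts,
    }
}

fn main() {
    let now = SystemTime::now();
    assert!(is_ready(&WakeCondition::Immediately, now));
    let future = now + std::time::Duration::from_secs(60);
    assert!(!is_ready(&WakeCondition::Deadline(future), now));
}
```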

## Activity

A unit of code to run within a workflow.

Activities can fail and will be retried according to the retry policy of the workflow.
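
Tying this to the `Activity` trait added in `lib/chirp-workflow/core/src/activity.rs`, a dependency-free sketch of an activity implementation might look like this (the serde/async-trait bounds and `ActivityCtx` are omitted, and all concrete names are illustrative):

```rust
use std::fmt::Debug;

// Simplified mirror of the Activity trait, with the async and
// serialization machinery stripped out for a self-contained example.
trait Activity {
    type Input: Debug;
    type Output: Debug;

    fn name() -> &'static str;
    fn run(input: &Self::Input) -> Self::Output;
}

// Inputs are hashed in the real system so replays can match events,
// hence the Hash derive.
#[derive(Debug, Hash)]
struct CreateServerInput {
    datacenter_id: u64,
}

struct CreateServer;

impl Activity for CreateServer {
    type Input = CreateServerInput;
    type Output = String;

    fn name() -> &'static str {
        "create_server"
    }

    fn run(input: &Self::Input) -> Self::Output {
        format!("server in datacenter {}", input.datacenter_id)
    }
}

fn main() {
    let out = CreateServer::run(&CreateServerInput { datacenter_id: 7 });
    assert_eq!(CreateServer::name(), "create_server");
    assert_eq!(out, "server in datacenter 7");
}
```

The static `name()` is what lets the event history identify which activity produced which output across replays.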
98 changes: 98 additions & 0 deletions docs/libraries/workflow/WORKFLOW.md
@@ -0,0 +1,98 @@
## Goals

**Primary**

- Performance
- Fast to write for
- Only depend on CockroachDB

**Secondary**

- Easy to monitor & manage via simple SQL queries
- Easier to understand than messages
- Rust-native
- Run in-process and as part of the binary to simplify architecture
- Leverage traits to reduce copies and needless ser/de
- Use native serde instead of Protobuf for simplicity (**this comes at the cost of the verifiable backwards compatibility Protobuf provides**)
- Lay foundations for OpenGB

## Use cases

- Billing cron jobs with batch
- Creating servers
- Email loops
- Creating dynamic servers
- What about dynamic server lifecycle? Is this more of an actor? This is blending between state and other stuff.
- Deploying CF workers

## Questions

- Concurrency
- Nondeterministic patches: https://docs.temporal.io/dev-guide/typescript/versioning#patching
- Do we plan to support side effects?

## Relation to existing Chirp primitives

### Messages

Workflows replace the use case of messages for durable execution, which covers almost all uses of messages.

Messages should still be used, but much less frequently. They're helpful for:

**Real-time Data Processing**

- When you have a continuous flow of data that needs to be processed in real-time or near-real-time.
- Examples include processing sensor data, social media feeds, financial market data, or clickstream data.
- Stream processing frameworks like Apache Kafka, Apache Flink, or Apache Spark Streaming are well-suited for handling high-volume, real-time data streams.

**Complex Event Processing (CEP)**

- When you need to detect and respond to patterns, correlations, or anomalies in real-time data streams.
- CEP involves analyzing and combining multiple event streams to identify meaningful patterns or trigger actions.
- Stream processing frameworks provide capabilities for defining and matching complex event patterns in real-time.

**Data Transformation and Enrichment**

- When you need to transform, enrich, or aggregate data as it arrives in real-time.
- This can involve tasks like data cleansing, normalization, joining with other data sources, or applying machine learning models.
- Stream processing allows you to process and transform data on-the-fly, enabling real-time analytics and insights.

**Continuous Data Integration**

- When you need to continuously integrate and process data from multiple sources in real-time.
- This can involve merging data streams, performing data synchronization, or updating downstream systems.
- Stream processing frameworks provide connectors and integrations with various data sources and sinks.

**Real-time Monitoring and Alerting**

- When you need to monitor data streams in real-time and trigger alerts or notifications based on predefined conditions.
- Stream processing allows you to define rules and thresholds to detect anomalies, errors, or critical events and send real-time alerts.

**High-throughput, Low-latency Processing**

- When you have a high volume of data that needs to be processed with low latency.
- Stream processing frameworks are designed to handle high-throughput data streams and provide low-latency processing capabilities.
- This is particularly useful in scenarios like fraud detection, real-time recommendations, or real-time bidding in advertising systems.

### Cross-package hooks

We currently use messages to hook into events from other workflows so we don't have to bake in support directly.

This is potentially error-prone since it makes control flow more opaque.

TBD on whether we keep this pattern.

### Workflows & operations across packages

**Child workflows**

TODO

**Operations**

TODO

## Temporal docs

https://docs.temporal.io/encyclopedia/

16 changes: 16 additions & 0 deletions lib/api-helper/build/src/error.rs
@@ -47,6 +47,22 @@ pub fn handle_rejection(
)
}
}
GlobalError::Raw(err) => {
tracing::error!(?err, "internal error response");

// Replace internal errors with global errors
if std::env::var("RIVET_API_ERROR_VERBOSE")
.ok()
.map_or(false, |x| x == "1")
{
err_code!(ERROR, error = err.to_string())
} else {
err_code!(
ERROR,
error = format!("An internal error has occurred (ray_id {}).", ray_id)
)
}
}
};

// Modify request based on error
1 change: 1 addition & 0 deletions lib/bolt/config/src/service.rs
@@ -99,6 +99,7 @@ pub enum ServiceKind {
#[serde(rename = "operation")]
Operation {},

// TODO: Rename to worker
#[serde(rename = "consumer")]
Consumer {
#[serde(default)]
16 changes: 13 additions & 3 deletions lib/bolt/core/src/context/service.rs
@@ -287,6 +287,7 @@ impl ServiceContextData {

pub fn is_monolith_worker(&self) -> bool {
self.config().service.name == "monolith-worker"
|| self.config().service.name == "monolith-workflow-worker"
}

pub fn depends_on_nomad_api(&self) -> bool {
@@ -349,7 +350,9 @@ impl ServiceContextData {
}

pub fn depends_on_infra(&self) -> bool {
self.name() == "cluster-worker" || self.name() == "monolith-worker"
self.name() == "cluster-worker"
|| self.name() == "monolith-worker"
|| self.name() == "monolith-workflow-worker"
}

pub fn depends_on_cluster_config(&self) -> bool {
@@ -1121,11 +1124,18 @@ impl ServiceContextData {
let password = project_ctx.read_secret(&["crdb", "password"]).await?;
let sslmode = "verify-ca";

let uri = format!(
let url = format!(
"postgres://{}:{}@{crdb_host}/postgres?sslmode={sslmode}",
username, password,
);
env.insert("CRDB_URL".into(), uri);
env.insert("CRDB_URL".into(), url);

// TODO:
let workflow_url = format!(
"postgres://{}:{}@{crdb_host}/db_workflow?sslmode={sslmode}",
username, password,
);
env.insert("CRDB_WORKFLOW_URL".into(), workflow_url);
}

// Redis
5 changes: 5 additions & 0 deletions lib/chirp-workflow/Cargo.toml
@@ -0,0 +1,5 @@
[workspace]
members = [
"core",
"macros"
]
34 changes: 34 additions & 0 deletions lib/chirp-workflow/core/Cargo.toml
@@ -0,0 +1,34 @@
[package]
name = "chirp-workflow"
version = "0.1.0"
authors = ["Rivet Gaming, LLC <developer@rivet.gg>"]
edition = "2021"
license = "Apache-2.0"

[dependencies]
anyhow = "1.0.82"
async-trait = "0.1.80"
chirp-client = { path = "../../chirp/client" }
chirp-workflow-macros = { path = "../macros" }
formatted-error = { path = "../../formatted-error" }
futures-util = "0.3"
global-error = { path = "../../global-error" }
indoc = "2.0.5"
prost = "0.12.4"
prost-types = "0.12.4"
rand = "0.8.5"
rivet-cache = { path = "../../cache/build" }
rivet-connection = { path = "../../connection" }
rivet-metrics = { path = "../../metrics" }
rivet-operation = { path = "../../operation/core" }
rivet-pools = { path = "../../pools" }
rivet-runtime = { path = "../../runtime" }
rivet-util = { path = "../../util/core" }
serde = { version = "1.0.198", features = ["derive"] }
serde_json = "1.0.116"
sqlx = { version = "0.7.4", features = ["runtime-tokio", "postgres", "uuid", "ipnetwork"] }
thiserror = "1.0.59"
tokio = { version = "1.37.0", features = ["full"] }
tracing = "0.1.40"
tracing-subscriber = { version = "0.3.18", features = ["env-filter"] }
uuid = { version = "1.8.0", features = ["v4", "serde"] }
21 changes: 21 additions & 0 deletions lib/chirp-workflow/core/src/activity.rs
@@ -0,0 +1,21 @@
use std::{fmt::Debug, hash::Hash};

use anyhow::*;
use async_trait::async_trait;
use serde::{de::DeserializeOwned, Serialize};

use crate::ActivityCtx;

#[async_trait]
pub trait Activity {
type Input: ActivityInput;
type Output: Serialize + DeserializeOwned + Debug + Send;

fn name() -> &'static str;

async fn run(ctx: &mut ActivityCtx, input: &Self::Input) -> Result<Self::Output>;
}

pub trait ActivityInput: Serialize + DeserializeOwned + Debug + Hash + Send {
type Activity: Activity;
}