
RFC: Loom support


Introduction

Using Loom with Vert.x

Loom considerations

There are several ways to build on top of Loom and we should explore them.

This page is about experiments with Vert.x and Loom.

Loom context experiment

This approach integrates virtual threads as a Vert.x context.

Running as a context makes it possible to coordinate the execution of the IO event loop and the virtual thread; otherwise we can have races, e.g. a virtual thread and an IO event executing concurrently.

Design

A context implementation (io.vertx.core.Context) defines the execution of tasks; a context ensures that at most one task is executed at a time:

  • the event-loop context runs tasks on top of Netty's event loop and naturally satisfies this condition; scalability is achieved by multiplexing tasks on the event loop
  • a worker context uses a worker pool combined with a queue to ensure this condition; scalability is achieved by deploying several instances of a context (via verticle deployment)
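
As a small illustration of this condition, tasks submitted to the same context run one after the other on the context thread. The snippet below uses the public Vert.x API; it is only an example, not part of the proposal.

import io.vertx.core.Context;
import io.vertx.core.Vertx;

public class ContextTasks {
  public static void main(String[] args) {
    Vertx vertx = Vertx.vertx();
    // Called from a non-Vert.x thread, this creates an event-loop context.
    Context context = vertx.getOrCreateContext();
    // Both tasks run on the same event-loop thread, one after the other,
    // never concurrently.
    context.runOnContext(v -> System.out.println("task 1 on " + Thread.currentThread()));
    context.runOnContext(v -> System.out.println("task 2 on " + Thread.currentThread()));
    // Scheduled last, so it runs after the two tasks above.
    context.runOnContext(v -> vertx.close());
  }
}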

A context can sometimes be duplicated to model continuations (introduced in Vert.x 4 for the tracing feature): e.g. when a new HTTP server request is created, the server context is duplicated into a lightweight clone of the original context. The current implementations do not change the task execution semantics, which is fine for the event-loop context; changing them could be a bonus for the worker context, however that would introduce races in the model.
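
For reference, a duplicated context can be obtained through the internal ContextInternal API (real Vert.x 4 API from io.vertx.core.impl, shown only to illustrate the duplication described above; application code normally never calls it directly):

import io.vertx.core.Vertx;
import io.vertx.core.impl.ContextInternal;

public class DuplicateDemo {
  public static void main(String[] args) {
    Vertx vertx = Vertx.vertx();
    ContextInternal context = (ContextInternal) vertx.getOrCreateContext();
    // The duplicate runs on the same event loop as the original context but
    // carries its own local data, which is how per-request tracing state is kept.
    ContextInternal duplicate = context.duplicate();
    System.out.println(duplicate.nettyEventLoop() == context.nettyEventLoop());
    vertx.close();
  }
}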

The Loom context can be seen as a hybrid of the event-loop and worker contexts:

  • context tasks are scheduled on a single carrier thread
    • this ensures that a single task is executed at a time
    • state is only ever accessed from a single platform thread
  • a task can block its virtual thread, relying on virtual thread suspend/resume
  • when a task is suspended, other tasks can be executed
    • this is similar to an event-loop context task scheduling a callback on a future
    • this differs from a worker context, where a blocked task prevents other tasks from running (allowing them would mean two tasks executing concurrently)

This design uses a virtual thread pool for a context, backed by a single carrier thread.
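
The public JDK API does not let an application pin virtual threads to a chosen carrier thread, so the following is only a conceptual sketch of the scheduling rules above: a single permit stands in for the single carrier thread, a task holds the permit while it runs, and awaiting releases it so another context task may execute. All names are illustrative and none of this is Vert.x or Loom API.

import java.util.concurrent.CompletableFuture;
import java.util.concurrent.Semaphore;

public class LoomContextModel {

  // At most one task "mounted" at a time, standing in for the single carrier thread.
  private final Semaphore mounted = new Semaphore(1);

  // Schedule a task on the context: it runs on its own virtual thread but only
  // while holding the permit.
  public void execute(Runnable task) {
    Thread.ofVirtual().start(() -> {
      mounted.acquireUninterruptibly();
      try {
        task.run();
      } finally {
        mounted.release();
      }
    });
  }

  // Await a result from within a context task: release the permit while the
  // virtual thread is suspended, then re-acquire it before resuming the task.
  public <T> T await(CompletableFuture<T> future) {
    mounted.release();
    try {
      return future.join();
    } finally {
      mounted.acquireUninterruptibly();
    }
  }
}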

Required changes in Vert.x

The Loom context implementation requires a few changes in Vert.x.

Context thread

So far Vert.x has assumed that the execution thread of a Vert.x context extends VertxThread. This assumption no longer holds, since a virtual thread extends Thread directly. VertxThread is a convenience that provides faster thread-local access for Vert.x and also for Netty (it extends FastThreadLocalThread). The change can be achieved quite easily by using a ThreadLocal for virtual threads: when the current thread is a VertxThread we are in the event-loop or worker model; otherwise a ThreadLocal storage can be used.
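
A rough sketch of that lookup could be the following; VertxThread is the real Vert.x class, while the field and method names are illustrative, and the VertxThread branch is elided because it relies on Vert.x internals.

import io.vertx.core.Context;
import io.vertx.core.impl.VertxThread;

final class ContextBinding {

  // Fallback storage for threads that do not extend VertxThread, e.g. virtual threads.
  private static final ThreadLocal<Context> CURRENT = new ThreadLocal<>();

  static void associate(Context context) {
    // Virtual threads (and any foreign thread) keep the context in the ThreadLocal.
    if (!(Thread.currentThread() instanceof VertxThread)) {
      CURRENT.set(context);
    }
  }

  static Context current() {
    if (Thread.currentThread() instanceof VertxThread) {
      // Event-loop or worker model: the existing VertxThread-based lookup applies
      // (not reproduced here, it relies on Vert.x internals).
      return null;
    }
    return CURRENT.get();
  }
}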

Future awaitability

Vert.x delivers asynchronous results as futures, so a synchronous interaction is needed to suspend the virtual thread until the future is completed:

String result = await(future); // Or throw an exception

The current future implementation delivers the result on the context thread, so the future cannot signal the current thread through the context. We therefore need a way to listen to a future result without going through the context; an existing future listener mechanism can be extended to support this.
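
A minimal sketch of such an await helper, assuming the caller runs on a virtual thread and the future is completed from another thread (typically the event loop); Future.onComplete is the real Vert.x API, the rest is illustrative and ignores the dispatch caveat described above.

import io.vertx.core.Future;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CompletionException;

public final class Awaits {

  public static <T> T await(Future<T> future) {
    CompletableFuture<T> latch = new CompletableFuture<>();
    // Listen for the result and complete the latch; only the virtual thread
    // blocks on join(), its carrier stays free.
    future.onComplete(ar -> {
      if (ar.succeeded()) {
        latch.complete(ar.result());
      } else {
        latch.completeExceptionally(ar.cause());
      }
    });
    try {
      return latch.join();
    } catch (CompletionException e) {
      // Re-throw the original failure, matching "or throw an exception" above.
      Throwable cause = e.getCause();
      throw cause instanceof RuntimeException ? (RuntimeException) cause : new RuntimeException(cause);
    }
  }
}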

Suspending and resuming from Vert.x APIs

Consuming Vert.x APIs from a virtual thread requires coordinating work with the asynchronous operation. Since a context executes at most one task at a time, resuming the virtual thread cannot be achieved from the virtual thread itself.
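
As a standalone illustration of that constraint, the snippet below uses only the public JDK and Vert.x APIs (it is not the proposed integration): the virtual thread suspends on join(), and the resume signal comes from the event loop rather than from the suspended thread itself.

import io.vertx.core.Vertx;
import java.util.concurrent.CompletableFuture;

public class SuspendResumeDemo {
  public static void main(String[] args) throws InterruptedException {
    Vertx vertx = Vertx.vertx();
    CompletableFuture<String> signal = new CompletableFuture<>();

    Thread vt = Thread.ofVirtual().start(() -> {
      // Suspends this virtual thread until the signal is completed elsewhere.
      System.out.println("resumed with: " + signal.join());
    });

    // The resume comes from another thread: the event loop running the timer.
    vertx.setTimer(100, id -> signal.complete("timer fired"));

    vt.join();
    vertx.close();
  }
}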
