# 2023 WebKit and Linux graphics

Presentation content:

## Current state

### ANGLE

Done (dmabuf, with a fallback path)

### GPUProcess

In progress (WebGL)
TODO: 2D painting, Media playback

### Media playback

DMABufs work (video sink)

## Plans

* WebGL, WebGPU, Media playback: GPUProcess
    * Executed on/via DMABufs&Friends (see the export sketch after this list)
    * Then taken to the point of composition
* 2D painting: GPUProcess
    * Can be handled through DMABufs&Friends
    * Will have to be when GPU-accelerated
* Composition: UIProcess
    * Funnel for all the content
    * Composition of the gathered resources
    * Direct access to the layer tree => apply scrolling and gestures where the events originate
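
As a rough illustration of what "executed on/via DMABufs&Friends, then taken to the point of composition" can look like, here's a minimal sketch of exporting a rendered EGLImage as a dmabuf file descriptor that another process (e.g. the UIProcess compositor) could import. The `EGL_MESA_image_dma_buf_export` entry points are real; the surrounding struct, the single-plane assumption, and the elided fd transfer over IPC are illustrative, not WebKit code.

```cpp
// Sketch: exporting a rendered EGLImage as a dmabuf so another process
// (here, the UIProcess compositor) can import it. Assumes the
// EGL_MESA_image_dma_buf_export extension; the fd transfer over IPC
// (e.g. a Unix socket with SCM_RIGHTS) is elided.
#include <EGL/egl.h>
#include <EGL/eglext.h>

struct ExportedBuffer {
    int fd { -1 };            // dmabuf file descriptor, sendable over IPC
    int fourcc { 0 };         // pixel format as a DRM fourcc code
    EGLint stride { 0 };
    EGLint offset { 0 };
    EGLuint64KHR modifier { 0 };
};

static bool exportImageAsDMABuf(EGLDisplay display, EGLImageKHR image, ExportedBuffer& out)
{
    auto queryImage = reinterpret_cast<PFNEGLEXPORTDMABUFIMAGEQUERYMESAPROC>(
        eglGetProcAddress("eglExportDMABUFImageQueryMESA"));
    auto exportImage = reinterpret_cast<PFNEGLEXPORTDMABUFIMAGEMESAPROC>(
        eglGetProcAddress("eglExportDMABUFImageMESA"));
    if (!queryImage || !exportImage)
        return false;

    int planeCount = 0;
    if (!queryImage(display, image, &out.fourcc, &planeCount, &out.modifier))
        return false;
    // Multi-planar formats need arrays of fds/strides/offsets; this sketch
    // only handles the single-plane case.
    if (planeCount != 1)
        return false;
    return exportImage(display, image, &out.fd, &out.stride, &out.offset);
}
```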

## New topic: GPUProcess

Niko: (Question about GPUProcess status?)

Zan:

WebGL:
- There's a prototype for WebGL
- ANGLE is isolated to the GPUProcess
- WebGL calls are translated into IPC calls forwarded to the GPUProcess (see the sketch below)
- DMA-Buf will also be used here
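
To make the "translated into IPC calls" point concrete, here's a hypothetical sketch of the proxy pattern: the WebProcess side serializes each GL call into a message, and a GPUProcess-side backend owning the real ANGLE context would decode and replay it. None of the names below are WebKit's actual classes.

```cpp
// Hypothetical sketch of proxying WebGL calls over IPC (names are
// illustrative, not WebKit's). The WebProcess side records each call as a
// serialized message; the GPUProcess side, owning the real ANGLE context,
// would decode and replay it.
#include <cstdint>
#include <utility>
#include <vector>

enum class GLCommand : uint8_t { ClearColor, Clear, DrawArrays /* ... */ };

struct GLMessage {
    GLCommand command;
    std::vector<uint8_t> arguments; // tightly packed call arguments
};

class RemoteWebGLProxy { // WebProcess side
public:
    void clearColor(float r, float g, float b, float a)
    {
        GLMessage message { GLCommand::ClearColor, {} };
        for (float value : { r, g, b, a })
            appendBytes(message.arguments, value);
        send(std::move(message));
    }

private:
    template<typename T>
    static void appendBytes(std::vector<uint8_t>& bytes, T value)
    {
        const auto* raw = reinterpret_cast<const uint8_t*>(&value);
        bytes.insert(bytes.end(), raw, raw + sizeof(T));
    }

    void send(GLMessage&&)
    {
        // Would hand the message to the IPC connection to the GPUProcess.
    }
};
```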

2D painting: (missed)

Media playback:

- Requires more refactoring in the media player
- We currently assume the media content is fed directly into the compositing stage
- We have a prototype video sink. If the sink outputs dmabufs, we bring these to the composition step (see the sketch below)
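
A minimal sketch of that dmabuf extraction, using appsink as a stand-in for the prototype video sink. `gst_is_dmabuf_memory()` and `gst_dmabuf_memory_get_fd()` are real GStreamer APIs (from gstreamer-allocators-1.0); the compositor hand-off is a hypothetical stub.

```cpp
// Sketch: pulling a decoded frame from the sink and extracting its dmabuf
// fd so it can be handed to the composition step. The dmabuf allocator
// calls are real GStreamer API (gstreamer-allocators-1.0); the pipeline
// setup and the compositor hand-off are illustrative.
#include <gst/gst.h>
#include <gst/app/gstappsink.h>
#include <gst/allocators/gstdmabuf.h>

static void submit_to_compositor(int dmabuf_fd, GstBuffer* buffer)
{
    // Hypothetical hand-off; per the plans above, this fd would cross IPC
    // into the UIProcess compositor. Note the fd stays owned by the
    // GstMemory, so it must be dup()'ed before crossing a process boundary.
    (void)dmabuf_fd;
    (void)buffer;
}

static GstFlowReturn on_new_sample(GstAppSink* sink, gpointer)
{
    GstSample* sample = gst_app_sink_pull_sample(sink);
    if (!sample)
        return GST_FLOW_ERROR;

    GstBuffer* buffer = gst_sample_get_buffer(sample);
    GstMemory* memory = gst_buffer_peek_memory(buffer, 0);
    if (gst_is_dmabuf_memory(memory))
        submit_to_compositor(gst_dmabuf_memory_get_fd(memory), buffer);

    gst_sample_unref(sample);
    return GST_FLOW_OK;
}
```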

Alex: A question for the other people in the session: should we (WebKit?) move forward with the buffer-sharing architecture? Were there some concerns about this that we're missing?

Youenn(?): At Apple, we're using IOSurfaces(?), likely similar to dmabufs. In the GPUProcess we're also doing some media decoding and camera encoding; that's something you might need to keep in mind too.

Enrique: It's my understanding that the GStreamer code should live in the GPUProcess (including the SourceBuffer preprocessing/demuxing done in the AppendPipeline). But, in the context of the MSE SourceBuffer API, we'll still need some sort of communication between the MSE API living in the WebProcess (used to feed MSE buffers and to query buffering stats) and the AppendPipeline and the "half of the implementation of SourceBuffer" apparently living in the GPUProcess. How is that going to be achieved?

Zan & Youenn: The multiplatform SourceBuffer code has already been split in two halves, one living in the WebProcess and the other in the GPUProcess, so standard multiplatform IPC is going to be used to coordinate them. There shouldn't be much hassle from the WPE/GStreamer point of view; all GStreamer code would just live in the GPUProcess.
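
A hypothetical shape of that two-halves pairing, for orientation only (class and method names are illustrative, not WebKit's):

```cpp
// Hypothetical sketch of the two-halves SourceBuffer split: the WebProcess
// half exposes the MSE-facing API and forwards data over IPC; the
// GPUProcess half would own the GStreamer AppendPipeline.
#include <cstdint>
#include <functional>
#include <utility>
#include <vector>

struct BufferedRange { double start { 0 }; double end { 0 }; };

// WebProcess side: what the multiplatform MSE code calls into.
class SourceBufferProxy {
public:
    void appendData(std::vector<uint8_t>&& data)
    {
        // Would serialize the bytes into an IPC message for the GPUProcess;
        // stubbed as a local queue here.
        m_outgoing.emplace_back(std::move(data));
    }

    void requestBufferedRanges(std::function<void(std::vector<BufferedRange>)>&& reply)
    {
        // Would send a query message; the GPUProcess answers asynchronously
        // with the ranges buffered by the AppendPipeline.
        m_pendingReply = std::move(reply);
    }

private:
    std::vector<std::vector<uint8_t>> m_outgoing;
    std::function<void(std::vector<BufferedRange>)> m_pendingReply;
};

// GPUProcess side: receives those messages and drives the demuxer.
class RemoteSourceBuffer {
public:
    void didReceiveAppendData(std::vector<uint8_t>&& data)
    {
        // Would feed the bytes into the GStreamer AppendPipeline and report
        // parsed samples / updated buffered ranges back over IPC.
        m_pending.insert(m_pending.end(), data.begin(), data.end());
    }

private:
    std::vector<uint8_t> m_pending;
};
```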

## New topic: UIProcess compositing

Zan explains the goals, mainly improving event handling such as gestures and scrolling. He mentions there will be a more complete blog post about these plans(?) in the coming weeks.

Calvaris: In the context of multimedia we have plenty of things on the roadmap, and regarding the GPUProcess, covering everything would require a large amount of work, so we'd really need to coordinate that effort.