RemoteSampleBufferDisplayLayer should process its IPC messages from a background thread

https://bugs.webkit.org/show_bug.cgi?id=216475

Reviewed by Eric Carlson.

Use a process-wide work queue to process all video track renderer messages on a background thread.
This queue is shared by all RemoteSampleBufferDisplayLayerManager and RemoteSampleBufferDisplayLayer objects, which are created per web process.
Since LocalSampleBufferDisplayLayer must currently be created on the main thread, hop to the main thread to create and delete renderers.
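
A condensed sketch of the resulting flow, lifted from the GPUProcess and RemoteSampleBufferDisplayLayerManager changes further down (#if guards, assertions and error handling elided, so treat it as illustrative rather than the exact patch):

    // GPUProcess.cpp: one serial queue per GPU process, shared by every
    // RemoteSampleBufferDisplayLayerManager and RemoteSampleBufferDisplayLayer.
    WorkQueue& GPUProcess::videoMediaStreamTrackRendererQueue()
    {
        if (!m_videoMediaStreamTrackRendererQueue)
            m_videoMediaStreamTrackRendererQueue = WorkQueue::create("RemoteVideoMediaStreamTrackRenderer", WorkQueue::Type::Serial, WorkQueue::QOS::UserInitiated);
        return *m_videoMediaStreamTrackRendererQueue;
    }

    // RemoteSampleBufferDisplayLayerManager.cpp: IPC is handled on that queue, but
    // LocalSampleBufferDisplayLayer can only be created on the main thread, so
    // createLayer hops to the main thread and back before registering the layer.
    void RemoteSampleBufferDisplayLayerManager::createLayer(SampleBufferDisplayLayerIdentifier identifier, bool hideRootLayer, WebCore::IntSize size, LayerCreationCallback&& callback)
    {
        callOnMainThread([this, protectedThis = makeRef(*this), identifier, hideRootLayer, size, callback = WTFMove(callback)]() mutable {
            auto layer = RemoteSampleBufferDisplayLayer::create(identifier, m_connection.copyRef());
            auto& layerReference = *layer;
            layerReference.initialize(hideRootLayer, size, [this, protectedThis = makeRef(*this), callback = WTFMove(callback), identifier, layer = WTFMove(layer)](auto layerId) mutable {
                dispatchToThread([this, protectedThis = WTFMove(protectedThis), callback = WTFMove(callback), identifier, layer = WTFMove(layer), layerId = WTFMove(layerId)]() mutable {
                    m_layers.add(identifier, WTFMove(layer));
                    callback(WTFMove(layerId));
                });
            });
        });
    }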

We update messages.py to support WantsAsyncDispatchMessage, which behaves like WantsDispatchMessage minus the synchronous-message handling.
This allows RemoteSampleBufferDisplayLayerManager to also handle messages destined for its RemoteSampleBufferDisplayLayer objects.
Update RemoteSampleBufferDisplayLayerManager accordingly; RemoteAudioMediaStreamTrackRendererManager adopts the new attribute as well.
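
In practice, a receiver tagged WantsAsyncDispatchMessage gets a generated didReceive*Message() that falls through to a hand-written dispatchMessage() for anything it does not decode itself; the receiver uses that hook to route per-object messages by destination ID. A rough sketch, condensed from the RemoteAudioMediaStreamTrackRendererManager changes below (the first function paraphrases the messages.py output, it is not the literal generated file):

    // Shape of the generated receiver for a WantsAsyncDispatchMessage class:
    void RemoteAudioMediaStreamTrackRendererManager::didReceiveMessage(IPC::Connection& connection, IPC::Decoder& decoder)
    {
        if (decoder.messageName() == Messages::RemoteAudioMediaStreamTrackRendererManager::CreateRenderer::name()) {
            IPC::handleMessage<Messages::RemoteAudioMediaStreamTrackRendererManager::CreateRenderer>(decoder, this, &RemoteAudioMediaStreamTrackRendererManager::createRenderer);
            return;
        }
        if (decoder.messageName() == Messages::RemoteAudioMediaStreamTrackRendererManager::ReleaseRenderer::name()) {
            IPC::handleMessage<Messages::RemoteAudioMediaStreamTrackRendererManager::ReleaseRenderer>(decoder, this, &RemoteAudioMediaStreamTrackRendererManager::releaseRenderer);
            return;
        }
        // Anything else (e.g. messages addressed to an individual renderer) goes to
        // the hand-written hook.
        if (dispatchMessage(connection, decoder))
            return;
        ASSERT_NOT_REACHED();
    }

    // Hand-written hook routing per-renderer messages by destination ID
    // (taken almost verbatim from the diff below):
    bool RemoteAudioMediaStreamTrackRendererManager::dispatchMessage(IPC::Connection& connection, IPC::Decoder& decoder)
    {
        if (!decoder.destinationID())
            return false;

        auto identifier = makeObjectIdentifier<AudioMediaStreamTrackRendererIdentifierType>(decoder.destinationID());
        if (auto* renderer = m_renderers.get(identifier))
            renderer->didReceiveMessage(connection, decoder);
        return true;
    }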

Covered by webrtc/video.html with GPU process enabled.

* GPUProcess/GPUConnectionToWebProcess.cpp:
(WebKit::GPUConnectionToWebProcess::GPUConnectionToWebProcess):
(WebKit::GPUConnectionToWebProcess::dispatchMessage):
* GPUProcess/GPUConnectionToWebProcess.h:
* GPUProcess/GPUProcess.cpp:
(WebKit::GPUProcess::videoMediaStreamTrackRendererQueue):
* GPUProcess/GPUProcess.h:
* GPUProcess/webrtc/RemoteAudioMediaStreamTrackRendererManager.cpp:
(WebKit::RemoteAudioMediaStreamTrackRendererManager::dispatchMessage):
* GPUProcess/webrtc/RemoteAudioMediaStreamTrackRendererManager.h:
* GPUProcess/webrtc/RemoteAudioMediaStreamTrackRendererManager.messages.in:
* GPUProcess/webrtc/RemoteSampleBufferDisplayLayerManager.cpp:
(WebKit::RemoteSampleBufferDisplayLayerManager::RemoteSampleBufferDisplayLayerManager):
(WebKit::RemoteSampleBufferDisplayLayerManager::~RemoteSampleBufferDisplayLayerManager):
(WebKit::RemoteSampleBufferDisplayLayerManager::close):
(WebKit::RemoteSampleBufferDisplayLayerManager::dispatchToThread):
(WebKit::RemoteSampleBufferDisplayLayerManager::dispatchMessage):
(WebKit::RemoteSampleBufferDisplayLayerManager::createLayer):
(WebKit::RemoteSampleBufferDisplayLayerManager::releaseLayer):
* GPUProcess/webrtc/RemoteSampleBufferDisplayLayerManager.h:
* GPUProcess/webrtc/RemoteSampleBufferDisplayLayerManager.messages.in:
* Scripts/webkit/messages.py:
* SourcesCocoa.txt:


Canonical link: https://commits.webkit.org/229432@main
git-svn-id: https://svn.webkit.org/repository/webkit/trunk@267156 268f45cc-cd09-0410-ab3c-d52691b4dbfc
youennf committed Sep 16, 2020
1 parent a581e93 commit efa3774b60f7466dab1ecd9ab42f9202b6f809e0
@@ -1,3 +1,44 @@
2020-09-16 Youenn Fablet <youenn@apple.com>

RemoteSampleBufferDisplayLayer should process its IPC messages from a background thread
https://bugs.webkit.org/show_bug.cgi?id=216475

Reviewed by Eric Carlson.

Use a process-wide work queue to process all video track renderer messages on a background thread.
This queue is shared by all RemoteSampleBufferDisplayLayerManager and RemoteSampleBufferDisplayLayer objects, which are created per web process.
Since LocalSampleBufferDisplayLayer must currently be created on the main thread, hop to the main thread to create and delete renderers.

We update messages.py to support WantsAsyncDispatchMessage, which behaves like WantsDispatchMessage minus the synchronous-message handling.
This allows RemoteSampleBufferDisplayLayerManager to also handle messages destined for its RemoteSampleBufferDisplayLayer objects.
Update RemoteSampleBufferDisplayLayerManager accordingly; RemoteAudioMediaStreamTrackRendererManager adopts the new attribute as well.

Covered by webrtc/video.html with GPU process enabled.

* GPUProcess/GPUConnectionToWebProcess.cpp:
(WebKit::GPUConnectionToWebProcess::GPUConnectionToWebProcess):
(WebKit::GPUConnectionToWebProcess::dispatchMessage):
* GPUProcess/GPUConnectionToWebProcess.h:
* GPUProcess/GPUProcess.cpp:
(WebKit::GPUProcess::videoMediaStreamTrackRendererQueue):
* GPUProcess/GPUProcess.h:
* GPUProcess/webrtc/RemoteAudioMediaStreamTrackRendererManager.cpp:
(WebKit::RemoteAudioMediaStreamTrackRendererManager::dispatchMessage):
* GPUProcess/webrtc/RemoteAudioMediaStreamTrackRendererManager.h:
* GPUProcess/webrtc/RemoteAudioMediaStreamTrackRendererManager.messages.in:
* GPUProcess/webrtc/RemoteSampleBufferDisplayLayerManager.cpp:
(WebKit::RemoteSampleBufferDisplayLayerManager::RemoteSampleBufferDisplayLayerManager):
(WebKit::RemoteSampleBufferDisplayLayerManager::~RemoteSampleBufferDisplayLayerManager):
(WebKit::RemoteSampleBufferDisplayLayerManager::close):
(WebKit::RemoteSampleBufferDisplayLayerManager::dispatchToThread):
(WebKit::RemoteSampleBufferDisplayLayerManager::dispatchMessage):
(WebKit::RemoteSampleBufferDisplayLayerManager::createLayer):
(WebKit::RemoteSampleBufferDisplayLayerManager::releaseLayer):
* GPUProcess/webrtc/RemoteSampleBufferDisplayLayerManager.h:
* GPUProcess/webrtc/RemoteSampleBufferDisplayLayerManager.messages.in:
* Scripts/webkit/messages.py:
* SourcesCocoa.txt:

2020-09-16 Alex Christensen <achristensen@webkit.org>

Move TLS certificate bypass SPI from WebProcessPool to WebsiteDataStore
@@ -159,6 +159,7 @@ GPUConnectionToWebProcess::GPUConnectionToWebProcess(GPUProcess& gpuProcess, Web
#endif
#if PLATFORM(COCOA) && ENABLE(MEDIA_STREAM)
, m_audioTrackRendererManager(RemoteAudioMediaStreamTrackRendererManager::create(*this))
, m_sampleBufferDisplayLayerManager(RemoteSampleBufferDisplayLayerManager::create(*this))
#endif
{
RELEASE_ASSERT(RunLoop::isMain());
@@ -173,6 +174,7 @@ GPUConnectionToWebProcess::~GPUConnectionToWebProcess()

#if PLATFORM(COCOA) && ENABLE(MEDIA_STREAM)
m_audioTrackRendererManager->close();
m_sampleBufferDisplayLayerManager->close();
#endif
#if PLATFORM(COCOA) && USE(LIBWEBRTC)
m_libWebRTCCodecsProxy->close();
@@ -249,14 +251,6 @@ RemoteMediaRecorderManager& GPUConnectionToWebProcess::mediaRecorderManager()
return *m_remoteMediaRecorderManager;
}
#endif

RemoteSampleBufferDisplayLayerManager& GPUConnectionToWebProcess::sampleBufferDisplayLayerManager()
{
if (!m_sampleBufferDisplayLayerManager)
m_sampleBufferDisplayLayerManager = makeUnique<RemoteSampleBufferDisplayLayerManager>(m_connection.copyRef());

return *m_sampleBufferDisplayLayerManager;
}
#endif // PLATFORM(COCOA) && ENABLE(MEDIA_STREAM)

#if ENABLE(ENCRYPTED_MEDIA)
@@ -375,14 +369,6 @@ bool GPUConnectionToWebProcess::dispatchMessage(IPC::Connection& connection, IPC
return true;
}
#endif // HAVE(AVASSETWRITERDELEGATE)
if (decoder.messageReceiverName() == Messages::RemoteSampleBufferDisplayLayerManager::messageReceiverName()) {
sampleBufferDisplayLayerManager().didReceiveMessageFromWebProcess(connection, decoder);
return true;
}
if (decoder.messageReceiverName() == Messages::RemoteSampleBufferDisplayLayer::messageReceiverName()) {
sampleBufferDisplayLayerManager().didReceiveLayerMessage(connection, decoder);
return true;
}
#endif // PLATFORM(COCOA) && ENABLE(MEDIA_STREAM)
#if ENABLE(ENCRYPTED_MEDIA)
if (decoder.messageReceiverName() == Messages::RemoteCDMFactoryProxy::messageReceiverName()) {
@@ -117,7 +117,6 @@ class GPUConnectionToWebProcess
#if HAVE(AVASSETWRITERDELEGATE)
RemoteMediaRecorderManager& mediaRecorderManager();
#endif
RemoteSampleBufferDisplayLayerManager& sampleBufferDisplayLayerManager();
#endif

#if ENABLE(GPU_PROCESS) && USE(AUDIO_SESSION)
@@ -173,10 +172,10 @@ class GPUConnectionToWebProcess
#if PLATFORM(COCOA) && ENABLE(MEDIA_STREAM)
std::unique_ptr<UserMediaCaptureManagerProxy> m_userMediaCaptureManagerProxy;
Ref<RemoteAudioMediaStreamTrackRendererManager> m_audioTrackRendererManager;
Ref<RemoteSampleBufferDisplayLayerManager> m_sampleBufferDisplayLayerManager;
#if HAVE(AVASSETWRITERDELEGATE)
std::unique_ptr<RemoteMediaRecorderManager> m_remoteMediaRecorderManager;
#endif
std::unique_ptr<RemoteSampleBufferDisplayLayerManager> m_sampleBufferDisplayLayerManager;
#endif
#if ENABLE(MEDIA_STREAM)
bool m_allowsAudioCapture { false };
@@ -268,6 +268,12 @@ WorkQueue& GPUProcess::audioMediaStreamTrackRendererQueue()
m_audioMediaStreamTrackRendererQueue = WorkQueue::create("RemoteAudioMediaStreamTrackRenderer", WorkQueue::Type::Serial, WorkQueue::QOS::UserInteractive);
return *m_audioMediaStreamTrackRendererQueue;
}
WorkQueue& GPUProcess::videoMediaStreamTrackRendererQueue()
{
if (!m_videoMediaStreamTrackRendererQueue)
m_videoMediaStreamTrackRendererQueue = WorkQueue::create("RemoteVideoMediaStreamTrackRenderer", WorkQueue::Type::Serial, WorkQueue::QOS::UserInitiated);
return *m_videoMediaStreamTrackRendererQueue;
}
#endif

#if USE(LIBWEBRTC) && PLATFORM(COCOA)
@@ -77,6 +77,7 @@ class GPUProcess : public AuxiliaryProcess, public ThreadSafeRefCounted<GPUProce

#if ENABLE(MEDIA_STREAM) && PLATFORM(COCOA)
WorkQueue& audioMediaStreamTrackRendererQueue();
WorkQueue& videoMediaStreamTrackRendererQueue();
#endif
#if USE(LIBWEBRTC) && PLATFORM(COCOA)
WorkQueue& libWebRTCCodecsQueue();
@@ -123,8 +124,9 @@ class GPUProcess : public AuxiliaryProcess, public ThreadSafeRefCounted<GPUProce
bool allowDisplayCapture { false };
};
HashMap<WebCore::ProcessIdentifier, MediaCaptureAccess> m_mediaCaptureAccessMap;
#if PLATFORM(COCOA)
#if ENABLE(MEDIA_STREAM) && PLATFORM(COCOA)
RefPtr<WorkQueue> m_audioMediaStreamTrackRendererQueue;
RefPtr<WorkQueue> m_videoMediaStreamTrackRendererQueue;
#endif
#endif
#if USE(LIBWEBRTC) && PLATFORM(COCOA)
@@ -63,21 +63,15 @@ void RemoteAudioMediaStreamTrackRendererManager::dispatchToThread(Function<void(
m_queue->dispatch(WTFMove(callback));
}

void RemoteAudioMediaStreamTrackRendererManager::didReceiveMessage(IPC::Connection& connection, IPC::Decoder& decoder)
bool RemoteAudioMediaStreamTrackRendererManager::dispatchMessage(IPC::Connection& connection, IPC::Decoder& decoder)
{
if (!decoder.destinationID()) {
if (decoder.messageName() == Messages::RemoteAudioMediaStreamTrackRendererManager::CreateRenderer::name()) {
IPC::handleMessage<Messages::RemoteAudioMediaStreamTrackRendererManager::CreateRenderer>(decoder, this, &RemoteAudioMediaStreamTrackRendererManager::createRenderer);
return;
}
if (decoder.messageName() == Messages::RemoteAudioMediaStreamTrackRendererManager::ReleaseRenderer::name()) {
IPC::handleMessage<Messages::RemoteAudioMediaStreamTrackRendererManager::ReleaseRenderer>(decoder, this, &RemoteAudioMediaStreamTrackRendererManager::releaseRenderer);
return;
}
return;
}
if (auto* renderer = m_renderers.get(makeObjectIdentifier<AudioMediaStreamTrackRendererIdentifierType>(decoder.destinationID())))
if (!decoder.destinationID())
return false;

auto identifier = makeObjectIdentifier<AudioMediaStreamTrackRendererIdentifierType>(decoder.destinationID());
if (auto* renderer = m_renderers.get(identifier))
renderer->didReceiveMessage(connection, decoder);
return true;
}

void RemoteAudioMediaStreamTrackRendererManager::createRenderer(AudioMediaStreamTrackRendererIdentifier identifier)
@@ -63,6 +63,7 @@ class RemoteAudioMediaStreamTrackRendererManager final : public IPC::Connection:

// IPC::MessageReceiver
void didReceiveMessage(IPC::Connection&, IPC::Decoder&) final;
bool dispatchMessage(IPC::Connection&, IPC::Decoder&);
void createRenderer(AudioMediaStreamTrackRendererIdentifier);
void releaseRenderer(AudioMediaStreamTrackRendererIdentifier);

@@ -23,7 +23,7 @@

#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)

messages -> RemoteAudioMediaStreamTrackRendererManager NotRefCounted {
messages -> RemoteAudioMediaStreamTrackRendererManager WantsAsyncDispatchMessage {
CreateRenderer(WebKit::AudioMediaStreamTrackRendererIdentifier id)
ReleaseRenderer(WebKit::AudioMediaStreamTrackRendererIdentifier id)
}
@@ -29,36 +29,75 @@
#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)

#include "Decoder.h"
#include "GPUConnectionToWebProcess.h"
#include "RemoteSampleBufferDisplayLayer.h"
#include "RemoteSampleBufferDisplayLayerManagerMessages.h"
#include "RemoteSampleBufferDisplayLayerMessages.h"
#include <WebCore/IntSize.h>

namespace WebKit {

RemoteSampleBufferDisplayLayerManager::RemoteSampleBufferDisplayLayerManager(Ref<IPC::Connection>&& connection)
: m_connection(WTFMove(connection))
RemoteSampleBufferDisplayLayerManager::RemoteSampleBufferDisplayLayerManager(GPUConnectionToWebProcess& gpuConnectionToWebProcess)
: m_connectionToWebProcess(gpuConnectionToWebProcess)
, m_connection(gpuConnectionToWebProcess.connection())
, m_queue(gpuConnectionToWebProcess.gpuProcess().videoMediaStreamTrackRendererQueue())
{
m_connectionToWebProcess.connection().addThreadMessageReceiver(Messages::RemoteSampleBufferDisplayLayer::messageReceiverName(), this);
m_connectionToWebProcess.connection().addThreadMessageReceiver(Messages::RemoteSampleBufferDisplayLayerManager::messageReceiverName(), this);
}

RemoteSampleBufferDisplayLayerManager::~RemoteSampleBufferDisplayLayerManager() = default;
RemoteSampleBufferDisplayLayerManager::~RemoteSampleBufferDisplayLayerManager()
{
m_connectionToWebProcess.connection().removeThreadMessageReceiver(Messages::RemoteSampleBufferDisplayLayer::messageReceiverName());
m_connectionToWebProcess.connection().removeThreadMessageReceiver(Messages::RemoteSampleBufferDisplayLayerManager::messageReceiverName());
}

void RemoteSampleBufferDisplayLayerManager::close()
{
dispatchToThread([this, protectedThis = makeRef(*this)] {
callOnMainThread([layers = WTFMove(m_layers)] { });
});
}

void RemoteSampleBufferDisplayLayerManager::didReceiveLayerMessage(IPC::Connection& connection, IPC::Decoder& decoder)
void RemoteSampleBufferDisplayLayerManager::dispatchToThread(Function<void()>&& callback)
{
if (auto* layer = m_layers.get(makeObjectIdentifier<SampleBufferDisplayLayerIdentifierType>(decoder.destinationID())))
m_queue->dispatch(WTFMove(callback));
}

bool RemoteSampleBufferDisplayLayerManager::dispatchMessage(IPC::Connection& connection, IPC::Decoder& decoder)
{
if (!decoder.destinationID())
return false;

auto identifier = makeObjectIdentifier<SampleBufferDisplayLayerIdentifierType>(decoder.destinationID());
if (auto* layer = m_layers.get(identifier))
layer->didReceiveMessage(connection, decoder);
return true;
}

void RemoteSampleBufferDisplayLayerManager::createLayer(SampleBufferDisplayLayerIdentifier identifier, bool hideRootLayer, WebCore::IntSize size, LayerCreationCallback&& callback)
{
ASSERT(!m_layers.contains(identifier));
auto layer = RemoteSampleBufferDisplayLayer::create(identifier, m_connection.copyRef());
layer->initialize(hideRootLayer, size, WTFMove(callback));
m_layers.add(identifier, WTFMove(layer));
callOnMainThread([this, protectedThis = makeRef(*this), identifier, hideRootLayer, size, callback = WTFMove(callback)]() mutable {
auto layer = RemoteSampleBufferDisplayLayer::create(identifier, m_connection.copyRef());
auto& layerReference = *layer;
layerReference.initialize(hideRootLayer, size, [this, protectedThis = makeRef(*this), callback = WTFMove(callback), identifier, layer = WTFMove(layer)](auto layerId) mutable {
dispatchToThread([this, protectedThis = WTFMove(protectedThis), callback = WTFMove(callback), identifier, layer = WTFMove(layer), layerId = WTFMove(layerId)]() mutable {
ASSERT(!m_layers.contains(identifier));
m_layers.add(identifier, WTFMove(layer));
callback(WTFMove(layerId));
});
});
});
}

void RemoteSampleBufferDisplayLayerManager::releaseLayer(SampleBufferDisplayLayerIdentifier identifier)
{
ASSERT(m_layers.contains(identifier));
m_layers.remove(identifier);
callOnMainThread([this, protectedThis = makeRef(*this), identifier]() mutable {
dispatchToThread([this, protectedThis = WTFMove(protectedThis), identifier] {
ASSERT(m_layers.contains(identifier));
callOnMainThread([layer = m_layers.take(identifier)] { });
});
});
}

}
@@ -27,14 +27,13 @@

#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)

#include "MessageReceiver.h"
#include "Connection.h"
#include "RemoteSampleBufferDisplayLayerManagerMessagesReplies.h"
#include "SampleBufferDisplayLayerIdentifier.h"
#include <WebCore/IntSize.h>
#include <wtf/HashMap.h>

namespace IPC {
class Connection;
class Decoder;
}

@@ -44,26 +43,34 @@ class IntSize;

namespace WebKit {

class GPUConnectionToWebProcess;
class RemoteSampleBufferDisplayLayer;

class RemoteSampleBufferDisplayLayerManager final : private IPC::MessageReceiver {
class RemoteSampleBufferDisplayLayerManager final : public IPC::Connection::ThreadMessageReceiver {
WTF_MAKE_FAST_ALLOCATED;
public:
explicit RemoteSampleBufferDisplayLayerManager(Ref<IPC::Connection>&&);
static Ref<RemoteSampleBufferDisplayLayerManager> create(GPUConnectionToWebProcess& connection) { return adoptRef(*new RemoteSampleBufferDisplayLayerManager(connection)); }
~RemoteSampleBufferDisplayLayerManager();

void didReceiveLayerMessage(IPC::Connection&, IPC::Decoder&);
void didReceiveMessageFromWebProcess(IPC::Connection& connection, IPC::Decoder& decoder) { didReceiveMessage(connection, decoder); }
void close();

private:
explicit RemoteSampleBufferDisplayLayerManager(GPUConnectionToWebProcess&);

// IPC::Connection::ThreadMessageReceiver
void dispatchToThread(Function<void()>&&) final;

// IPC::MessageReceiver
void didReceiveMessage(IPC::Connection&, IPC::Decoder&) final;
bool dispatchMessage(IPC::Connection&, IPC::Decoder&);

using LayerCreationCallback = CompletionHandler<void(Optional<LayerHostingContextID>)>&&;
void createLayer(SampleBufferDisplayLayerIdentifier, bool hideRootLayer, WebCore::IntSize, LayerCreationCallback);
void releaseLayer(SampleBufferDisplayLayerIdentifier);

GPUConnectionToWebProcess& m_connectionToWebProcess;
Ref<IPC::Connection> m_connection;
Ref<WorkQueue> m_queue;
HashMap<SampleBufferDisplayLayerIdentifier, std::unique_ptr<RemoteSampleBufferDisplayLayer>> m_layers;
};

@@ -23,7 +23,7 @@

#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)

messages -> RemoteSampleBufferDisplayLayerManager NotRefCounted {
messages -> RemoteSampleBufferDisplayLayerManager WantsAsyncDispatchMessage {
CreateLayer(WebKit::SampleBufferDisplayLayerIdentifier id, bool hideRootLayer, WebCore::IntSize size) -> (Optional<WebKit::LayerHostingContextID> contextID) Async
ReleaseLayer(WebKit::SampleBufferDisplayLayerIdentifier id)
}
@@ -28,6 +28,7 @@

WANTS_CONNECTION_ATTRIBUTE = 'WantsConnection'
WANTS_DISPATCH_MESSAGE_ATTRIBUTE = 'WantsDispatchMessage'
WANTS_ASYNC_DISPATCH_MESSAGE_ATTRIBUTE = 'WantsAsyncDispatchMessage'
LEGACY_RECEIVER_ATTRIBUTE = 'LegacyReceiver'
NOT_REFCOUNTED_RECEIVER_ATTRIBUTE = 'NotRefCounted'
SYNCHRONOUS_ATTRIBUTE = 'Synchronous'
@@ -836,14 +837,14 @@ def generate_message_handler(receiver):
else:
async_messages.append(message)

if async_messages or receiver.has_attribute(WANTS_DISPATCH_MESSAGE_ATTRIBUTE):
if async_messages or receiver.has_attribute(WANTS_DISPATCH_MESSAGE_ATTRIBUTE) or receiver.has_attribute(WANTS_ASYNC_DISPATCH_MESSAGE_ATTRIBUTE):
result.append('void %s::didReceive%sMessage(IPC::Connection& connection, IPC::Decoder& decoder)\n' % (receiver.name, receiver.name if receiver.has_attribute(LEGACY_RECEIVER_ATTRIBUTE) else ''))
result.append('{\n')
if not receiver.has_attribute(NOT_REFCOUNTED_RECEIVER_ATTRIBUTE):
result.append(' auto protectedThis = makeRef(*this);\n')

result += [async_message_statement(receiver, message) for message in async_messages]
if receiver.has_attribute(WANTS_DISPATCH_MESSAGE_ATTRIBUTE):
if receiver.has_attribute(WANTS_DISPATCH_MESSAGE_ATTRIBUTE) or receiver.has_attribute(WANTS_ASYNC_DISPATCH_MESSAGE_ATTRIBUTE):
result.append(' if (dispatchMessage(connection, decoder))\n')
result.append(' return;\n')
if (receiver.superclass):
@@ -664,6 +664,7 @@ RemoteAudioDestinationManagerMessageReceiver.cpp
RemoteAudioDestinationProxyMessageReceiver.cpp
RemoteAudioSessionMessageReceiver.cpp
RemoteAudioSessionProxyMessageReceiver.cpp
RemoteAudioMediaStreamTrackRendererManagerMessageReceiver.cpp
RemoteAudioMediaStreamTrackRendererMessageReceiver.cpp
RemoteMediaRecorderMessageReceiver.cpp
RemoteMediaRecorderManagerMessageReceiver.cpp
