[GStreamer][WebRTC] Huge memory leak
https://bugs.webkit.org/show_bug.cgi?id=234134

Patch by Philippe Normand <pnormand@igalia.com> on 2021-12-17
Reviewed by Xabier Rodriguez-Calvar.

Source/WebCore:

The main issue was RealtimeOutgoingVideoSourceLibWebRTC leaking GstSamples. Fixing this led
me to further clean-ups in the GstSample<->LibWebRTCVideoFrame handling. Native frames
should not consume GstSamples but reference them; otherwise dangling GstSamples
would end up being passed to the libwebrtc video decoder, causing crashes.
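
To illustrate the ownership model the fix moves to, here is a minimal
sketch using plain GStreamer refcounting (the wrapper class name is
hypothetical; WebKit's actual code uses GRefPtr, which automates these
ref/unref calls):

    #include <gst/gst.h>

    class VideoFrameWrapper {
    public:
        // Reference, don't consume: take our own ref, the caller keeps theirs.
        explicit VideoFrameWrapper(GstSample* sample)
            : m_sample(gst_sample_ref(sample))
        {
        }

        ~VideoFrameWrapper()
        {
            gst_sample_unref(m_sample); // drop only the ref we took
        }

        // Borrow the sample without transferring ownership, so the same frame
        // can be handed to several consumers without leaving a dangling sample.
        GstSample* getSample() const { return m_sample; }

    private:
        GstSample* m_sample;
    };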

Also, the video decoders were not reporting their initialization status correctly:
WEBRTC_VIDEO_CODEC_OK is 0, hence false when returned from a bool-returning method.
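
A minimal sketch of that bug (WEBRTC_VIDEO_CODEC_OK really is defined as 0 in
libwebrtc; the function name below is illustrative, standing in for the decoder
initialization hook that, as the "return true;" change in the diff below shows,
returns bool):

    #include <cstdint>

    constexpr int32_t WEBRTC_VIDEO_CODEC_OK = 0;

    // The old code returned the integer status code from a bool-returning
    // method: 0 converts to false, so a successful initialization was
    // reported to the caller as a failure.
    bool configureDecoder()
    {
        // ... decoder pipeline set up successfully ...
        return WEBRTC_VIDEO_CODEC_OK; // bug: evaluates to false
        // fixed version: return true;
    }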

* platform/graphics/gstreamer/MediaSampleGStreamer.cpp:
(WebCore::MediaSampleGStreamer::MediaSampleGStreamer):
(WebCore::MediaSampleGStreamer::initializeFromBuffer):
* platform/graphics/gstreamer/MediaSampleGStreamer.h:
(WebCore::MediaSampleGStreamer::createWrappedSample):
* platform/mediastream/libwebrtc/gstreamer/GStreamerVideoDecoderFactory.cpp:
(WebCore::GStreamerVideoDecoder::pullSample):
* platform/mediastream/libwebrtc/gstreamer/GStreamerVideoEncoderFactory.cpp:
* platform/mediastream/libwebrtc/gstreamer/GStreamerVideoFrameLibWebRTC.cpp:
(WebCore::convertLibWebRTCVideoFrameToGStreamerSample):
(WebCore::convertGStreamerSampleToLibWebRTCVideoFrame):
(WebCore::GStreamerVideoFrameLibWebRTC::create):
(WebCore::GStreamerSampleFromLibWebRTCVideoFrame): Deleted.
(WebCore::LibWebRTCVideoFrameFromGStreamerSample): Deleted.
* platform/mediastream/libwebrtc/gstreamer/GStreamerVideoFrameLibWebRTC.h:
(WebCore::GStreamerVideoFrameLibWebRTC::GStreamerVideoFrameLibWebRTC):
(WebCore::GStreamerVideoFrameLibWebRTC::getSample const):
(WebCore::GStreamerVideoFrameLibWebRTC::takeSample): Deleted.
* platform/mediastream/libwebrtc/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.cpp:
(WebCore::RealtimeIncomingVideoSourceLibWebRTC::OnFrame):
* platform/mediastream/libwebrtc/gstreamer/RealtimeOutgoingVideoSourceLibWebRTC.cpp:
(WebCore::RealtimeOutgoingVideoSourceLibWebRTC::videoSampleAvailable):

LayoutTests:

* platform/glib/TestExpectations: Update test expectations; #233740 was mostly due to the video
decoder initialization being wrongly reported as failed.
webrtc/captureCanvas-webrtc-software-h264-baseline.html no longer times out.


Canonical link: https://commits.webkit.org/245349@main
git-svn-id: https://svn.webkit.org/repository/webkit/trunk@287180 268f45cc-cd09-0410-ab3c-d52691b4dbfc
philn authored and webkit-commit-queue committed Dec 17, 2021
1 parent a18149a commit da51c9d94b57995c7d92f8086a60a731b562f2da
Showing 11 changed files with 129 additions and 63 deletions.
@@ -1,3 +1,14 @@
+2021-12-17  Philippe Normand  <pnormand@igalia.com>
+
+        [GStreamer][WebRTC] Huge memory leak
+        https://bugs.webkit.org/show_bug.cgi?id=234134
+
+        Reviewed by Xabier Rodriguez-Calvar.
+
+        * platform/glib/TestExpectations: Update test expectations; #233740 was mostly due to the video
+        decoder initialization being wrongly reported as failed.
+        webrtc/captureCanvas-webrtc-software-h264-baseline.html no longer times out.
+
 2021-12-16  Tyler Wilcock  <tyler_w@apple.com>
 
         AX: Make aria-multiline.html, clipped-text-under-element.html, mixed-checkbox.html, and selection-states.html pass in isolated tree mode
@@ -1349,8 +1349,7 @@ webkit.org/b/206464 webkit.org/b/218787 webrtc/peerconnection-page-cache-long.ht

 webkit.org/b/208125 webrtc/peerconnection-new-candidate-page-cache.html [ Pass Crash ]
 
-# Was Slow Failure before libwebrtc bumped to M96 in r285577
-webkit.org/b/216538 webrtc/captureCanvas-webrtc-software-h264-baseline.html [ Timeout ]
+webkit.org/b/216538 webrtc/captureCanvas-webrtc-software-h264-baseline.html [ Slow Failure ]
 
 webkit.org/b/216763 webrtc/captureCanvas-webrtc-software-h264-high.html [ Crash Failure ]
 
@@ -1370,10 +1369,6 @@ webkit.org/b/227987 http/tests/media/media-stream/get-display-media-prompt.html

 webkit.org/b/229055 http/wpt/webrtc/sframe-transform-error.html [ Failure ]
 
-webkit.org/b/233740 http/wpt/webrtc/video-script-transform-keyframe-only.html [ Failure ]
-webkit.org/b/233740 webrtc/vp9-svc.html [ Timeout ]
-webkit.org/b/233740 webrtc/vp9.html [ Timeout ]
-
 webkit.org/b/233879 fast/mediastream/get-display-media-settings.html [ Failure ]
 
 #////////////////////////////////////////////////////////////////////////////////////////
@@ -1,3 +1,41 @@
+2021-12-17  Philippe Normand  <pnormand@igalia.com>
+
+        [GStreamer][WebRTC] Huge memory leak
+        https://bugs.webkit.org/show_bug.cgi?id=234134
+
+        Reviewed by Xabier Rodriguez-Calvar.
+
+        The main issue was RealtimeOutgoingVideoSourceLibWebRTC leaking GstSamples. Fixing this led
+        me to further clean-ups in the GstSample<->LibWebRTCVideoFrame handling. Native frames
+        should not consume GstSamples but reference them; otherwise dangling GstSamples
+        would end up being passed to the libwebrtc video decoder, causing crashes.
+
+        Also, the video decoders were not reporting their initialization status correctly:
+        WEBRTC_VIDEO_CODEC_OK is 0, hence false when returned from a bool-returning method.
+
+        * platform/graphics/gstreamer/MediaSampleGStreamer.cpp:
+        (WebCore::MediaSampleGStreamer::MediaSampleGStreamer):
+        (WebCore::MediaSampleGStreamer::initializeFromBuffer):
+        * platform/graphics/gstreamer/MediaSampleGStreamer.h:
+        (WebCore::MediaSampleGStreamer::createWrappedSample):
+        * platform/mediastream/libwebrtc/gstreamer/GStreamerVideoDecoderFactory.cpp:
+        (WebCore::GStreamerVideoDecoder::pullSample):
+        * platform/mediastream/libwebrtc/gstreamer/GStreamerVideoEncoderFactory.cpp:
+        * platform/mediastream/libwebrtc/gstreamer/GStreamerVideoFrameLibWebRTC.cpp:
+        (WebCore::convertLibWebRTCVideoFrameToGStreamerSample):
+        (WebCore::convertGStreamerSampleToLibWebRTCVideoFrame):
+        (WebCore::GStreamerVideoFrameLibWebRTC::create):
+        (WebCore::GStreamerSampleFromLibWebRTCVideoFrame): Deleted.
+        (WebCore::LibWebRTCVideoFrameFromGStreamerSample): Deleted.
+        * platform/mediastream/libwebrtc/gstreamer/GStreamerVideoFrameLibWebRTC.h:
+        (WebCore::GStreamerVideoFrameLibWebRTC::GStreamerVideoFrameLibWebRTC):
+        (WebCore::GStreamerVideoFrameLibWebRTC::getSample const):
+        (WebCore::GStreamerVideoFrameLibWebRTC::takeSample): Deleted.
+        * platform/mediastream/libwebrtc/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.cpp:
+        (WebCore::RealtimeIncomingVideoSourceLibWebRTC::OnFrame):
+        * platform/mediastream/libwebrtc/gstreamer/RealtimeOutgoingVideoSourceLibWebRTC.cpp:
+        (WebCore::RealtimeOutgoingVideoSourceLibWebRTC::videoSampleAvailable):
+
 2021-12-17  Antoine Quint  <graouts@webkit.org>
 
         ActiveDOMObject::suspendIfNeeded() should not be called within constructors
@@ -42,43 +42,17 @@ MediaSampleGStreamer::MediaSampleGStreamer(GRefPtr<GstSample>&& sample, const Fl
     , m_videoRotation(videoRotation)
     , m_videoMirrored(videoMirrored)
 {
-    const GstClockTime minimumDuration = 1000; // 1 us
     ASSERT(sample);
     GstBuffer* buffer = gst_sample_get_buffer(sample.get());
     RELEASE_ASSERT(buffer);
 
     if (metadata)
         buffer = webkitGstBufferSetVideoSampleMetadata(buffer, WTFMove(metadata));
 
-    if (GST_BUFFER_PTS_IS_VALID(buffer))
-        m_pts = fromGstClockTime(GST_BUFFER_PTS(buffer));
-    if (GST_BUFFER_DTS_IS_VALID(buffer) || GST_BUFFER_PTS_IS_VALID(buffer))
-        m_dts = fromGstClockTime(GST_BUFFER_DTS_OR_PTS(buffer));
-    if (GST_BUFFER_DURATION_IS_VALID(buffer)) {
-        // Sometimes (albeit rarely, so far seen only at the end of a track)
-        // frames have very small durations, so small that may be under the
-        // precision we are working with and be truncated to zero.
-        // SourceBuffer algorithms are not expecting frames with zero-duration,
-        // so let's use something very small instead in those fringe cases.
-        m_duration = fromGstClockTime(std::max(GST_BUFFER_DURATION(buffer), minimumDuration));
-    } else {
-        // Unfortunately, sometimes samples don't provide a duration. This can never happen in MP4 because of the way
-        // the format is laid out, but it's pretty common in WebM.
-        // The good part is that durations don't matter for playback, just for buffered ranges and coded frame deletion.
-        // We want to pick something small enough to not cause unwanted frame deletion, but big enough to never be
-        // mistaken for a rounding artifact.
-        m_duration = fromGstClockTime(16666667); // 1/60 seconds
-    }
 
-    m_size = gst_buffer_get_size(buffer);
     m_sample = adoptGRef(gst_sample_new(buffer, gst_sample_get_caps(sample.get()), nullptr,
         gst_sample_get_info(sample.get()) ? gst_structure_copy(gst_sample_get_info(sample.get())) : nullptr));
 
-    if (GST_BUFFER_FLAG_IS_SET(buffer, GST_BUFFER_FLAG_DELTA_UNIT))
-        m_flags = MediaSample::None;
-
-    if (GST_BUFFER_FLAG_IS_SET(buffer, GST_BUFFER_FLAG_DECODE_ONLY))
-        m_flags = static_cast<MediaSample::SampleFlags>(m_flags | MediaSample::IsNonDisplaying);
+    initializeFromBuffer();
 }
 
 MediaSampleGStreamer::MediaSampleGStreamer(const FloatSize& presentationSize, const AtomString& trackId)
@@ -90,6 +64,13 @@ MediaSampleGStreamer::MediaSampleGStreamer(const FloatSize& presentationSize, co
 {
 }
 
+MediaSampleGStreamer::MediaSampleGStreamer(const GRefPtr<GstSample>& sample, VideoRotation videoRotation)
+    : m_sample(sample)
+    , m_videoRotation(videoRotation)
+{
+    initializeFromBuffer();
+}
+
 Ref<MediaSampleGStreamer> MediaSampleGStreamer::createFakeSample(GstCaps*, MediaTime pts, MediaTime dts, MediaTime duration, const FloatSize& presentationSize, const AtomString& trackId)
 {
     MediaSampleGStreamer* gstreamerMediaSample = new MediaSampleGStreamer(presentationSize, trackId);
@@ -154,6 +135,41 @@ Ref<MediaSampleGStreamer> MediaSampleGStreamer::createImageSample(PixelBuffer&&
     return create(WTFMove(sample), FloatSize(width, height), { }, videoRotation, videoMirrored);
 }
 
+void MediaSampleGStreamer::initializeFromBuffer()
+{
+    const GstClockTime minimumDuration = 1000; // 1 us
+    auto* buffer = gst_sample_get_buffer(m_sample.get());
+    RELEASE_ASSERT(buffer);
+
+    if (GST_BUFFER_PTS_IS_VALID(buffer))
+        m_pts = fromGstClockTime(GST_BUFFER_PTS(buffer));
+    if (GST_BUFFER_DTS_IS_VALID(buffer) || GST_BUFFER_PTS_IS_VALID(buffer))
+        m_dts = fromGstClockTime(GST_BUFFER_DTS_OR_PTS(buffer));
+    if (GST_BUFFER_DURATION_IS_VALID(buffer)) {
+        // Sometimes (albeit rarely, so far seen only at the end of a track)
+        // frames have very small durations, so small that may be under the
+        // precision we are working with and be truncated to zero.
+        // SourceBuffer algorithms are not expecting frames with zero-duration,
+        // so let's use something very small instead in those fringe cases.
+        m_duration = fromGstClockTime(std::max(GST_BUFFER_DURATION(buffer), minimumDuration));
+    } else {
+        // Unfortunately, sometimes samples don't provide a duration. This can never happen in MP4 because of the way
+        // the format is laid out, but it's pretty common in WebM.
+        // The good part is that durations don't matter for playback, just for buffered ranges and coded frame deletion.
+        // We want to pick something small enough to not cause unwanted frame deletion, but big enough to never be
+        // mistaken for a rounding artifact.
+        m_duration = fromGstClockTime(16666667); // 1/60 seconds
+    }
+
+    m_size = gst_buffer_get_size(buffer);
+
+    if (GST_BUFFER_FLAG_IS_SET(buffer, GST_BUFFER_FLAG_DELTA_UNIT))
+        m_flags = MediaSample::None;
+
+    if (GST_BUFFER_FLAG_IS_SET(buffer, GST_BUFFER_FLAG_DECODE_ONLY))
+        m_flags = static_cast<MediaSample::SampleFlags>(m_flags | MediaSample::IsNonDisplaying);
+}
+
 RefPtr<JSC::Uint8ClampedArray> MediaSampleGStreamer::getRGBAImageData() const
 {
     auto* caps = gst_sample_get_caps(m_sample.get());
@@ -40,6 +40,11 @@ class MediaSampleGStreamer : public MediaSample {
         return adoptRef(*new MediaSampleGStreamer(WTFMove(sample), presentationSize, trackId, videoRotation, videoMirrored, WTFMove(metadata)));
     }
 
+    static Ref<MediaSampleGStreamer> createWrappedSample(const GRefPtr<GstSample>& sample, VideoRotation videoRotation = VideoRotation::None)
+    {
+        return adoptRef(*new MediaSampleGStreamer(sample, videoRotation));
+    }
+
     static Ref<MediaSampleGStreamer> createFakeSample(GstCaps*, MediaTime pts, MediaTime dts, MediaTime duration, const FloatSize& presentationSize, const AtomString& trackId);
     static Ref<MediaSampleGStreamer> createImageSample(PixelBuffer&&, const IntSize& destinationSize = { }, double frameRate = 1, VideoRotation videoRotation = VideoRotation::None, bool videoMirrored = false, std::optional<VideoSampleMetadata>&& metadata = std::nullopt);
 
@@ -66,11 +71,14 @@ class MediaSampleGStreamer : public MediaSample {

 protected:
     MediaSampleGStreamer(GRefPtr<GstSample>&&, const FloatSize& presentationSize, const AtomString& trackId, VideoRotation = VideoRotation::None, bool videoMirrored = false, std::optional<VideoSampleMetadata>&& = std::nullopt);
+    MediaSampleGStreamer(const GRefPtr<GstSample>&, VideoRotation = VideoRotation::None);
     virtual ~MediaSampleGStreamer() = default;
 
 private:
     MediaSampleGStreamer(const FloatSize& presentationSize, const AtomString& trackId);
 
+    void initializeFromBuffer();
+
     MediaTime m_pts;
     MediaTime m_dts;
     MediaTime m_duration;
@@ -150,7 +150,7 @@ class GStreamerVideoDecoder : public webrtc::VideoDecoder {
             return WEBRTC_VIDEO_CODEC_ERROR;
         }
 
-        return WEBRTC_VIDEO_CODEC_OK;
+        return true;
     }

int32_t RegisterDecodeCompleteCallback(webrtc::DecodedImageCallback* callback) override
@@ -240,7 +240,7 @@ class GStreamerVideoDecoder : public webrtc::VideoDecoder {
         auto timestamps = m_dtsPtsMap[GST_BUFFER_PTS(buffer)];
         m_dtsPtsMap.erase(GST_BUFFER_PTS(buffer));
 
-        auto frame(LibWebRTCVideoFrameFromGStreamerSample(WTFMove(sample), webrtc::kVideoRotation_0,
+        auto frame(convertGStreamerSampleToLibWebRTCVideoFrame(sample, webrtc::kVideoRotation_0,
             timestamps.timestamp, timestamps.renderTimeMs));
 
         GST_BUFFER_DTS(buffer) = GST_CLOCK_TIME_NONE;
@@ -228,7 +228,7 @@ class GStreamerVideoEncoder : public webrtc::VideoEncoder {
             return WEBRTC_VIDEO_CODEC_UNINITIALIZED;
         }
 
-        auto sample = GStreamerSampleFromLibWebRTCVideoFrame(frame);
+        auto sample = convertLibWebRTCVideoFrameToGStreamerSample(frame);
         auto buffer = gst_sample_get_buffer(sample.get());
 
         if (!GST_CLOCK_TIME_IS_VALID(m_firstFramePts)) {
@@ -27,13 +27,9 @@

 namespace WebCore {
 
-GRefPtr<GstSample> GStreamerSampleFromLibWebRTCVideoFrame(const webrtc::VideoFrame& frame)
+GRefPtr<GstSample> convertLibWebRTCVideoFrameToGStreamerSample(const webrtc::VideoFrame& frame)
 {
-    if (frame.video_frame_buffer()->type() == webrtc::VideoFrameBuffer::Type::kNative) {
-        auto* framebuffer = static_cast<GStreamerVideoFrameLibWebRTC*>(frame.video_frame_buffer().get());
-        return framebuffer->takeSample();
-    }
-
+    RELEASE_ASSERT(frame.video_frame_buffer()->type() != webrtc::VideoFrameBuffer::Type::kNative);
     auto* i420Buffer = frame.video_frame_buffer()->ToI420().release();
     int height = i420Buffer->height();
     int strides[3] = {
@@ -60,21 +56,20 @@ GRefPtr<GstSample> GStreamerSampleFromLibWebRTCVideoFrame(const webrtc::VideoFra
     return sample;
 }
 
-rtc::scoped_refptr<webrtc::VideoFrameBuffer> GStreamerVideoFrameLibWebRTC::create(GRefPtr<GstSample>&& sample)
+std::unique_ptr<webrtc::VideoFrame> convertGStreamerSampleToLibWebRTCVideoFrame(GRefPtr<GstSample>& sample, webrtc::VideoRotation rotation, int64_t timestamp, int64_t renderTimeMs)
+{
+    auto frameBuffer(GStreamerVideoFrameLibWebRTC::create(sample));
+    return std::unique_ptr<webrtc::VideoFrame>(new webrtc::VideoFrame(frameBuffer, timestamp, renderTimeMs, rotation));
+}
+
+rtc::scoped_refptr<webrtc::VideoFrameBuffer> GStreamerVideoFrameLibWebRTC::create(const GRefPtr<GstSample>& sample)
 {
     GstVideoInfo info;
 
     if (!gst_video_info_from_caps(&info, gst_sample_get_caps(sample.get())))
         ASSERT_NOT_REACHED();
 
-    return rtc::scoped_refptr<webrtc::VideoFrameBuffer>(new GStreamerVideoFrameLibWebRTC(WTFMove(sample), info));
-}
-
-std::unique_ptr<webrtc::VideoFrame> LibWebRTCVideoFrameFromGStreamerSample(GRefPtr<GstSample>&& sample, webrtc::VideoRotation rotation,
-    int64_t timestamp, int64_t renderTimeMs)
-{
-    auto frameBuffer(GStreamerVideoFrameLibWebRTC::create(WTFMove(sample)));
-    return std::unique_ptr<webrtc::VideoFrame>(new webrtc::VideoFrame(frameBuffer, timestamp, renderTimeMs, rotation));
+    return rtc::scoped_refptr<webrtc::VideoFrameBuffer>(new GStreamerVideoFrameLibWebRTC(sample, info));
 }
 
 rtc::scoped_refptr<webrtc::I420BufferInterface> GStreamerVideoFrameLibWebRTC::ToI420()
@@ -31,19 +31,19 @@

 namespace WebCore {
 
-WARN_UNUSED_RETURN GRefPtr<GstSample> GStreamerSampleFromLibWebRTCVideoFrame(const webrtc::VideoFrame&);
+WARN_UNUSED_RETURN GRefPtr<GstSample> convertLibWebRTCVideoFrameToGStreamerSample(const webrtc::VideoFrame&);
 
-std::unique_ptr<webrtc::VideoFrame> LibWebRTCVideoFrameFromGStreamerSample(GRefPtr<GstSample>&&, webrtc::VideoRotation, int64_t timestamp, int64_t renderTimeMs);
+std::unique_ptr<webrtc::VideoFrame> convertGStreamerSampleToLibWebRTCVideoFrame(GRefPtr<GstSample>&, webrtc::VideoRotation, int64_t timestamp, int64_t renderTimeMs);
 
 class GStreamerVideoFrameLibWebRTC : public rtc::RefCountedObject<webrtc::VideoFrameBuffer> {
 public:
-    GStreamerVideoFrameLibWebRTC(GRefPtr<GstSample>&& sample, GstVideoInfo info)
-        : m_sample(WTFMove(sample))
+    GStreamerVideoFrameLibWebRTC(const GRefPtr<GstSample>& sample, GstVideoInfo info)
+        : m_sample(sample)
         , m_info(info) { }
 
-    static rtc::scoped_refptr<webrtc::VideoFrameBuffer> create(GRefPtr<GstSample>&&);
+    static rtc::scoped_refptr<webrtc::VideoFrameBuffer> create(const GRefPtr<GstSample>&);
 
-    GRefPtr<GstSample>&& takeSample() { return WTFMove(m_sample); }
+    GstSample* getSample() const { return m_sample.get(); }
     rtc::scoped_refptr<webrtc::I420BufferInterface> ToI420() final;
 
     int width() const final { return GST_VIDEO_INFO_WIDTH(&m_info); }
@@ -55,6 +55,7 @@ class GStreamerVideoFrameLibWebRTC : public rtc::RefCountedObject<webrtc::VideoF
     GRefPtr<GstSample> m_sample;
     GstVideoInfo m_info;
 };
+
 }
 
 #endif // USE(GSTREAMER) && USE(LIBWEBRTC)
@@ -58,12 +58,14 @@ void RealtimeIncomingVideoSourceLibWebRTC::OnFrame(const webrtc::VideoFrame& fra
     if (!isProducingData())
         return;
 
-    callOnMainThread([protectedThis = Ref { *this }, frame] {
-        auto gstSample = GStreamerSampleFromLibWebRTCVideoFrame(frame);
+    if (frame.video_frame_buffer()->type() == webrtc::VideoFrameBuffer::Type::kNative) {
+        auto* framebuffer = static_cast<GStreamerVideoFrameLibWebRTC*>(frame.video_frame_buffer().get());
+        videoSampleAvailable(MediaSampleGStreamer::createWrappedSample(framebuffer->getSample(), static_cast<MediaSample::VideoRotation>(frame.rotation())), { });
+    } else {
+        auto gstSample = convertLibWebRTCVideoFrameToGStreamerSample(frame);
         auto metadata = std::make_optional(metadataFromVideoFrame(frame));
-        auto sample = MediaSampleGStreamer::create(WTFMove(gstSample), { }, { }, static_cast<MediaSample::VideoRotation>(frame.rotation()), false, WTFMove(metadata));
-        protectedThis->videoSampleAvailable(sample.get(), { });
-    });
+        videoSampleAvailable(MediaSampleGStreamer::create(WTFMove(gstSample), { }, { }, static_cast<MediaSample::VideoRotation>(frame.rotation()), false, WTFMove(metadata)), { });
+    }
 }
 
 } // namespace WebCore
@@ -69,7 +69,7 @@ void RealtimeOutgoingVideoSourceLibWebRTC::videoSampleAvailable(MediaSample& sam

     ASSERT(sample.platformSample().type == PlatformSample::GStreamerSampleType);
     auto& mediaSample = static_cast<MediaSampleGStreamer&>(sample);
-    auto frameBuffer = GStreamerVideoFrameLibWebRTC::create(gst_sample_ref(mediaSample.platformSample().sample.gstSample));
+    auto frameBuffer = GStreamerVideoFrameLibWebRTC::create(mediaSample.platformSample().sample.gstSample);
 
     sendFrame(WTFMove(frameBuffer));
 }
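
The RealtimeOutgoingVideoSourceLibWebRTC hunk above drops a manual
gst_sample_ref() at the call site. A plausible sketch of the per-frame leak
this removes, assuming WebKit's GRefPtr semantics (constructing a GRefPtr
from a raw pointer takes an additional reference; adoptGRef() is the
non-reffing path):

    GstSample* sample = mediaSample.platformSample().sample.gstSample; // refcount N
    GstSample* transferred = gst_sample_ref(sample);                   // refcount N + 1
    GRefPtr<GstSample> holder(transferred);                            // refcount N + 2
    // When holder is destroyed it releases exactly one reference, so every
    // frame sent leaves the sample with one extra reference: a steady leak.
    // The fixed call passes the sample without the manual ref and lets
    // GRefPtr take the single reference it will later release.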
