
Enable platform aac hevc #856

Merged: 4 commits merged into master from platform_aac_hevc on Mar 10, 2024
Conversation

uazo (Owner) commented on Mar 6, 2024

Description

Enable hardware AAC and H.264 codecs on Android and Windows.

All submissions

  • there are no other open Pull Requests for the same update/change
  • Cromite can be built with these changes
  • I have tested that the new change works as intended (AVD or physical device will do)

Format

  • patch subject and filename match (e.g. Subject: Alternative cache (NIK-based) -> Alternative-cache-NIK-based.patch)
  • patch description contains explanation of changes
  • no unnecessary whitespace or unrelated changes

fixes #710

uazo marked this pull request as ready for review on March 8, 2024 08:35
uazo (Owner) commented on Mar 9, 2024

To summarize, for future reference, what I have understood:

  • media/media_options.gni
    defines the modules active on the various platforms and the process in which the media service runs.
    Since that is the GPU process by default, communication between the renderer and the media service goes through a Mojo pipe.
    Initialization happens in media::InitializeMediaLibraryInSandbox (GPU side) and in MediaParserProvider::RetrieveMediaParser and RendererMainPlatformDelegate::PlatformInitialize (renderer side) through CreateMediaParser()

  • initialization of media renderer types
    Chromium supports different media renderer types, managed by content/renderer/media/media_factory.cc.
    The default is RendererType::kMediaFoundation for Android and Windows; other renderer types are active on cast-enabled platforms.
    This is used to create the correct renderer type to associate with the media player created in MediaFactory::CreateMediaPlayer when the source is played.
    The source can be of two types: either linked to a URL or a MediaStream (i.e. essentially driven by JavaScript). From our point of view, the logic does not change.
    Depending on the source, DemuxerManager::CreateDemuxer selects (simplifying) either FFmpegDemuxer or ChunkDemuxer.

  • codec initialization (media/renderers/default_decoder_factory.cc)
    defines the list of audio/video decoders to be used.
    The priority normally follows insertion order, although it may be changed when the renderer type is looked up in media/base/renderer_factory_selector.cc.
    In fact, on Android, ffmpeg has priority over MediaCodec for audio, while on Windows MediaFoundation has priority.

  • output sink detection
    The devices and supported output formats are acquired:
    on Android in media/audio/android/audio_manager_android.cc and, on the Java side, media/base/android/java/src/org/chromium/media/AudioManagerAndroid.java;
    on Windows in media/audio/win/audio_manager_win.cc.
    This information is used for passthrough codec handling with HDMI-connected devices, i.e. cases where the stream does not go through the decoder but is sent directly to the device.

  • ffmpeg demuxer
    If the chosen demuxer type is ffmpeg, during initialization a search for the container type is performed, as far as I can tell, in FFmpegGlue::OpenContext.
    ffmpeg probes among the different container types (media/base/container_names.h).
    At this point it is asked to parse the streams found in the container via avformat_find_stream_info(): this is the stage where the loading "cog" spins, because ffmpeg is searching for the correct codec.
    Once found, the configuration (AudioDecoderConfig or VideoDecoderConfig) is initialized for each stream in FFmpegDemuxer::OnFindStreamInfoDone.

  • choosing the audio/video codec
    Once the stream configuration has been identified, the audio/video rendering process begins by choosing the appropriate decoder in media/filters/decoder_selector.cc.
    The decoders are initialized sequentially until one capable of decoding the source with the given configuration is found (see the sketch after this list).
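
As a rough standalone model of this sequential selection (my own sketch with illustrative names, not Chromium's actual DecoderSelector code):

// decoder_selection_sketch.cc - models how a priority-ordered decoder
// list is walked until one decoder initializes successfully.
#include <functional>
#include <iostream>
#include <string>
#include <vector>

struct Config { std::string codec; };

struct Decoder {
  std::string name;
  std::function<bool(const Config&)> initialize;  // true on success
};

// Decoders are tried strictly in list order, mirroring the priority
// order set up in media/renderers/default_decoder_factory.cc.
const Decoder* SelectDecoder(const std::vector<Decoder>& decoders,
                             const Config& config) {
  for (const Decoder& d : decoders)
    if (d.initialize(config))
      return &d;
  return nullptr;
}

int main() {
  // On Android, ffmpeg is listed before MediaCodec for audio.
  std::vector<Decoder> priority = {
      {"ffmpeg", [](const Config& c) { return c.codec != "aac"; }},
      {"mediacodec", [](const Config&) { return true; }},
  };
  const Decoder* chosen = SelectDecoder(priority, Config{"aac"});
  std::cout << "selected: " << (chosen ? chosen->name : "none") << "\n";
}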

Since the media service can run in different processes depending on the media_options.gni options, initialization goes through DecoderStreamTraits, which takes care of the delegation.
Note that at this stage the media service does not know the decoder type until the actual initialization, which may happen in a different process.
The decoder type, in fact, is detected only after initialization, in DecoderStreamTraits<>::OnDecoderInitialized and then in DecoderSelector<StreamType>::OnDecoderInitializeDone.
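
A tiny sketch of that callback shape (illustrative only; none of these names come from the real code):

#include <functional>
#include <iostream>
#include <string>

// The caller learns which decoder was actually created only inside the
// completion callback, since initialization may finish in another process.
using InitDoneCB = std::function<void(bool ok, const std::string& type)>;

void InitializeRemoteDecoder(InitDoneCB done) {
  // Stand-in for work done over the Mojo pipe in the GPU process.
  done(true, "MediaCodecAudioDecoder");
}

int main() {
  InitializeRemoteDecoder([](bool ok, const std::string& type) {
    // Mirrors DecoderSelector<StreamType>::OnDecoderInitializeDone:
    // only here can the selector record which decoder was chosen.
    std::cout << (ok ? "initialized: " + type : std::string("failed")) << "\n";
  });
}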

On Android, for audio, priority is given to ffmpeg (media/filters/ffmpeg_audio_decoder.cc), which parses the source.
The allowed ffmpeg codecs are listed in media/filters/ffmpeg_glue.cc (GetAllowedDemuxers and GetAllowed-XXX-Decoders).
It is necessary to allow their use in ffmpeg because the information provided by the DecoderConfigs may be incomplete, in which case ffmpeg has to perform stream probing (and more "cog" spinning).
This step is important because, in the case of AAC on Android, ffmpeg fills the AudioDecoderConfig with additional information (such as bytes_per_channel) that the Android MediaCodec needs.

The goal of the patch is to break the normal continuation of the GetAndInitializeNextDecoder code, which would otherwise end up using the ffmpeg codec.
Fortunately, it is possible to break it and declare that ffmpeg is unable to decode the stream once it has actually done the parsing: at that point Chromium moves on to the next decoder, which on Android is MediaCodec (media/filters/android/media_codec_audio_decoder.cc).
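
In code terms, the trick looks roughly like this (a hedged sketch of the idea only; all names below are made up for illustration and do not come from the actual patch):

#include <iostream>
#include <string>

struct AudioConfigSketch {
  std::string codec;
  int bytes_per_channel = 0;  // filled in by ffmpeg's stream probing
};

// Pretend version of the ffmpeg audio decoder initialization.
bool FfmpegInitializeSketch(const AudioConfigSketch& config,
                            bool prefer_platform_decoder) {
  // By this point ffmpeg has already parsed the stream, so the config
  // carries the extra fields (e.g. bytes_per_channel) MediaCodec needs.
  if (prefer_platform_decoder && config.codec == "aac") {
    // Deliberately report "unable to decode": GetAndInitializeNextDecoder
    // then falls through to the next decoder, MediaCodec on Android.
    return false;
  }
  return true;  // normal ffmpeg initialization would continue here
}

int main() {
  AudioConfigSketch aac{"aac", 2};
  std::cout << "ffmpeg accepted: "
            << FfmpegInitializeSketch(aac, /*prefer_platform_decoder=*/true)
            << "\n";
}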

The Android MediaCodec, with the information now declared in the AudioDecoderConfig, is then found to be able to decode the stream, with no further work needed.
The code flow, in fact, is:

  • creation of the Mojo pipe in BindRemoteDecoder (mojo_audio_decoder.cc)
  • initialization of the MediaInterfaceProxy (media_interface_proxy.cc)
  • creation of the actual decoder in CreateAudioDecoder (interface_factory_impl.cc)
  • instantiation of the service in the GPU process (mojo_audio_decoder_service.cc and gpu_mojo_media_client.cc) and activation in CreatePlatformAudioDecoder, followed by the Java- or Windows-side decoder

The same logic applies to H.264 video.

At this point the flow is already handled by Chromium by default: the demuxer passes the source over the Mojo pipe, and it is decoded to PCM and passed to the output.
On Windows it is the same, except that the decoders are platform specific (media_foundation_audio_decoder.cc). The only peculiarity is that streams from a URL contain raw AAC packets, while those from JavaScript are encoded as ADTS; FFmpegAACBitstreamConverter takes care of the conversion when needed, as on Android.
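
As a reference point for telling the two packet forms apart (my own sketch, not Chromium code): ADTS frames begin with a 12-bit 0xFFF syncword, which raw AAC packets lack.

#include <cstddef>
#include <cstdint>
#include <cstdio>

// Returns true if the buffer starts with an ADTS header (12-bit 0xFFF
// syncword). Raw AAC packets have no such header, which is why a
// converter such as FFmpegAACBitstreamConverter prepends one when the
// platform decoder expects ADTS input.
bool LooksLikeAdts(const std::uint8_t* data, std::size_t size) {
  return size >= 2 && data[0] == 0xFF && (data[1] & 0xF0) == 0xF0;
}

int main() {
  const std::uint8_t adts[] = {0xFF, 0xF1, 0x50, 0x80};  // typical ADTS start
  const std::uint8_t raw[] = {0x21, 0x1B, 0x8C, 0x00};   // raw AAC payload
  std::printf("adts: %d, raw: %d\n", LooksLikeAdts(adts, 4) ? 1 : 0,
              LooksLikeAdts(raw, 4) ? 1 : 0);
}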

  • activation of log messages

For future reference, I also note where to activate the various logs:

  • renderer: MediaStreamManager::SendMessageToNativeLog
  • media/base/win/mf_helpers.h
  • MediaLog::ShouldLogToDebugConsole
  • media/base/media_util.cc MediaTraceIsEnabled()
  • the enable_logging_override gn flag
  • the ffmpeg log, which can be activated with the diff below:

diff --git a/media/base/media.cc b/media/base/media.cc
--- a/media/base/media.cc
+++ b/media/base/media.cc
@@ -28,6 +28,18 @@ extern "C" {
 }
 #endif
 
+#include "base/strings/stringprintf.h"
+
+namespace {
+
+[[maybe_unused]]
+void my_log_callback(void *ptr, int level, const char *fmt, va_list vargs)
+{
+    LOG(INFO) << base::StringPrintV(fmt, vargs);
+}
+
+}  // namespace
+
 namespace media {
 
 // Media must only be initialized once; use a thread-safe static to do this.
@@ -43,7 +55,12 @@ class MediaInitializer {
     av_get_cpu_flags();
 
     // Disable logging as it interferes with layout tests.
-    av_log_set_level(AV_LOG_QUIET);
+    av_log_set_level(AV_LOG_TRACE);
+    av_log_set_callback(my_log_callback);
 
 #if BUILDFLAG(USE_ALLOCATOR_SHIM)
     // Remove allocation limit from ffmpeg, so calls go down to shim layer.
@@ -81,7 +98,9 @@ void InitializeMediaLibrary() {
 }

@uazo uazo merged commit b228ed7 into master Mar 10, 2024
1 check passed
@uazo uazo deleted the platform_aac_hevc branch March 10, 2024 10:45
Successfully merging this pull request may close these issues:

Enable platform decoder for proprietary_codecs flag on Linux