[AVFoundation] Fix delegate signature for AVAudioSourceNode callbacks. (#20357)

The signature of the callback provided when creating AVAudioSourceNode instances
was wrong, so the callback would get corrupted data, causing crashes.

Fix the callback to have the right signature (not too difficult), and also make the
existing signature work (a bit more complicated).

Note that this is the second time we've tried to fix the delegate signature for the
AVAudioSourceNode callbacks, but now with tests, so hoping there won't be a third
time.

Fixes #19868.
rolfbjarne committed Apr 9, 2024
1 parent 9e7a6ea commit dc4b11f
Showing 7 changed files with 345 additions and 18 deletions.
165 changes: 165 additions & 0 deletions src/AVFoundation/AVAudioSourceNode.cs
@@ -0,0 +1,165 @@
using System;
using System.ComponentModel;
using System.Runtime.CompilerServices;

using AudioToolbox;
using ObjCRuntime;

namespace AVFoundation {

// This is messy...
// * The good delegate is:
// - AVAudioSourceNodeRenderHandler in XAMCORE_5_0
// - AVAudioSourceNodeRenderHandler3 otherwise
// * There are two legacy delegates:
// - AVAudioSourceNodeRenderHandler in .NET and legacy Xamarin
// - AVAudioSourceNodeRenderHandler2 in legacy Xamarin.
//
// History of mistakes:
//
// 1. We first bound this using AVAudioSourceNodeRenderHandler (legacy Xamarin version), which was wrong.
// 2. We then found the mistake, and bound it as AVAudioSourceNodeRenderHandler2 in legacy Xamarin, removed the initial version in .NET, and named it AVAudioSourceNodeRenderHandler in .NET.
// * However, we failed to write tests, so this delegate was broken too.
// 3. Finally we got a customer report, and realized the new delegate was broken too. So now there are two broken delegates, and one working (AVAudioSourceNodeRenderHandler3 in .NET and legacy Xamarin, named as AVAudioSourceNodeRenderHandler in XAMCORE_5_0).
// * And tests were added too.
//
// Note: broken = made to work with a workaround, which makes this even messier.
//

/// <summary>The delegate that will be called in a callback from <see cref="T:AVFoundation.AVAudioSourceNode" />.</summary>
/// <returns>An OSStatus result code. Return 0 to indicate success.</returns>
/// <param name="isSilence">Indicates whether the supplied audio data contains only silence.</param>
/// <param name="timestamp">The timestamp at which the audio will render (HAL time).</param>
/// <param name="frameCount">The number of frames of audio to supply.</param>
/// <param name="outputData">The <see cref="T:AudioToolbox.AudioBuffers" /> that contains the supplied audio data when the callback returns.</param>
#if XAMCORE_5_0
public delegate /* OSStatus */ int AVAudioSourceNodeRenderHandler (ref bool isSilence, ref AudioTimeStamp timestamp, uint frameCount, AudioBuffers outputData);
#else
public delegate /* OSStatus */ int AVAudioSourceNodeRenderHandler3 (ref bool isSilence, ref AudioTimeStamp timestamp, uint frameCount, AudioBuffers outputData);
#endif

#if !XAMCORE_5_0
#if NET
[EditorBrowsable (EditorBrowsableState.Never)]
public delegate /* OSStatus */ int AVAudioSourceNodeRenderHandler (ref bool isSilence, ref AudioTimeStamp timestamp, uint frameCount, ref AudioBuffers outputData);
#else
[EditorBrowsable (EditorBrowsableState.Never)]
public delegate /* OSStatus */ int AVAudioSourceNodeRenderHandler (bool isSilence, AudioToolbox.AudioTimeStamp timestamp, uint frameCount, ref AudioBuffers outputData);
[EditorBrowsable (EditorBrowsableState.Never)]
public delegate /* OSStatus */ int AVAudioSourceNodeRenderHandler2 (ref bool isSilence, ref AudioTimeStamp timestamp, uint frameCount, ref AudioBuffers outputData);
#endif // NET
#endif // XAMCORE_5_0

public partial class AVAudioSourceNode {
#if !XAMCORE_5_0 && NET
[EditorBrowsable (EditorBrowsableState.Never)]
[Obsolete ("Use the overload that takes a delegate that does not take a 'ref AudioBuffers' instead. Assigning a value to the 'inputData' parameter in the callback has no effect.")]
public AVAudioSourceNode (AVAudioSourceNodeRenderHandler renderHandler)
: this (GetHandler (renderHandler))
{
}

[EditorBrowsable (EditorBrowsableState.Never)]
[Obsolete ("Use the overload that takes a delegate that does not take a 'ref AudioBuffers' instead. Assigning a value to the 'inputData' parameter in the callback has no effect.")]
public AVAudioSourceNode (AVAudioFormat format, AVAudioSourceNodeRenderHandler renderHandler)
: this (format, GetHandler (renderHandler))
{
}
#endif // !XAMCORE_5_0 && NET

#if !NET
[EditorBrowsable (EditorBrowsableState.Never)]
[Obsolete ("Use the overload that takes a delegate that does not take a 'ref AudioBuffers' instead. Assigning a value to the 'inputData' parameter in the callback has no effect.")]
public AVAudioSourceNode (AVAudioSourceNodeRenderHandler2 renderHandler)
: this (GetHandler (renderHandler))
{
}

[EditorBrowsable (EditorBrowsableState.Never)]
[Obsolete ("Use the overload that takes a delegate that does not take a 'ref AudioBuffers' instead. Assigning a value to the 'inputData' parameter in the callback has no effect.")]
public AVAudioSourceNode (AVAudioFormat format, AVAudioSourceNodeRenderHandler2 renderHandler)
: this (format, GetHandler (renderHandler))
{
}
#endif // !NET

/// <summary>Creates an <see cref="T:AVFoundation.AVAudioSourceNode" /> with the specified callback to render audio.</summary>
/// <param name="renderHandler">The callback that will be called to supply audio data.</param>
#if XAMCORE_5_0
public AVAudioSourceNode (AVAudioSourceNodeRenderHandler renderHandler)
#else
public AVAudioSourceNode (AVAudioSourceNodeRenderHandler3 renderHandler)
#endif // XAMCORE_5_0
: this (GetHandler (renderHandler))
{
}

/// <summary>Creates an <see cref="T:AVFoundation.AVAudioSourceNode" /> with the specified callback to render audio.</summary>
/// <param name="format">The format of the PCM audio data the callback supplies.</param>
/// <param name="renderHandler">The callback that will be called to supply audio data.</param>
#if XAMCORE_5_0
public AVAudioSourceNode (AVAudioFormat format, AVAudioSourceNodeRenderHandler renderHandler)
#else
public AVAudioSourceNode (AVAudioFormat format, AVAudioSourceNodeRenderHandler3 renderHandler)
#endif // XAMCORE_5_0
: this (format, GetHandler (renderHandler))
{
}

#if !NET
static AVAudioSourceNodeRenderHandlerRaw GetHandler (AVAudioSourceNodeRenderHandler renderHandler)
{
AVAudioSourceNodeRenderHandlerRaw rv = (IntPtr isSilence, IntPtr timestamp, uint frameCount, IntPtr outputData) => {
unsafe {
byte* isSilencePtr = (byte*) isSilence;
bool isSilenceBool = (*isSilencePtr) != 0;
AudioTimeStamp timestampValue = *(AudioTimeStamp*) timestamp;
var buffers = new AudioBuffers (outputData);
return renderHandler (isSilenceBool, timestampValue, frameCount, ref buffers);
}
};
return rv;
}
#endif // !NET

#if !XAMCORE_5_0
#if NET
static AVAudioSourceNodeRenderHandlerRaw GetHandler (AVAudioSourceNodeRenderHandler renderHandler)
#else
static AVAudioSourceNodeRenderHandlerRaw GetHandler (AVAudioSourceNodeRenderHandler2 renderHandler)
#endif // NET
{
AVAudioSourceNodeRenderHandlerRaw rv = (IntPtr isSilence, IntPtr timestamp, uint frameCount, IntPtr outputData) => {
unsafe {
byte* isSilencePtr = (byte*) isSilence;
bool isSilenceBool = (*isSilencePtr) != 0;
var buffers = new AudioBuffers (outputData);
var rv = renderHandler (ref isSilenceBool, ref Unsafe.AsRef<AudioTimeStamp> ((void*) timestamp), frameCount, ref buffers);
*isSilencePtr = isSilenceBool.AsByte ();
return rv;
}
};
return rv;
}
#endif // !XAMCORE_5_0

#if XAMCORE_5_0
static AVAudioSourceNodeRenderHandlerRaw GetHandler (AVAudioSourceNodeRenderHandler renderHandler)
#else
static AVAudioSourceNodeRenderHandlerRaw GetHandler (AVAudioSourceNodeRenderHandler3 renderHandler)
#endif // XAMCORE_5_0
{
AVAudioSourceNodeRenderHandlerRaw rv = (IntPtr isSilence, IntPtr timestamp, uint frameCount, IntPtr outputData) => {
unsafe {
byte* isSilencePtr = (byte*) isSilence;
bool isSilenceBool = (*isSilencePtr) != 0;
var buffers = new AudioBuffers (outputData);
var rv = renderHandler (ref isSilenceBool, ref Unsafe.AsRef<AudioTimeStamp> ((void*) timestamp), frameCount, buffers);
*isSilencePtr = isSilenceBool.AsByte ();
return rv;
}
};
return rv;
}
}
}
2 changes: 0 additions & 2 deletions src/AVFoundation/AVCompat.cs
@@ -16,8 +16,6 @@
#nullable enable

namespace AVFoundation {
public delegate int AVAudioSourceNodeRenderHandler (bool isSilence, AudioToolbox.AudioTimeStamp timestamp, uint frameCount, ref AudioToolbox.AudioBuffers outputData);

partial class AVAudioNode {
internal AVAudioNode () { }
}
32 changes: 16 additions & 16 deletions src/avfoundation.cs
@@ -15844,32 +15844,32 @@ interface AVCompositionTrackFormatDescriptionReplacement : NSSecureCoding {
[Export ("replacementFormatDescription")]
CMFormatDescription ReplacementFormatDescription { get; }
}
#if NET
delegate /* OSStatus */ int AVAudioSourceNodeRenderHandler (ref bool isSilence, ref AudioTimeStamp timestamp, uint frameCount, ref AudioBuffers outputData);
#else
delegate /* OSStatus */ int AVAudioSourceNodeRenderHandler (bool isSilence, AudioToolbox.AudioTimeStamp timestamp, uint frameCount, ref AudioBuffers outputData);
delegate /* OSStatus */ int AVAudioSourceNodeRenderHandler2 (ref bool isSilence, ref AudioTimeStamp timestamp, uint frameCount, ref AudioBuffers outputData);
#endif

/// <summary>The delegate that will be called in a callback from <see cref="T:AVFoundation.AVAudioSourceNode" />.</summary>
/// <returns>An OSStatus result code. Return 0 to indicate success.</returns>
/// <param name="isSilence">Indicates whether the supplied audio data contains only silence. This is a pointer to a <see cref="T:System.Byte" /> value.</param>
/// <param name="timestamp">The timestamp at which the audio will render (HAL time). This is a pointer to an <see cref="T:AudioToolbox.AudioTimeStamp" /> value.</param>
/// <param name="frameCount">The number of frames of audio to supply.</param>
/// <param name="outputData">The <see cref="T:AudioToolbox.AudioBuffers" /> that contains the supplied audio data when the callback returns. This is a handle for an <see cref="T:AudioToolbox.AudioBuffers" /> value.</param>
delegate /* OSStatus */ int AVAudioSourceNodeRenderHandlerRaw (IntPtr isSilence, IntPtr timestamp, uint frameCount, IntPtr outputData);

[Watch (6, 0), TV (13, 0), iOS (13, 0)]
[MacCatalyst (13, 1)]
[BaseType (typeof (AVAudioNode))]
[DisableDefaultCtor]
interface AVAudioSourceNode : AVAudioMixing {
/// <summary>Creates an <see cref="T:AVFoundation.AVAudioSourceNode" /> with the specified callback to render audio.</summary>
/// <param name="renderHandler">The callback that will be called to supply audio data.</param>
[Export ("initWithRenderBlock:")]
[DesignatedInitializer]
#if NET
NativeHandle Constructor (AVAudioSourceNodeRenderHandler renderHandler);
#else
NativeHandle Constructor (AVAudioSourceNodeRenderHandler2 renderHandler);
#endif
NativeHandle Constructor (AVAudioSourceNodeRenderHandlerRaw renderHandler);

/// <summary>Creates an <see cref="T:AVFoundation.AVAudioSourceNode" /> with the specified callback to render audio.</summary>
/// <param name="format">The format of the PCM audio data the callback supplies.</param>
/// <param name="renderHandler">The callback that will be called to supply audio data.</param>
[Export ("initWithFormat:renderBlock:")]
[DesignatedInitializer]
#if NET
NativeHandle Constructor (AVAudioFormat format, AVAudioSourceNodeRenderHandler renderHandler);
#else
NativeHandle Constructor (AVAudioFormat format, AVAudioSourceNodeRenderHandler2 renderHandler);
#endif
NativeHandle Constructor (AVAudioFormat format, AVAudioSourceNodeRenderHandlerRaw renderHandler);
}

delegate int AVAudioSinkNodeReceiverHandlerRaw (IntPtr timestamp, uint frameCount, IntPtr inputData);
2 changes: 2 additions & 0 deletions src/bgen/Generator.cs
@@ -4538,6 +4538,8 @@ void RenderDelegates (Dictionary<string, MethodInfo> delegateTypes)
if (shortName.StartsWith ("Func<", StringComparison.Ordinal))
continue;

WriteDocumentation (mi.DeclaringType);

var del = mi.DeclaringType;

if (AttributeManager.HasAttribute (mi.DeclaringType, "MonoNativeFunctionWrapper"))
1 change: 1 addition & 0 deletions src/frameworks.sources
@@ -267,6 +267,7 @@ AVFOUNDATION_SOURCES = \
AVFoundation/AVAudioSessionDataSourceDescription.cs \
AVFoundation/AVAudioSessionPortDescription.cs \
AVFoundation/AVAudioSinkNode.cs \
AVFoundation/AVAudioSourceNode.cs \
AVFoundation/AVCaptureConnection.cs \
AVFoundation/AVCaptureDeviceDiscoverySession.cs \
AVFoundation/AVCaptureDeviceInput.cs \
8 changes: 8 additions & 0 deletions tests/cecil-tests/Documentation.KnownFailures.txt
@@ -40122,6 +40122,14 @@ M:AVFoundation.AVAudioSourceNodeRenderHandler.#ctor(System.Object,System.IntPtr)
M:AVFoundation.AVAudioSourceNodeRenderHandler.BeginInvoke(System.Boolean@,AudioToolbox.AudioTimeStamp@,System.UInt32,AudioToolbox.AudioBuffers@,System.AsyncCallback,System.Object)
M:AVFoundation.AVAudioSourceNodeRenderHandler.EndInvoke(System.Boolean@,AudioToolbox.AudioTimeStamp@,AudioToolbox.AudioBuffers@,System.IAsyncResult)
M:AVFoundation.AVAudioSourceNodeRenderHandler.Invoke(System.Boolean@,AudioToolbox.AudioTimeStamp@,System.UInt32,AudioToolbox.AudioBuffers@)
M:AVFoundation.AVAudioSourceNodeRenderHandler3.#ctor(System.Object,System.IntPtr)
M:AVFoundation.AVAudioSourceNodeRenderHandler3.BeginInvoke(System.Boolean@,AudioToolbox.AudioTimeStamp@,System.UInt32,AudioToolbox.AudioBuffers,System.AsyncCallback,System.Object)
M:AVFoundation.AVAudioSourceNodeRenderHandler3.EndInvoke(System.Boolean@,AudioToolbox.AudioTimeStamp@,System.IAsyncResult)
M:AVFoundation.AVAudioSourceNodeRenderHandler3.Invoke(System.Boolean@,AudioToolbox.AudioTimeStamp@,System.UInt32,AudioToolbox.AudioBuffers)
M:AVFoundation.AVAudioSourceNodeRenderHandlerRaw.#ctor(System.Object,System.IntPtr)
M:AVFoundation.AVAudioSourceNodeRenderHandlerRaw.BeginInvoke(System.IntPtr,System.IntPtr,System.UInt32,System.IntPtr,System.AsyncCallback,System.Object)
M:AVFoundation.AVAudioSourceNodeRenderHandlerRaw.EndInvoke(System.IAsyncResult)
M:AVFoundation.AVAudioSourceNodeRenderHandlerRaw.Invoke(System.IntPtr,System.IntPtr,System.UInt32,System.IntPtr)
M:AVFoundation.AVAudioStereoMixing.#ctor
M:AVFoundation.AVAudioStereoMixing.#ctor(Foundation.NSObjectFlag)
M:AVFoundation.AVAudioStereoMixing.#ctor(ObjCRuntime.NativeHandle)

3 comments on commit dc4b11f

@vs-mobiletools-engineering-service2
Collaborator

💻 [CI Build] Tests on macOS X64 - Mac Sonoma (14) passed 💻

All tests on macOS X64 - Mac Sonoma (14) passed.


@vs-mobiletools-engineering-service2

This comment was marked as outdated.

@vs-mobiletools-engineering-service2
Collaborator

✅ API diff for current PR / commit

Legacy Xamarin (No breaking changes)
.NET (No breaking changes)

✅ API diff vs stable

Legacy Xamarin (No breaking changes)
.NET (No breaking changes)
Legacy Xamarin (stable) vs .NET

ℹ️ Generator diff

Generator Diff: vsdrops (html), vsdrops (raw diff), gist (raw diff). Please review changes.

Pipeline on Agent
Hash: dc4b11fb3cfdccee99e13f6b962ba348c292d3af [CI build]
