
AudioPlayer buffer samplerate mismatch #2794

Open
renezuidhof opened this issue Nov 1, 2022 · 10 comments
@renezuidhof

renezuidhof commented Nov 1, 2022

macOS Version(s) Used to Build

macOS 12 Monterey

Xcode Version(s)

Xcode 14

Description

The sample rate of the outputFormat of the playerNode in an AudioPlayer does not seem to match the buffer being used. I don't have any issues when using a non-buffered AudioPlayer:

func load(_ audioFile: AVAudioFile) {
  // force-try used only for brevity
  try! audioPlayer.load(file: audioFile, buffered: true)
}

This results in slowed-down audio because the outputFormat of the playerNode is 44100 Hz while the file has a sample rate of 48000 Hz (on my device; in the simulator it works fine).

In AVFAudio.AVAudioPlayerNode there's a comment mentioning: "When playing buffers, there is an implicit assumption that the buffers are at the same sample rate as the node's output format."

I've read somewhere that setting the audioFormat fixes this kind of issue, but it does not work:

Settings.audioFormat = AVAudioFormat(standardFormatWithSampleRate: 48000, channels: 2) ?? AVAudioFormat()

I noticed there's a check on format in the AudioPlayer:
https://github.com/AudioKit/AudioKit/blob/main/Sources/AudioKit/Nodes/Playback/AudioPlayer/AudioPlayer.swift#L257

But when loading the first file, this check is never reached. Therefore the makeInternalConnections function on line 273 will not be called.

I've created an extension on AudioPlayer that reconnects the player node. This solves the issue.

extension AudioPlayer {
    public func reconnectPlayerNode() {
        guard let engine = mixerNode.engine else { return }

        engine.disconnectNodeInput(playerNode)

        if playerNode.engine == nil {
            engine.attach(playerNode)
        }

        // Reconnect using the loaded file's processing format so the
        // node's output format matches the buffer's sample rate.
        engine.connect(playerNode, to: mixerNode, format: file?.processingFormat)
    }
}
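For illustration, a hedged usage sketch of the workaround (assuming `audioPlayer` is an AudioKit AudioPlayer already attached to a running engine, and `reconnectPlayerNode()` as defined above):

```swift
import AVFoundation

// Hypothetical usage of the workaround above. `audioPlayer` is assumed
// to be an AudioKit AudioPlayer already wired into a running engine.
func loadAndPlay(_ url: URL) throws {
    let file = try AVAudioFile(forReading: url)
    try audioPlayer.load(file: file, buffered: true)
    audioPlayer.reconnectPlayerNode() // force the format to match the file
    audioPlayer.play()
}
```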

Is there something I'm missing so I don't have to reconnect? Or is this the way to go for now?

Crash Logs, Screenshots or Other Attachments (if applicable)

No response

@renezuidhof renezuidhof added the bug label Nov 1, 2022
@aure
Member

aure commented Nov 11, 2022

Your solution does not seem bad, though it would be nice if it just worked. If you come up with a change that fixes it from within AudioKit, make a PR and I'll release a version with the fix.

@renezuidhof
Author

The current AudioPlayer is full of ifs for buffered and non-buffered behaviour, and it has unexpected behaviour as mentioned here: #2788

I am thinking about writing my own BufferedAudioPlayer: one AudioPlayer for 'normal' playback and one for buffered playback. This might fix these issues and prevent similar ones in the future.

Any thoughts on making this change in AudioKit? If so, I can see if I can find some time to build it and open a PR.

@aure
Member

aure commented Nov 13, 2022

I think breaking it up into its own thing is a great idea, would welcome a PR to this end.

@themailman05

@renezuidhof Happy to collaborate with you on this, I would like a BufferedAudioPlayer too. Thank you for the workaround!

@renezuidhof
Author

renezuidhof commented Jan 8, 2023

I've actually only just started creating a buffered audio player for my own application. I hope to move it into AudioKit when it's done.

So far I've been able to work with the current AudioPlayer available in AudioKit. For a new feature I need offline rendering, which gives me some problems since I cannot use the seek function, and the workaround I posted here (#2788) only works for a specific point in time.

I'm not sure how long it will take, but hopefully I can get around to it soon!

EDIT:
(Sorry for mixing two issues.)
I just read this in the AVAudioPlayerNode comments, so I guess my problem could be fixed using `lastRenderTime`:

> OFFLINE RENDERING
> When a player node is used with the engine operating in the manual rendering mode, the
> buffer/file completion handlers, `lastRenderTime` and the latencies (`latency` and
> `outputPresentationLatency`) can be used to track how much data the player has rendered and
> how much more data is left to render.
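As a hedged sketch (untested) of what that header comment suggests: during manual rendering you can compare the engine's render position against the total frame count, and use `lastRenderTime` converted to player time to see how far into the scheduled file the node has rendered. `engine`, `playerNode`, and `totalFrames` are assumptions here: an AVAudioEngine already put into offline manual rendering mode, an attached AVAudioPlayerNode with a file scheduled, and the file's length in frames.

```swift
import AVFoundation

// Sketch: track player progress while rendering offline.
// Assumes `engine` is in .offline manual rendering mode and a file of
// `totalFrames` frames has been scheduled on `playerNode`.
func renderOffline(engine: AVAudioEngine,
                   playerNode: AVAudioPlayerNode,
                   totalFrames: AVAudioFramePosition) throws {
    guard let buffer = AVAudioPCMBuffer(
        pcmFormat: engine.manualRenderingFormat,
        frameCapacity: engine.manualRenderingMaximumFrameCount) else { return }

    playerNode.play()
    while engine.manualRenderingSampleTime < totalFrames {
        let remaining = totalFrames - engine.manualRenderingSampleTime
        let frames = min(AVAudioFrameCount(remaining), buffer.frameCapacity)
        let status = try engine.renderOffline(frames, to: buffer)
        guard status == .success else { break }

        // lastRenderTime is in the node's own render timeline; convert
        // to player time to see how far into the scheduled file we are.
        if let nodeTime = playerNode.lastRenderTime,
           let playerTime = playerNode.playerTime(forNodeTime: nodeTime) {
            print("rendered \(playerTime.sampleTime) of \(totalFrames) frames")
        }
    }
}
```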

@emurray2
Member

> I actually only just started on creating a buffered audioplayer for my own application. I hope to move this to AudioKit when it is done.
>
> So far I could work with the current AudioPlayer available in AudioKit. For a new feature I need offline rendering, this gives me some problems since I cannot use the seek function and the workaround I posted here (#2788) only works with a specific point in time.
>
> I'm not sure how long it will take, but hopefully I can get around to it soon!
>
> EDIT:
> (Sorry for mixing two issues)
> I just read this in the AVAudioPlayerNode comments. So I guess my problem could be fixed using the lastRenderTime.

It's all good. You might want to take a look at the v6 branch of AudioKit. AudioPlayer was removed, but I'm sure your addition of a BufferedAudioPlayer will be greatly appreciated, so long as it doesn't use AVAudioEngine. I think the new branch is starting to move away from it.

@renezuidhof
Author

Thanks for the update! I just checked out the v6 branch. Goals like parallel audio rendering and getting rid of AVAudioEngine look promising!

What would be the alternative for AudioPlayer in v6? Is it something that still needs to be created (like a BufferedAudioPlayer)? I see a Sampler, but I don't think it is intended as a normal audio player with seeking and editing functionality.

I was planning on creating the BufferedAudioPlayer with AVAudioEngine, but I'll hold off on that for now and use the existing AudioPlayer. In the meantime I will check out v6 to get up to speed and hopefully create something like a BufferedAudioPlayer at some point.

@emurray2
Member

We could create our own audio player audio unit. I think you could still use things like AVAudioPCMBuffer to store the buffer.

@emurray2
Member

Update: it sounds like Sampler is going to be the new audio player in v6, so we should take a look at its code and maybe put our efforts there if needed. If you want a quick patch for v5, though, I'd suggest using your current BufferedAudioPlayer if it works the way you want.

@NickCulbertson
Member

Setting the AudioPlayer's initial file/buffer after the engine's output is set seems to make the format update correctly.
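A minimal sketch of that ordering (assuming AudioKit v5's AudioEngine/AudioPlayer API; `audioFile` is a placeholder AVAudioFile, not from the original report):

```swift
import AudioKit
import AVFoundation

// Sketch: wire and start the engine's output chain before loading the
// file, so the player node picks up the file's processing format.
let engine = AudioEngine()
let player = AudioPlayer()
engine.output = player                            // 1. set the output first
try engine.start()                                // 2. start the engine
try player.load(file: audioFile, buffered: true)  // 3. then load the file
player.play()
```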
