AKOfflineRenderNode doesn't render any sound. #1096
Comments
Arg. I'll look into it. I made the offline render node to address pre-iOS 11 offline rendering. Maybe it needs to be altered to use the new AVAudioEngine offline rendering. |
Well I didn't troubleshoot the issue in your example, but I was able to test the offline render node on my 6s running 11.0.3 and it still works. The test I used was the example iOS project SongProcessor. Keep us in the loop on your progress with this, please. It will be good to pinpoint the OS related changes so we can better generalize and document things. |
@dave234 thank you for investigating the issue. I used code from SongProcessor to build my app, so I downloaded the latest source code from GitHub (4.0.2) and built the AudioKit framework from source. Then I ran the SongProcessor example on an iPhone 6 / iOS 11 simulator and got this strange silent file (mixDown.zip). My Xcode version: 9.0 (9A235). Please let me know if you need any additional details. |
So just to make sure I understand correctly. The silent file was produced from the SongProcessor example? |
Yep, I entered 2 as the number of loops and then uploaded the resulting file from: /Users/Yuri/Library/Developer/CoreSimulator/Devices/E508F7BA-A400-4B13-A47B-E6512CBCD431/data/Containers/Data/Application/DFF0F2FC-15C3-4AC2-91B6-520DC4C5ED92/tmp/mixDown.m4a
Can you run SongProcessor on your iPad?
|
Unfortunately I will be able to run SongProcessor on iPad only tomorrow.
|
Hi, I have a very similar issue on iOS 11 (not tested on iOS 10), but in my case, the file rendered by […]. If I do something like […]. Happy to post some code if that would be helpful. |
I've just tried to increase the duration […] |
In my case, the duration of the silent part relative to the duration of the part with signal seems to vary depending on the length of the original file. When […]. For a shorter file (14 seconds) I have to render […]. |
@dave234, instead of the iPad I ran SongProcessor on an iPhone 6 with iOS 11.0.3 (15A432) and then shared the result to my Mac via AirDrop. I still got a silent file: mixDown-iPhone6.zip. Also, I disabled the "Inter-App Audio" and Background Modes capabilities, since they require registering a separate bundle ID to run the app on real devices. But I believe those are used for playback and communication with iTunes, so they shouldn't be the root cause of the reported problem. |
@dave234, in the meantime I decided to implement a solution based on AVAudioEngine. Details at: https://developer.apple.com/videos/play/wwdc2017/501/ Here is rough code for an offline renderer:
Here is an example of usage:
I've tested it on the simulator and the problematic iPhone and got correctly processed recordings on both. I found two inconveniences with this approach: the app must target iOS 11+, and the render process is synchronous. Could you please review my code and help fit it into the AudioKit architecture? I'm not sure how it should be integrated. Any help making this rough code more production-ready is also appreciated. Ideally we can turn it into a new feature pull request. |
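(The Swift code blocks from the comment above were lost when this thread was archived. As a stand-in, here is a minimal sketch of what iOS 11 manual offline rendering with AVAudioEngine looks like, per the WWDC 2017 session linked above. The function name and structure are illustrative, not the reporter's original code.)

```swift
import AVFoundation

// Illustrative sketch: render `seconds` of the engine's output to a file
// using the iOS 11 manual rendering mode.
@available(iOS 11.0, *)
func renderOffline(engine: AVAudioEngine, to fileURL: URL, seconds: Double) throws {
    let format = engine.mainMixerNode.outputFormat(forBus: 0)
    let maxFrames: AVAudioFrameCount = 4096

    engine.stop() // manual rendering mode can only be enabled on a stopped engine
    try engine.enableManualRenderingMode(.offline,
                                         format: format,
                                         maximumFrameCount: maxFrames)
    try engine.start()

    let buffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                                  frameCapacity: engine.manualRenderingMaximumFrameCount)!
    let outputFile = try AVAudioFile(forWriting: fileURL, settings: format.settings)

    let totalFrames = AVAudioFramePosition(seconds * format.sampleRate)
    while engine.manualRenderingSampleTime < totalFrames {
        let remaining = totalFrames - engine.manualRenderingSampleTime
        let framesToRender = min(AVAudioFrameCount(remaining), buffer.frameCapacity)
        let status = try engine.renderOffline(framesToRender, to: buffer)
        switch status {
        case .success:
            try outputFile.write(from: buffer)
        case .insufficientDataFromInputNode, .cannotDoInCurrentContext:
            continue // not fatal; try again on the next pass
        case .error:
            throw NSError(domain: "OfflineRender", code: -1)
        @unknown default:
            break
        }
    }

    engine.stop()
    engine.disableManualRenderingMode()
}
```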
This is great! Thank you for contributing! There's a bit of an API flaw on my part in the implementation of the offline render node. I made it for backward compatibility with < iOS 11.0, but didn't create an API where the implementation could be swapped. My thinking was that a separate node would be preferable so that audio could continue to play on the rest of the engine while a single "path" could be manually rendered. This has proven to be a mistake. So the biggest inconsistency between the AVAudioEngine manual rendering and my implementation is the use of a separate node rather than rendering the output node. I think if you are going to implement offline rendering using the new […]. However, if you want to try to make the offline render node work on >= iOS 11.0, I think a different approach is necessary (hacks). Since I am not able to reproduce the behavior locally on my 6s, do you think you could try something for me? Just after the call […] |
@dave234, thank you for feedback!
I absolutely agree with you. I will create an extension for the AudioKit class.
Please don’t rush. Something “mystical” happens around offline rendering. When I started developing my app I had the same problem on my iPhone 6s.
I understood the idea of your hack. I will check it on the problematic iPhone 6 and let you know the results.
Are you sure? |
I tried to enable manual rendering mode for SongProcessor and tested on an iPhone 6s on iOS 11.
I saw the following error: […] Then I added engine stop/start calls: […]
But it failed on […]. Do you have other scenarios to try? |
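(The error text and code from the comment above were lost. For reference, a guess at the call ordering that avoids the engine-running error when enabling manual rendering; `AudioKit.engine` is the global engine AudioKit 4 exposes, and the format/frame-count values here are placeholders.)

```swift
import AudioKit
import AVFoundation

// Manual rendering mode can only be enabled while the engine is stopped,
// so the ordering matters: stop, enable, then restart.
let format = AudioKit.engine.mainMixerNode.outputFormat(forBus: 0)
AudioKit.engine.stop()
try AudioKit.engine.enableManualRenderingMode(.offline,
                                              format: format,
                                              maximumFrameCount: 4096)
try AudioKit.engine.start()
```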
Thanks so much for helping with this.
My thinking is that if >= iOS 11, we use the new AVAudioEngine manual rendering under the hood, and if < iOS 11, we use the (previously working) node approach, but devise a common interface to both of those methods (e.g. AudioKit.renderToURL(url, seconds: 123.45)).
In AKOfflineRenderNode.m, could you try setting pullFlags to kAudioOfflineUnitRenderAction_Render? It's a bit of a long shot, but maybe worth a try. |
@dave234 sorry for the delay. As you recommended, I set: […]
Unfortunately it didn't give any positive result. SongProcessor's recording is still silent. |
@dave234 do you have any updates or code fixes to try? |
@dave234 in the meantime I tried testing SongProcessor with Xcode 9.1 (9B55) on iOS 11.1. Unfortunately, the result is the same: I still get silent audio. I also tested the solution based on AVAudioEngine's enableManualRenderingMode, but it doesn't work as expected. I can render (export) a recording only once; when I start a new recording, AKNodeRecorder doesn't record any sound, and I don't see any error in the log. It looks like there is no reliable way to render sound offline on iOS 11. |
Ok I've got a working offline render function implemented as AudioKit.renderToFile, and I've deprecated AKOfflineRenderNode. Thanks for finding this. |
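(For readers landing here later, a sketch of what using the replacement API might look like. The exact signature of AudioKit.renderToFile may differ between AudioKit versions, and `player` stands in for whatever node you are playing; treat this as an assumption-laden example, not the canonical usage.)

```swift
import AudioKit
import AVFoundation

// Sketch: render 10 seconds of the current AudioKit signal chain to an m4a file.
let url = FileManager.default.temporaryDirectory.appendingPathComponent("mixDown.m4a")
let settings: [String: Any] = [AVFormatIDKey: kAudioFormatMPEG4AAC,
                               AVSampleRateKey: 44_100,
                               AVNumberOfChannelsKey: 2]
let outputFile = try AVAudioFile(forWriting: url, settings: settings)

if #available(iOS 11.0, *) {
    // `player` is assumed to be an existing playback node wired into AudioKit.output.
    try AudioKit.renderToFile(outputFile, duration: 10.0) {
        player.play() // prerender closure: start whatever should be captured
    }
}
```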
I do think it is handy to have the offline rendering NOT related to the main audio path. What if I want to render something different offline in the background? |
@eljeff I’m with you on this. It’s a bit of work though. Currently AudioKit uses a global AVAudioEngine; I believe we will need to make all AudioKit functions pass around an audio engine if we want to achieve this. We could default to a global engine if the audioEngine argument is missing. I also think it’s totally worth doing. Not only would we have more control over offline rendering; we would also gain the ability to let AudioKit act as a “slave” engine. For my projects it’s worth it to bounce tracks without stopping playback. I have some thoughts on how to do this; I’d love to hear yours. |
I actually figured out a hack.... Seems to work! |
I'm curious; do you know if this would work with playback from AKSequencer (i.e., MIDI)? |
It would be good if you could share a demo project that records voice + a reverb effect, saves the filtered file, and then shares it. |
No, you can't right now but we're working on it! |
Except the share part, this is what RecorderDemo does. |
Hi. Thanks for your response, it worked. Now I need your help again: I have to add an echo effect. In Swift I tested a few snippets; we use this to add echo: […]
But how can we add echo with AudioKit? |
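(One way to get an echo in AudioKit is the AKDelay node; a minimal sketch, assuming `player` is an existing playback node such as an AKAudioPlayer, and using the AudioKit 4 parameter names.)

```swift
import AudioKit

// Echo via AKDelay: repeats with feedback, mixed with the dry signal.
// `player` is an assumed, pre-existing playback node.
let delay = AKDelay(player,
                    time: 0.4,             // seconds between repeats
                    feedback: 0.5,         // 0–1: how much of each repeat feeds back
                    lowPassCutoff: 15_000, // Hz: darken successive repeats
                    dryWetMix: 0.5)        // 0 = dry only, 1 = wet only
AudioKit.output = delay
try AudioKit.start()
```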
Reporting that I have the same problem with the SongProcessor demo. When I export the default app-generated guitar samples, the result is a silent file. Built from Xcode 9.4.1 onto an iPad Mini with iOS 9.3. I'm trying it because I can't make offline render work (I need to support iOS < 11). Can anyone dealing with the same problem shed some light on how to proceed? |
Is there any progress to report on offline rendering of MIDI? |
I used AKOfflineRenderNode to process recorded voice. The idea is to record voice and save it to a file, then apply some effects to that file and save the combined voice + effects to a new file.
I get a successful result on my iPhone 6s running iOS 11.0.3, but on other devices running iOS 11+ (for example, the latest-generation iPad on 11.0.3) I always get a "silent" file of the same size, about 60 KB. I attached one for reference (rec.zip). The issue is also 100% reproducible on the iOS 11 simulator.
Here is my setup (I removed the effects setup, since offline render doesn't work even on a clean recording):
Here is the export method:
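(The setup and export code blocks from this issue were lost in the archive. As an illustration only, the AKOfflineRenderNode pattern from that era looked roughly like the following; every name here is a reconstruction from memory of the SongProcessor example, not the reporter's actual code.)

```swift
import AudioKit
import Foundation

// Illustrative setup: play a recorded file through an AKOfflineRenderNode.
let file = try AKAudioFile(readFileName: "rec.m4a")
let player = try AKAudioPlayer(file: file)
let offlineRender = AKOfflineRenderNode(player)
AudioKit.output = offlineRender
try AudioKit.start()

// Illustrative export: switch the node to offline rendering and pull audio to a file.
let url = FileManager.default.temporaryDirectory.appendingPathComponent("processed.m4a")
offlineRender.internalRenderEnabled = false // stop passing audio to hardware
player.play()
try offlineRender.renderToURL(url, seconds: player.duration)
offlineRender.internalRenderEnabled = true  // restore normal playback
```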