
CoreAudio on iOS lists no input devices #842

Closed
tGrothmannFluffy opened this issue Feb 20, 2024 · 9 comments
@tGrothmannFluffy

tGrothmannFluffy commented Feb 20, 2024

Hi there,

thanks for this awesome crate!

I am running into an issue on iOS (debug and release builds, on an iPhone and in the simulator). The project is a Flutter app using flutter_rust_bridge. Running this code:

use cpal::traits::HostTrait;

let hosts = cpal::available_hosts();
log_string(format!("Available hosts: {:?}", hosts));

let host = cpal::default_host();
log_string(format!("default host: {:?}", host.id()));

log_string(format!(
    "Host num input devices: {:?}",
    host.input_devices().unwrap().count()
));

log_string(format!(
    "Host num output devices: {:?}",
    host.output_devices().unwrap().count()
));

logs:

Available hosts: [CoreAudio]
default host: CoreAudio
Host num input devices: 0
Host num output devices: 1

I also get this error:

[aurioc] AURemoteIO.cpp:1151 failed: -10851 (enable 1, outf< 2 ch, 0 Hz, Float32, deinterleaved> inf< 2 ch, 0 Hz, Float32, deinterleaved>)


When I try to access host.default_input_device(), it results in:
Could not get default input config: BackendSpecific { err: BackendSpecificError { description: "Invalid property value" } }

Same when I try to access device.supported_input_configs().unwrap():
called `Result::unwrap()` on an `Err` value: BackendSpecific { err: BackendSpecificError { description: "Invalid property value" } }


Microphone permissions are granted via the permission_handler plugin.

@tGrothmannFluffy
Author

Update:

When I get the audio devices in Dart using audio_session:

final session = await AudioSession.instance;
List<AudioDevice> audioDevices = (await session.getDevices()).toList();

for (var device in audioDevices) {
  print("Device: ${device.name}, input: ${device.isInput}, type: ${device.type}");
}

the mic is found:

Device: MicrophoneBuiltIn, input: true, type: AudioDeviceType.builtInMic
Device: Speaker, input: false, type: AudioDeviceType.builtInSpeaker

@tGrothmannFluffy tGrothmannFluffy changed the title CoreAudio on iOS (running on phone) lists no input devices CoreAudio on iOS lists no input devices Feb 21, 2024
@simlay
Member

simlay commented Feb 21, 2024

I think this is an Info.plist issue. Through a bit of work, I was able to get Host num input devices: 1 on a device.

I had to add microphone to the list of UIRequiredDeviceCapabilities. I also added NSMicrophoneUsageDescription, but I think you may already have that one, as it's mentioned in the permission_handler docs. Once I did that, the "This app wants to use your microphone" modal popped up.
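For reference, the two Info.plist entries described above would look something like this (the usage-description string is a placeholder you'd replace with your own wording):

```xml
<key>NSMicrophoneUsageDescription</key>
<string>This app needs the microphone to record audio.</string>
<key>UIRequiredDeviceCapabilities</key>
<array>
    <string>microphone</string>
</array>
```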

Note: The same Info.plist doesn't change the result on my iOS simulator (Host num input devices: 0) but I've got other things to work on.

Hope that helps.

@tGrothmannFluffy
Author

Thanks for the investigation!

I added

<key>UIRequiredDeviceCapabilities</key>
<array>
   <string>microphone</string>
</array>

to ios/Runner/Info.plist, but unfortunately that didn't fix it.

It might be a permissions issue, but the microphone permission modal pops up and the Flutter side has microphone permissions. Maybe the Rust library it links doesn't 🤔

@tGrothmannFluffy
Author

I'm still investigating this issue and have some news.
I've tested several simulator versions, and the problem starts with iOS 17.0:

iPhone  8 on iOS 15.0 - works
iPhone 12 on iOS 16.0 - works
iPhone 14 on iOS 16.4 - works
iPhone 11 on iOS 17.0 - doesn't work
iPhone 14 on iOS 17.0 - doesn't work
iPhone 15 on iOS 17.2 - doesn't work

@yury

yury commented Feb 27, 2024

@tGrothmannFluffy you may also need to configure and activate an AVAudioSession. Set the category to playAndRecord.

@tGrothmannFluffy
Author

Oh goodness gracious!
Your comment pointed me to the right solution. It worked after I added:

import AVFAudio

...

#if os(iOS)
let audio_session = AVAudioSession.sharedInstance()
do {
    try audio_session.setCategory(.playAndRecord)
    try audio_session.setActive(true)
} catch {
    print(error)
}
#endif

into AppDelegate.swift.

@yury

yury commented Feb 27, 2024

@tGrothmannFluffy I don't know your app, but if you want to play nicely with other apps running in the background, it is better to activate your audio session on demand rather than at app start in the AppDelegate. Otherwise, for instance, Apple Music will stop playing as soon as your app starts, and your users may not like that behavior.

@tGrothmannFluffy
Author

Thanks for the help!
Yes, that makes absolute sense. It's a Flutter app (Dart) using a Rust library for audio (via flutter_rust_bridge). It would be best to activate the session from Rust, but unfortunately I don't currently know how to do that outside of Swift code.
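For what it's worth, here is an untested sketch of how the session might be activated from Rust via the `objc` crate, mirroring the Swift snippet above. The selector names (`sharedInstance`, `setCategory:error:`, `setActive:error:`) are the standard AVAudioSession ones; error handling is omitted for brevity:

```rust
// Nul-terminated category name. The AVAudioSession category constants have
// runtime values equal to their names, so the NSString can be built directly
// instead of linking the AVFAudio symbol.
const PLAY_AND_RECORD: &[u8] = b"AVAudioSessionCategoryPlayAndRecord\0";

#[cfg(target_os = "ios")]
fn activate_audio_session() {
    use objc::runtime::{Object, BOOL, YES};
    use objc::{class, msg_send, sel, sel_impl};

    unsafe {
        // [AVAudioSession sharedInstance]
        let session: *mut Object = msg_send![class!(AVAudioSession), sharedInstance];
        // [NSString stringWithUTF8String:"AVAudioSessionCategoryPlayAndRecord"]
        let category: *mut Object =
            msg_send![class!(NSString), stringWithUTF8String: PLAY_AND_RECORD.as_ptr()];
        // [session setCategory:category error:nil]; [session setActive:YES error:nil]
        let err: *mut *mut Object = std::ptr::null_mut();
        let _: BOOL = msg_send![session, setCategory: category error: err];
        let _: BOOL = msg_send![session, setActive: YES error: err];
    }
}

#[cfg(not(target_os = "ios"))]
fn activate_audio_session() {
    // No-op on other targets.
}
```

Calling this once before opening a cpal stream should be equivalent to the AppDelegate approach, while still allowing the on-demand activation yury suggested.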

@tGrothmannFluffy
Author

tGrothmannFluffy commented Feb 27, 2024

OK, it turns out that using audio_session in Dart/Flutter works:

final session = await AudioSession.instance;
await session.configure(const AudioSessionConfiguration.music().copyWith(
  avAudioSessionCategory: AVAudioSessionCategory.playAndRecord,
));
await session.setActive(true);

But I wonder, isn't this something cpal should do when opening a stream?


3 participants