RustAudio / cpal

Cross-platform audio I/O library in pure Rust

CoreAudio on iOS lists no input devices

tGrothmannFluffy opened this issue

Hi there,

thanks for this awesome crate!

I am running into an issue on iOS (debug and release builds, on an iPhone and on the simulator). The project is a Flutter app using flutter_rust_bridge. Running this code:

let hosts = cpal::available_hosts();
log_string(format!("Available hosts: {:?}", hosts));

let host = cpal::default_host();
log_string(format!("default host: {:?}", host.id()));

log_string(format!(
    "Host num input devices: {:?}",
    host.input_devices().unwrap().count()
));

log_string(format!(
    "Host num output devices: {:?}",
    host.output_devices().unwrap().count()
));

logs:

Available hosts: [CoreAudio]
default host: CoreAudio
Host num input devices: 0
Host num output devices: 1

I also get this error:

[aurioc] AURemoteIO.cpp:1151 failed: -10851 (enable 1, outf< 2 ch, 0 Hz, Float32, deinterleaved> inf< 2 ch, 0 Hz, Float32, deinterleaved>)


When I try to access host.default_input_device(), it results in:
Could not get default input config: BackendSpecific { err: BackendSpecificError { description: "Invalid property value" } }

Same when I try to access device.supported_input_configs().unwrap():
called `Result::unwrap()` on an `Err` value: BackendSpecific { err: BackendSpecificError { description: "Invalid property value" } }
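Side note: since these calls can fail on iOS, it may be safer to match on the error and log it instead of calling `.unwrap()`, which panics as shown above. A minimal, stdlib-only sketch of that pattern — `ConfigError` and `supported_input_configs()` below are hypothetical stand-ins for cpal's error type and device API, not the real ones:

```rust
// Hypothetical error type mirroring the shape of cpal's
// BackendSpecific error variant (for illustration only).
#[derive(Debug)]
enum ConfigError {
    BackendSpecific { description: String },
}

// Stand-in for `device.supported_input_configs()`: here it always
// fails the way the iOS backend does in this issue.
fn supported_input_configs() -> Result<Vec<u32>, ConfigError> {
    Err(ConfigError::BackendSpecific {
        description: "Invalid property value".to_string(),
    })
}

fn main() {
    // Log the failure instead of panicking via `.unwrap()`.
    match supported_input_configs() {
        Ok(configs) => println!("{} input configs", configs.len()),
        Err(ConfigError::BackendSpecific { description }) => {
            eprintln!("Could not query input configs: {description}");
        }
    }
}
```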


Microphone permissions are granted via permission_handler.

Update:

When I get the audio devices in Dart using audio_session:

final session = await AudioSession.instance;
List<AudioDevice> audioDevices = (await session.getDevices()).toList();

for (var device in audioDevices) {
  print("Device: ${device.name}, input: ${device.isInput}, type: ${device.type}");
}

the mic is found:

Device: MicrophoneBuiltIn, input: true, type: AudioDeviceType.builtInMic
Device: Speaker, input: false, type: AudioDeviceType.builtInSpeaker

I think this is an Info.plist issue. With a bit of work, I was able to get Host num input devices: 1 on a physical device.

I had to add microphone to the UIRequiredDeviceCapabilities list. I also added NSMicrophoneUsageDescription, but you may already have that one, as it's mentioned in permission_handler. Once I did that, the "This app wants to use your microphone" modal popped up.
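For reference, the two Info.plist entries together look like this (the usage-description string below is just an example; use your own wording):

```xml
<key>NSMicrophoneUsageDescription</key>
<string>This app records audio from the microphone.</string>
<key>UIRequiredDeviceCapabilities</key>
<array>
    <string>microphone</string>
</array>
```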

Note: the same Info.plist doesn't change the result on my iOS simulator (still Host num input devices: 0), but I've got other things to work on.

Hope that helps.

Thanks for the investigation!

I added

<key>UIRequiredDeviceCapabilities</key>
<array>
   <string>microphone</string>
</array>

to ios/Runner/Info.plist but unfortunately that didn't fix it.

It might be a permissions issue, but the microphone permission modal pops up and the Flutter app has microphone permissions. Maybe the Rust library it links just doesn't 🤔

I'm still investigating this issue, and I have some news.
I've tested several versions on simulators, and the problem starts with iOS 17.0:

iPhone 8  on iOS 15.0 - works
iPhone 12 on iOS 16.0 - works
iPhone 14 on iOS 16.4 - works
iPhone 11 on iOS 17.0 - doesn't work
iPhone 14 on iOS 17.0 - doesn't work
iPhone 15 on iOS 17.2 - doesn't work

@tGrothmannFluffy you may also need to configure and activate an AVAudioSession. Set the category to playAndRecord.

Oh goodness gracious!
Your comment pointed me to the right solution. It worked after I added:

import AVFAudio

...

#if os(iOS)
let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(.playAndRecord)
    try audioSession.setActive(true)
} catch {
    print(error)
}
#endif

into AppDelegate.swift.

@tGrothmannFluffy I don't know your app, but if you want to play nicely with other apps that are currently in the background, it is better to activate your audio session on demand rather than at app start in the AppDelegate. Otherwise, for instance, Apple Music will stop playing when your app starts, and your users may not like that behavior.

Thanks for the help!
Yes, that makes absolute sense. It's a Flutter app (Dart) using a Rust library for audio (via flutter_rust_bridge). It would be best to activate the session from Rust, but unfortunately I don't currently know how to activate the session outside of Swift code.

Ok, turns out using audio_session in Dart/Flutter works:

final session = await AudioSession.instance;
await session.configure(const AudioSessionConfiguration.music().copyWith(
  avAudioSessionCategory: AVAudioSessionCategory.playAndRecord,
));
await session.setActive(true);

But I wonder, isn't this something cpal should do when opening a stream?