AudioKit / AudioKit

Audio synthesis, processing, & analysis platform for iOS, macOS and tvOS

Home Page: http://audiokit.io

input bus 0 sample rate is 0

Samigos opened this issue · comments

macOS Version(s) Used to Build

macOS 13.5 Ventura

Xcode Version(s)

Xcode 15.0.1

AudioKit Version

5.6.1

Description

I have a user who claims that every time she tries to record her voice using Lightning earphones, the recording is always silent. Now I've finally ended up in the same situation myself, although I'm not using any earphones.

Here's my code:

import SwiftUI
import AudioKit
import AudioKitEX
import SoundpipeAudioKit
import AVFoundation

class PitchCorrectingRecorderAudioKitService {
    private let engine = AudioEngine()
    private var pitchShiftEffect: PitchShifter!
    private var silencer: Fader!
    private let mixer = Mixer()
    
    // ------------------
    
    private var unprocessedAudioFile: AVAudioFile?
    
    private let audioSettings: [String : Any] = [
        AVFormatIDKey: Int(kAudioFormatLinearPCM),
        AVSampleRateKey: 44_100,
        AVNumberOfChannelsKey: 2,
        AVLinearPCMIsBigEndianKey: 0, 
        AVLinearPCMBitDepthKey: 16,
        AVLinearPCMIsNonInterleaved: 1,
        AVLinearPCMIsFloatKey: 1,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ]
    
    // ------------------
    
    init() {
        initialize()
    }
    
    private func initialize() {
        guard let input = engine.input else { return }
        
        pitchShiftEffect = PitchShifter(input)
        
        silencer = Fader(pitchShiftEffect, gain: 0)
        mixer.addInput(silencer)
        
        engine.output = mixer
    }
    
    private func setUpUnprocessedAudioFile() {
        let outputURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("unprocessedAudioFile\(Date().timeIntervalSince1970)").appendingPathExtension("caf")
        unprocessedAudioFile = try! AVAudioFile(forWriting: outputURL, settings: audioSettings)
    }
    
    func start(baseNote: Note) {
        setUpUnprocessedAudioFile()
        
        // ------------------
        
        engine.input?.avAudioNode.installTap(onBus: 0, bufferSize: 4096, format: nil) { [weak self] buffer, time in
            try! self?.unprocessedAudioFile?.write(from: buffer)
        }
        
        // ------------------
        
        do {
            try engine.start()
        } catch {
            print(error)
        }
    }
}

Here are the logs:

[aurioc]            AURemoteIO.cpp:1139  failed: -10851 (enable 1, outf< 2 ch,      0 Hz, Float32, deinterleaved> inf< 2 ch,      0 Hz, Float32, deinterleaved>)
[mcmx]  AUMultiChannelMixer3.cpp:344   input bus 0 sample rate is 0
2023-10-25 22:44:51.848915+0300 popster[3376:406286] [avae]            AVAEInternal.h:109   [AVAudioEngineGraph.mm:1389:Initialize: (err = AUGraphParser::InitializeActiveNodesInOutputChain(ThisGraph, kOutputChainOptimizedTraversal, *GetOutputNode(), isOutputChainActive)): error -10875

For some reason, this has now started happening to me as well, and it keeps happening! Any idea what's going on?

Do you get the same result using the Cookbook's recorder example?
https://github.com/AudioKit/Cookbook/blob/main/Cookbook/CookbookCommon/Sources/CookbookCommon/Recipes/MiniApps/Recorder.swift
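
For comparison, a minimal sketch of a NodeRecorder-based setup roughly along the lines of that recipe (the class name and wiring here are illustrative, not the exact Cookbook code):

import AudioKit
import AudioKitEX
import AVFoundation

class SimpleRecorderConductor {
    let engine = AudioEngine()
    let mixer = Mixer()
    var recorder: NodeRecorder?
    var silencer: Fader?

    init() {
        guard let input = engine.input else { return }

        // NodeRecorder handles the tap and file writing internally,
        // so there is no manual installTap/AVAudioFile bookkeeping.
        recorder = try? NodeRecorder(node: input)

        // Silence the monitored input so it isn't played back live.
        let silencer = Fader(input, gain: 0)
        self.silencer = silencer
        mixer.addInput(silencer)
        engine.output = mixer
    }

    func startRecording() {
        do {
            try engine.start()
            try recorder?.record()
        } catch {
            print(error)
        }
    }

    func stopRecording() -> AVAudioFile? {
        recorder?.stop()
        engine.stop()
        return recorder?.audioFile
    }
}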

There is also an input selection recipe that might be helpful.
https://github.com/AudioKit/Cookbook/blob/main/Cookbook/CookbookCommon/Sources/CookbookCommon/Recipes/WIP/ChannelDeviceRouting.swift
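
Since the report involves a specific input (Lightning earphones), it may also help to log which input the session has picked before starting the engine. Here is a small sketch using plain AVAudioSession rather than the recipe's AudioKit routing code (preferring the built-in mic is just an example):

import AVFoundation

func logInputsAndPreferBuiltInMic() {
    let session = AVAudioSession.sharedInstance()

    // List every input the session currently offers.
    for input in session.availableInputs ?? [] {
        print("available input:", input.portType.rawValue, input.portName)
    }

    // Optionally prefer one of them, e.g. the built-in microphone.
    if let builtIn = session.availableInputs?.first(where: { $0.portType == .builtInMic }) {
        try? session.setPreferredInput(builtIn)
    }

    // A 0 Hz value here would line up with the -10851 / "sample rate is 0" errors above.
    print("session sample rate:", session.sampleRate)
}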

In short, I'm not sure what the issue is, but here are a couple of things to rule out:

  1. Update to the AudioKit main branch. There was an update to the taps in that version (moving an async task) that might help (a Package.swift sketch follows after this list).

  2. Weak symbols are having issues with Xcode 15's linker. It might help to add -Wl,-ld_classic to OTHER_LDFLAGS. https://developer.apple.com/documentation/xcode-release-notes/xcode-15-release-notes#Linking
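
A minimal sketch of both suggestions, assuming a SwiftPM-based project (the package name "MyApp" and the .xcconfig route are placeholders, not your actual setup):

// swift-tools-version:5.7
// Package.swift: track the AudioKit main branch instead of a tagged release.
import PackageDescription

let package = Package(
    name: "MyApp",
    dependencies: [
        // The companion packages (AudioKitEX, SoundpipeAudioKit) can be switched the same way if needed.
        .package(url: "https://github.com/AudioKit/AudioKit", branch: "main")
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: [
                .product(name: "AudioKit", package: "AudioKit")
            ]
        )
    ]
)

// For the Xcode 15 linker workaround, add the flag to Build Settings > Other Linker Flags,
// e.g. in an .xcconfig file:
//   OTHER_LDFLAGS = $(inherited) -Wl,-ld_classic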

Other than that, if you can put together an example project with steps to reproduce, it would help in debugging.

Also make sure you set the AVAudioSession category to .playAndRecord in your app delegate, like the Cookbook does:

do {
    Settings.bufferLength = .short
    try AVAudioSession.sharedInstance().setPreferredIOBufferDuration(Settings.bufferLength.duration)
    try AVAudioSession.sharedInstance().setCategory(.playAndRecord,
                                                    options: [.defaultToSpeaker, .mixWithOthers, .allowBluetoothA2DP])
    try AVAudioSession.sharedInstance().setActive(true)
} catch let err {
    print(err)
}