AudioKit / AudioKit

Audio synthesis, processing, & analysis platform for iOS, macOS and tvOS

Home Page: http://audiokit.io

Unable to use renderToFile to export the output of AudioPlayer

Samigos opened this issue · comments

macOS Version(s) Used to Build

macOS 13.5 Ventura

Xcode Version(s)

Xcode 15.0.1

Description

I've built a new view on top of the Cookbook project, trying to record from the mic and then play the recording through multiple, adjustable effects (delay, reverb, pitch shift, etc.). All of that works great!

Now I want to produce an audio file with the effects applied. I've been trying for days to export the output of the AudioEngine that has the AudioPlayer attached, without success. I asked and searched on Stack Overflow, but couldn't figure it out. I'm completely lost! I either get [general] AudioPlayer+Playback.swift:play(from:to:at:completionCallbackType:):24:🛑 Error: AudioPlayer's engine must be running before playback. (AudioPlayer+Playback.swift:play(from:to:at:completionCallbackType:):24) or nothing at all.

Any ideas?

import AudioKit
import AudioKitEX
import AudioKitUI
import AVFoundation
import SwiftUI
import SoundpipeAudioKit
import DunneAudioKit

class RecorderConductor2: ObservableObject {
    let recorderEngine = AudioEngine()
    
    let playerEngine = AudioEngine()
    let player = AudioPlayer()
    
    let playerEngine2 = AudioEngine()
    let player2 = AudioPlayer()
    
    var recorder: NodeRecorder?
    var silencer: Fader?
    
    @Published var pitchShifter: PitchShifter!
    @Published var reverb: Reverb!
    
    @Published var data = RecorderData() {
        didSet {
            if data.isRecording {
                do {
                    try recorder?.record()
                } catch let err {
                    print(err)
                }
            } else {
                recorder?.stop()
            }
            
            if data.isPlaying {
                if let file = recorder?.audioFile {
                    try? player.load(file: file)
                    player.play()
                }
            } else {
                player.stop()
            }
        }
    }
    
    func play(url: URL) {
        playerEngine2.output = player2
        try! playerEngine2.start()
        
        try! player2.load(file: try! .init(forReading: url))
        player2.play()
    }

    init() {
        guard let input = recorderEngine.input else {
            fatalError()
        }
        
        do {
            recorder = try NodeRecorder(node: input)
            player.isLooping = true
            silencer = Fader(input, gain: 0)
            recorderEngine.output = silencer
        } catch let err {
            fatalError("\(err)")
        }
        
        applyFiltersToPlayer()
    }
    
    private func applyFiltersToPlayer() {
        pitchShifter = PitchShifter(player)

        reverb = Reverb(pitchShifter)
        reverb.dryWetMix = 0 // Adjust as needed
        reverb.loadFactoryPreset(.largeChamber)

        playerEngine.output = pitchShifter
    }
}

struct RecorderView2: View {
    @StateObject var conductor = RecorderConductor2()
    
    var body: some View {
        VStack {
            Text(conductor.data.isRecording ? "STOP RECORDING" : "RECORD")
                .foregroundColor(.blue)
                .onTapGesture {
                    conductor.data.isRecording.toggle()
                }
                .padding()
            
            Text(conductor.data.isPlaying ? "STOP" : "PLAY")
                .foregroundColor(.blue)
                .onTapGesture {
                    conductor.data.isPlaying.toggle()
                }
                .padding()
            
            Button("Save") {
                let outputURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("tempAudioWithFFMPEGEffect\(UUID().uuidString)").appendingPathExtension("caf")

                let newAudioFile = try! AVAudioFile(forWriting: outputURL, settings: [
                    AVFormatIDKey: Int(kAudioFormatLinearPCM),
                    AVSampleRateKey: 44100,
                    AVNumberOfChannelsKey: 2,
                    AVLinearPCMIsBigEndianKey: 0,
                    AVLinearPCMBitDepthKey: 16,
                    AVLinearPCMIsNonInterleaved: 1,
                    AVLinearPCMIsFloatKey: 1,
                    AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
                ])

                try! conductor.playerEngine.renderToFile(newAudioFile, duration: conductor.player.duration) {
                    self.conductor.player.play()
                } progress: { progress in
                    print("progress: \(progress)")
                }

                conductor.play(url: outputURL)
            }
            
            EditFXView(pitchIntensity: $conductor.pitchShifter.shift,
                       reverbIntensity: $conductor.reverb.dryWetMix)
        }
        
        .padding()
        .cookbookNavBarTitle("Recorder")
        .onAppear {
            try? conductor.recorderEngine.start()
            try? conductor.playerEngine.start()
        }
        .onDisappear {
            conductor.recorderEngine.stop()
            conductor.playerEngine.stop()
        }
    }
}

Crash Logs, Screenshots or Other Attachments (if applicable)

No response

I just made it work by playing the audio and then pausing it (even if I pause it immediately). It doesn’t work if the player has never played, or if it is playing while I try to export.
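For reference, that workaround looks roughly like this (a fragment, not a complete program; player, playerEngine, and newAudioFile are the names from the code above):

```swift
// Prime the render graph: renderToFile fails if the player has never
// played, and also if it is actively playing, so play and immediately pause.
conductor.player.play()
conductor.player.pause()

try? conductor.playerEngine.renderToFile(newAudioFile,
                                         duration: conductor.player.duration) {
    // prerender closure: start the player for the offline render pass
    self.conductor.player.play()
}
```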

The short fix is to add

try? conductor.recorderEngine.start()
try? conductor.playerEngine.start()

before

conductor.play(url: outputURL)
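Put together, the tail of the Save handler would look something like this (a fragment using the names from the code above; renderToFile appears to stop the engines, so they need restarting before playback):

```swift
// After the offline render, restart the engines before playing back,
// otherwise AudioPlayer reports "engine must be running before playback".
try? conductor.recorderEngine.start()
try? conductor.playerEngine.start()

conductor.play(url: outputURL)
```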

You don't need three engines, and calling the play method immediately after renderToFile might lead to race conditions (I'm not really sure what is happening behind the scenes there). Here are some quick edits that got it working:


import AudioKit
import AudioKitEX
import AudioKitUI
import AVFoundation
import SwiftUI
import SoundpipeAudioKit
import DunneAudioKit

class RecorderConductor: ObservableObject {
    let recorderEngine = AudioEngine()
    let effectsEngine = AudioEngine()
    
    let player = AudioPlayer()
    let player2 = AudioPlayer()
    
    var recorder: NodeRecorder?
    var silencer: Fader?
    
    @Published var data = RecorderData() {
        didSet {
            if data.isRecording {
                do {
                    try recorder?.record()
                } catch let err {
                    print(err)
                }
            } else {
                recorder?.stop()
            }
            
            if data.isPlaying {
                if let file = recorder?.audioFile {
                    try? player.load(file: file)
                    player.play()
                }
            } else {
                player.stop()
                
            }
        }
    }
    
    func play(url: URL) {
        print("this url \(url)")
        // clean play
        try! player2.load(file: try! .init(forReading: url))
        player2.play()
    }

    init() {
        guard let input = recorderEngine.input else {
            fatalError()
        }
        
        do {
            recorder = try NodeRecorder(node: input)
            player.isLooping = true
            silencer = Fader(input, gain: 0)
            recorderEngine.output = Mixer(silencer!, player2)
            effectsEngine.output = Mixer(Reverb(player))
        } catch let err {
            fatalError("\(err)")
        }
        
    }
}

struct RecorderView: View {
    @StateObject var conductor = RecorderConductor()
    var recordingURL = "tempAudioWithFFMPEGEffect\(UUID().uuidString)"
    var body: some View {
        VStack {
            Text(conductor.data.isRecording ? "STOP RECORDING" : "RECORD")
                .foregroundColor(.blue)
                .onTapGesture {
                    conductor.data.isRecording.toggle()
                }
                .padding()
            
            Text(conductor.data.isPlaying ? "STOP" : "PLAY")
                .foregroundColor(.blue)
                .onTapGesture {
                    conductor.data.isPlaying.toggle()
                }
                .padding()
            Text("PLAY SAVED FILE")
                .foregroundColor(.blue)
                .onTapGesture {
                    let outputURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent(recordingURL).appendingPathExtension("caf")
                    try? conductor.effectsEngine.start()
                    conductor.play(url: outputURL)
                }
                .padding()
            
            Button("Save") {
                let outputURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent(recordingURL).appendingPathExtension("caf")
                let newAudioFile = try! AVAudioFile(forWriting: outputURL, settings: [
                    AVFormatIDKey: Int(kAudioFormatLinearPCM),
                    AVSampleRateKey: 44100,
                    AVNumberOfChannelsKey: 2,
                    AVLinearPCMIsBigEndianKey: 0,
                    AVLinearPCMBitDepthKey: 16,
                    AVLinearPCMIsNonInterleaved: 1,
                    AVLinearPCMIsFloatKey: 1,
                    AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
                ])
                
                try! conductor.effectsEngine.renderToFile(newAudioFile, duration: conductor.player.duration) {
                    self.conductor.player.play()
                } progress: { progress in
                    print("progress: \(progress)")
                }
                if let file = conductor.recorder?.audioFile {
                    self.conductor.player.file = file
                }
                try? conductor.effectsEngine.start()
            }
        }
        
        .padding()
        .cookbookNavBarTitle("Recorder")
        .onAppear {
            try? conductor.recorderEngine.start()
            try? conductor.effectsEngine.start()
        }
        .onDisappear {
            conductor.recorderEngine.stop()
            conductor.effectsEngine.stop()
        }
    }
}
struct RecorderData {
    var isRecording = false
    var isPlaying = false
}
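One detail worth noting in the code above: the output file name (recordingURL) is generated once per view, so the Save and PLAY SAVED FILE handlers resolve the same URL. A minimal Foundation-only sketch of that pattern (RenderTarget is an illustrative name, not part of AudioKit):

```swift
import Foundation

// Generate the file name once so every handler resolves the same URL.
struct RenderTarget {
    let name = "renderedAudio\(UUID().uuidString)"

    var url: URL {
        URL(fileURLWithPath: NSTemporaryDirectory())
            .appendingPathComponent(name)
            .appendingPathExtension("caf")
    }
}

let target = RenderTarget()
// Both call sites see the same path.
print(target.url == target.url)  // true
```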

Something sketchy happens when you try to save the file for a second time. The engine resets all nodes after the renderToFile.

I updated my code block to reload the audio file after the render method completes. It also uses a second engine, which could possibly be refactored down to one, but that was giving me issues. It has something to do with the way the AudioPlayer node is reconstructed after the engine is stopped.

Thank you @NickCulbertson and sorry for the delay!