AudioKit / AudioKit

Audio synthesis, processing, & analysis platform for iOS, macOS and tvOS

Home Page: http://audiokit.io

Using AudioKit from background thread - is it supported?

ufoq opened this issue

Description

The documentation seems to ignore this topic; there is no clear statement on whether it's safe to use AudioKit from a non-main thread. It seems to work, but that is neither guaranteed nor mentioned in the docs. The same is true for AVAudioEngine: the examples either don't mention threading at all or do everything on the main thread.
I don't mean thread safety, just using the audio engine from one dedicated audio thread (not the main thread), to prevent UI lags while the graph is being prepared, etc.
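
For clarity, a minimal sketch of the pattern I mean, using plain AVAudioEngine (the queue label is arbitrary; this is exactly the usage I'd like the docs to either bless or forbid):

```swift
import AVFoundation

let setupQueue = DispatchQueue(label: "audio.setup") // serial by default
let engine = AVAudioEngine()

// A silent source node; its render block runs on Core Audio's real-time
// thread regardless of which thread built the graph.
let source = AVAudioSourceNode { isSilence, _, _, audioBufferList in
    isSilence.pointee = true
    for buffer in UnsafeMutableAudioBufferListPointer(audioBufferList) {
        memset(buffer.mData, 0, Int(buffer.mDataByteSize))
    }
    return noErr
}

setupQueue.async {
    // All graph construction and startup happens off the main thread.
    engine.attach(source)
    engine.connect(source, to: engine.mainMixerNode, format: nil)
    do {
        try engine.start() // the potentially slow call
    } catch {
        print("Engine failed to start: \(error)")
    }
}
```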

Proposed Solution

Clarify in the docs whether it is safe to use AudioKit from another thread, or whether it is meant to be used only from the main thread.

Describe Alternatives You've Considered

So far it looks like it's safest to use AudioKit (and AVAudioEngine) from the main thread, but that can make the UI unresponsive.

Additional Context

In AVFoundation / Core Audio the topic is also ignored; only AUGraph had some thread-safety guarantees, and it's deprecated now.

I don't really know about the thread part, other than I've heard to keep it on the main thread.

@wtholliday might know more. In a couple of the AudioKit Office Hours he talked about thread safety.

As for the other part, here are a few things you could try to make your app more responsive:
• Set/increase your buffer length
• Configure AVAudioSession like the Cookbook does (see the sketch after this list): https://github.com/AudioKit/Cookbook/blob/main/Cookbook/Cookbook/CookbookApp.swift
• Enable background audio if applicable
• Use PCM buffers for playing back audio
• Run async tasks on the main thread
• Reduce repeated view initializations when using SwiftUI
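
For reference, the session setup in that Cookbook file looks roughly like this (AudioKit's `Settings.bufferLength` plus standard AVAudioSession calls; the exact category and options are the Cookbook's choices, not requirements):

```swift
import AudioKit
import AVFoundation

#if os(iOS)
do {
    // Smaller buffers lower latency; larger ones are cheaper on the CPU.
    Settings.bufferLength = .short
    try AVAudioSession.sharedInstance()
        .setPreferredIOBufferDuration(Settings.bufferLength.duration)
    try AVAudioSession.sharedInstance()
        .setCategory(.playAndRecord,
                     options: [.defaultToSpeaker, .mixWithOthers, .allowBluetoothA2DP])
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print("Audio session setup failed: \(error)")
}
#endif
```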

We've been successfully using AudioKit on a dedicated audio queue in multiple projects.
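
For concreteness, the shape of that setup is roughly the following (the class name, queue label, and `AudioPlayer` node are illustrative, not lifted from those projects):

```swift
import AudioKit
import Foundation

// Illustrative sketch: one serial queue owns every call into the engine,
// so access stays serialized without ever touching the main thread.
final class AudioBox {
    private let queue = DispatchQueue(label: "audio.engine") // serial
    private let engine = AudioEngine()
    private let player = AudioPlayer()

    func start() {
        queue.async {
            self.engine.output = self.player
            do {
                try self.engine.start()
            } catch {
                print("Engine start failed: \(error)")
            }
        }
    }

    func play(url: URL) {
        queue.async {
            do {
                try self.player.load(url: url)
                self.player.play()
            } catch {
                print("Load failed: \(error)")
            }
        }
    }
}
```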

Thread safety means accessing one resource from multiple threads; I don't need that, and usually if an API isn't documented as thread-safe, then it is not. What I want is to access audio objects in a serial way (from a dedicated thread/queue), just not from the main thread.

Even a super simple graph with just one AVAudioSourceNode and a .short buffer takes ~100 ms to prepare and play.
The things you mentioned may help minimize UI lags, but that code will still run on the main thread; even an async task on the main queue will make animations lose frames.
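
One way to reproduce that measurement (numbers vary by device, OS version, and session state, so treat the result as indicative only):

```swift
import AVFoundation

let engine = AVAudioEngine()
let source = AVAudioSourceNode { isSilence, _, _, audioBufferList in
    isSilence.pointee = true
    for buffer in UnsafeMutableAudioBufferListPointer(audioBufferList) {
        memset(buffer.mData, 0, Int(buffer.mDataByteSize))
    }
    return noErr
}
engine.attach(source)
engine.connect(source, to: engine.mainMixerNode, format: nil)

let started = CFAbsoluteTimeGetCurrent()
engine.prepare()
do {
    try engine.start()
    let ms = (CFAbsoluteTimeGetCurrent() - started) * 1000
    print("prepare + start took \(ms) ms") // ~100 ms in my tests
} catch {
    print("start failed: \(error)")
}
```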

Things that happen under the hood in AVAudioEngine (creating audio buffers, querying audio devices, etc.) take too long to run on the main thread, where you have 8-16 ms per frame to keep the UI smooth. It would be better to treat audio I/O just like any other I/O: we don't read big files on the UI thread, so we probably shouldn't launch long-running audio initialization tasks there either.
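
If the goal is exactly that (serial, off-main access), Swift concurrency can express it too; this is a hypothetical sketch (the `AudioService` actor is invented for illustration, not an AudioKit or AVFoundation API):

```swift
import AVFoundation

// An actor serializes all access to the engine, and its methods run off
// the main thread, so setup never blocks UI rendering.
actor AudioService {
    private let engine = AVAudioEngine()

    func prepareAndStart() throws {
        _ = engine.mainMixerNode // implicitly wires mixer -> output
        engine.prepare()         // slow work happens inside the actor
        try engine.start()
    }

    func stop() {
        engine.stop()
    }
}

// Usage from UI code: fire and forget, hop back to the main actor for UI.
// Task { try await audioService.prepareAndStart() }
```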

@jcavar I also haven't run into any problems while doing some testing, but there are no guarantees in the API headers / docs (and you don't know if it was designed to work this way), so it may work by accident and stop working in a future release or in some other case (data races, etc.). Other AVFoundation classes also have a history of not working correctly when called outside the main thread.

I know this uncertainty is mainly caused by the lack of info in the Core Audio / AVFoundation docs, but maybe the AudioKit team has more knowledge from other sources and has applied it somehow in AudioKit.