ryanheise / audio_service

Flutter plugin to play audio in the background while the screen is off.

iOS Support

hacker1024 opened this issue · comments

I'm creating this issue to keep track of an iOS implementation. I myself am an Android developer, and I'd love to see an iOS developer contribute.

To potential iOS contributors

This post describes the work to be done on the iOS side and what might be the best way to break it down.

Android and iOS differences

The architecture of this plugin was really dictated by the nature of Android's component system, so first let's try to understand why things must be the way they are if the plugin is to work on both Android and iOS. In Android, an app is made up of a set of individual components which may be dynamically spawned and destroyed over the lifetime of the app. An audio playing app needs an "Activity" component which is only allowed to run foreground code (i.e. the UI), and a "Service" component which is only allowed to run background code (i.e. playing audio). The idea is that the Activity could be swiped away by the user, i.e. destroyed and freed from memory, while the Service component is allowed to live on and play audio in the background. Flutter automatically sets up an isolate inside the activity to run your Dart code, but since this activity could be swiped away at any time, we can't put the audio playing code here. We therefore use a relatively new Flutter API to create our own isolate attached to the Service for running Dart code, and this is where the audio playing code needs to go. Flutter's API for background execution of Dart code is described in this Medium article.

iOS developers in the thread discussion below have pointed out that things fortunately may be a lot simpler on iOS, since there is no such separation of foreground and background code. This may mean that the iOS implementation of the plugin would not need to use Flutter's background execution API. However, to maintain compatibility with Android's behaviour, I would suggest that the iOS implementation should still spawn a plain old isolate for running the background audio code. This would ensure that call-by-value semantics are maintained on both platforms.

Architecture overview

This plugin sets up a Dart isolate for running audio playback code in a way that continues to run when the app is not in the foreground. The Dart code that runs in this isolate is called the "background task". The architecture allows one or more "clients" to connect to the background task to control audio playback and be notified of state changes to update the UI. One of those clients is the Flutter user interface, but other clients may include a Bluetooth headset or a smart watch. There are distinct APIs for the Flutter client side and for the background task:

  • The AudioService API is for the Flutter client and allows for starting, stopping and sending/receiving messages to/from the background isolate.
  • The AudioServiceBackground API is for the background isolate and it dictates a set of callbacks that must be implemented to handle messages from the main isolate, and also provides methods to notify state updates back to all clients.

For example, AudioService.play() sends a message to the background task which is handled by its onPlay callback. But the message is not sent directly from point A to point B. Rather, it must go through a platform channel to the native plugin so that the plugin has a chance to do all of the things that a native app must typically do before playing audio. On Android, this involves acquiring audio focus and calling various APIs to allow the app to keep running while the app is not in the foreground and while the screen is turned off (acquiring a wake lock and starting a service) as well as showing a notification to indicate that the app is playing audio in the background. Once all of these things are done, the plugin THEN passes the message on to the Dart background task via its onPlay callback. Regardless of which client the play command originated from, the plugin should handle the command in the same way by performing any required native rituals to prepare for audio playback and then finally pass control to onPlay.
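To make that flow concrete, here is a rough sketch of what the iOS side of such a handler might look like (illustrative only; the channel and helper names are assumptions, not actual plugin code):

// In the plugin's handleMethodCall:result: (sketch)
if ([@"play" isEqualToString:call.method]) {
    // 1. Perform the native rituals first (activate the audio session, etc.).
    [self activateAudioSession]; // hypothetical helper
    // 2. THEN pass the message on to the Dart background task's onPlay callback.
    [backgroundChannel invokeMethod:@"onPlay" arguments:nil];
    result(nil);
}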

Stage 0: Foundation

A good starting point would be to implement the method call that starts the background isolate, and to implement all other method calls either as direct pass-throughs from one isolate to the other, or even as no-ops. The only methods that absolutely need to work in this stage are:

  • start - tell the plugin to start the background isolate.
  • stopped - tell the plugin that the background isolate has completed.

With this much, the example app should start playing audio and stop once the audio runs out by itself, but it will not allow clients to control playback or notify them of state changes.
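One possible shape for the start handler would use the FlutterEngine API to host the background isolate (a sketch only; the engine label, Dart entrypoint name and registration call are hypothetical here):

// Spawn a headless engine to host the background isolate (sketch).
FlutterEngine *engine = [[FlutterEngine alloc] initWithName:@"audio_service_background" project:nil];
// The Dart entrypoint must be a top-level function annotated with @pragma('vm:entry-point').
[engine runWithEntrypoint:@"backgroundTaskEntrypoint"];
// Register plugins so the background isolate can use platform channels too.
[GeneratedPluginRegistrant registerWithRegistry:engine];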

Stage 1: Basic functionality

The basic functionality provides the ability to control playback with the basic play/pause operations (which can also be controlled with a headset button click), and also the ability to stop the background task on request. The relevant method calls to implement are: connect, disconnect, ready, isRunning, setState/onPlaybackStateChanged, stop/onStop, pause/onPause, play/onPlay, setMediaItem/onMediaChanged and click/onClick. I'll be happy to answer questions about any of these in the comments below.

There are also a set of method calls that allow jumping to different positions and tracks that don't require any special set up on the native side so all they do is forward the messages on to the background task. They are: seekTo/onSeekTo, skipToNext/onSkipToNext, skipToPrevious/onSkipToPrevious, fastForward/onFastForward and rewind/onRewind.

The plugin should also have a default case for custom actions (whose method names begin with "custom_"). These are also just forwarded directly to the background task without any special setup on the native side.
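A sketch of that default case (variable names assumed):

// Anything starting with "custom_" is forwarded untouched to the background task.
if ([call.method hasPrefix:@"custom_"]) {
    [backgroundChannel invokeMethod:call.method arguments:call.arguments];
    result(nil);
}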

Stage 2: Queue functionality

This functionality adds the ability to manipulate the playlist/queue and jump to an arbitrary media item. The relevant method calls are: addQueueItem/onAddQueueItem, addQueueItemAt/onAddQueueItemAt, removeQueueItem/onRemoveQueueItem, skipToQueueItem/onSkipToQueueItem and playFromMediaId/onPlayFromMediaId.

Stage 3: Browsing functionality

This adds the ability for clients to browse media items offered by the background task. The relevant methods are: setBrowseMediaParent, notifyChildrenChanged, onChildrenLoaded and onLoadChildren.

Addressing API differences between Android and iOS

When there are equivalent features on iOS and Android, the preference is to implement both sides with the same API. If things would be easier on the iOS side if the API were changed, please make the suggestion below! I would definitely prefer to change the API if it means that the Android and iOS APIs can be harmonised.

In cases where a purely iOS-specific feature is desirable, it can be added as long as it is named with an ios prefix.

Read the Android docs to see what the Stage 3 browsing methods should actually do: https://developer.android.com/guide/topics/media-apps/audio-app/building-a-mediabrowserservice

I started working on the iOS implementation, but I'm running into several issues.

The Flutter background isolate example does not run on iOS.

https://medium.com/flutter-io/executing-dart-in-the-background-with-flutter-plugins-and-geofencing-2b3e40a1a124

13:57:40.809 1 info flutter.tools Launching lib/main.dart on iPhone XS Max in debug mode...
13:57:49.494 2 info flutter.tools Running pod install...
13:57:50.791 3 info flutter.tools Running Xcode build...
13:57:51.973 4 info flutter.tools Xcode build done.                                            1.2s
13:57:52.914 5 info flutter.tools Failed to build iOS app
13:57:52.914 6 info flutter.tools Error output from Xcode build:
13:57:52.914 7 info flutter.tools ↳
13:57:52.914 8 info flutter.tools Could not build the application for the simulator.
13:57:52.914 9 info flutter.tools Error launching application on iPhone XS Max.
13:57:52.914 10 info flutter.tools     ** BUILD FAILED **
13:57:52.914 11 info flutter.tools 
13:57:52.914 12 info flutter.tools 
13:57:52.914 13 info flutter.tools Xcode's output:
13:57:52.914 14 info flutter.tools ↳
13:57:52.914 15 info flutter.tools     === BUILD TARGET geofencing OF PROJECT Pods WITH CONFIGURATION Debug ===
13:57:52.914 16 info flutter.tools     /Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.m:26:8: error: unknown type name 'FlutterPluginRegistrantCallback'
13:57:52.914 17 info flutter.tools     static FlutterPluginRegistrantCallback registerPlugins = nil;
13:57:52.914 18 info flutter.tools            ^
13:57:52.914 19 info flutter.tools     /Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.m:40:38: error: expected a type
13:57:52.914 20 info flutter.tools     + (void)setPluginRegistrantCallback:(FlutterPluginRegistrantCallback)callback {
13:57:52.914 21 info flutter.tools                                          ^
13:57:52.914 22 info flutter.tools     /Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.m:125:42: warning: 'center' is deprecated: first deprecated in iOS 7.0 - Please see CLCircularRegion [-Wdeprecated-declarations]
13:57:52.914 23 info flutter.tools       CLLocationCoordinate2D center = region.center;
13:57:52.914 24 info flutter.tools                                              ^
13:57:52.914 25 info flutter.tools     In module 'CoreLocation' imported from /Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.h:6:
13:57:52.914 26 info flutter.tools     /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator12.0.sdk/System/Library/Frameworks/CoreLocation.framework/Headers/CLRegion.h:78:56: note: 'center' has been explicitly marked deprecated here
13:57:52.914 27 info flutter.tools     @property (readonly, nonatomic) CLLocationCoordinate2D center API_DEPRECATED("Please see CLCircularRegion", ios(4.0, 7.0), macos(10.7, 10.10)) API_UNAVAILABLE(tvos);
13:57:52.914 28 info flutter.tools                                                            ^
13:57:52.914 29 info flutter.tools     /Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.m:142:20: warning: 'setAllowsBackgroundLocationUpdates:' is only available on iOS 9.0 or newer [-Wunguarded-availability]
13:57:52.914 30 info flutter.tools       _locationManager.allowsBackgroundLocationUpdates = YES;
13:57:52.914 31 info flutter.tools                        ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
13:57:52.914 32 info flutter.tools     In module 'CoreLocation' imported from /Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.h:6:
13:57:52.914 33 info flutter.tools     /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator12.0.sdk/System/Library/Frameworks/CoreLocation.framework/Headers/CLLocationManager.h:266:35: note: 'setAllowsBackgroundLocationUpdates:' has been explicitly marked partial here
13:57:52.914 34 info flutter.tools     @property(assign, nonatomic) BOOL allowsBackgroundLocationUpdates API_AVAILABLE(ios(9.0), watchos(4.0)) API_UNAVAILABLE(macos) API_UNAVAILABLE(tvos);
13:57:52.914 35 info flutter.tools                                       ^
13:57:52.914 36 info flutter.tools     /Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.m:142:20: note: enclose 'setAllowsBackgroundLocationUpdates:' in an @available check to silence this warning
13:57:52.914 37 info flutter.tools       _locationManager.allowsBackgroundLocationUpdates = YES;
13:57:52.914 38 info flutter.tools                        ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
13:57:52.914 39 info flutter.tools     /Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.m:220:74: warning: sending 'const NSString *__strong' to parameter of type 'NSString * _Nonnull' discards qualifiers [-Wincompatible-pointer-types-discards-qualifiers]
13:57:52.914 40 info flutter.tools       NSMutableDictionary *callbackDict = [_persistentState dictionaryForKey:key];
13:57:52.914 41 info flutter.tools                                                                              ^~~
13:57:52.914 42 info flutter.tools     In module 'Foundation' imported from /Users/alex/Documents/project/FlutterGeofencing/example/ios/Pods/Headers/Public/Flutter/Flutter/FlutterBinaryMessenger.h:8:
13:57:52.914 43 info flutter.tools     /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator12.0.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSUserDefaults.h:93:73: note: passing argument to parameter 'defaultName' here
13:57:52.914 44 info flutter.tools     - (nullable NSDictionary<NSString *, id> *)dictionaryForKey:(NSString *)defaultName;
13:57:52.914 45 info flutter.tools                                                                             ^
13:57:52.914 46 info flutter.tools     /Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.m:220:24: warning: incompatible pointer types initializing 'NSMutableDictionary *' with an expression of type 'NSDictionary<NSString *,id> * _Nullable' [-Wincompatible-pointer-types]
13:57:52.914 47 info flutter.tools       NSMutableDictionary *callbackDict = [_persistentState dictionaryForKey:key];
13:57:52.914 48 info flutter.tools                            ^              ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
13:57:52.914 49 info flutter.tools     /Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.m:222:18: warning: incompatible pointer types assigning to 'NSMutableDictionary *' from 'NSDictionary *' [-Wincompatible-pointer-types]
13:57:52.914 50 info flutter.tools         callbackDict = @{};
13:57:52.914 51 info flutter.tools                      ^ ~~~
13:57:52.914 52 info flutter.tools     /Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.m:223:53: warning: sending 'const NSString *__strong' to parameter of type 'NSString * _Nonnull' discards qualifiers [-Wincompatible-pointer-types-discards-qualifiers]
13:57:52.914 53 info flutter.tools         [_persistentState setObject:callbackDict forKey:key];
13:57:52.914 54 info flutter.tools                                                         ^~~
13:57:52.914 55 info flutter.tools     In module 'Foundation' imported from /Users/alex/Documents/project/FlutterGeofencing/example/ios/Pods/Headers/Public/Flutter/Flutter/FlutterBinaryMessenger.h:8:
13:57:52.914 56 info flutter.tools     /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator12.0.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSUserDefaults.h:82:57: note: passing argument to parameter 'defaultName' here
13:57:52.914 57 info flutter.tools     - (void)setObject:(nullable id)value forKey:(NSString *)defaultName;
13:57:52.914 58 info flutter.tools                                                             ^
13:57:52.914 59 info flutter.tools     /Users/alex/Documents/project/FlutterGeofencing/ios/Classes/GeofencingPlugin.m:231:46: warning: sending 'const NSString *__strong' to parameter of type 'NSString * _Nonnull' discards qualifiers [-Wincompatible-pointer-types-discards-qualifiers]
13:57:52.914 60 info flutter.tools       [_persistentState setObject:mapping forKey:key];
13:57:52.914 61 info flutter.tools                                                  ^~~
13:57:52.914 62 info flutter.tools     In module 'Foundation' imported from /Users/alex/Documents/project/FlutterGeofencing/example/ios/Pods/Headers/Public/Flutter/Flutter/FlutterBinaryMessenger.h:8:
13:57:52.914 63 info flutter.tools     /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator12.0.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSUserDefaults.h:82:57: note: passing argument to parameter 'defaultName' here
13:57:52.914 64 info flutter.tools     - (void)setObject:(nullable id)value forKey:(NSString *)defaultName;
13:57:52.914 65 info flutter.tools                                                             ^
13:57:52.914 66 info flutter.tools     7 warnings and 2 errors generated.

If anyone knows how the above can be fixed, please let me know. In the meantime, I will be working on a standalone audio_player_service plugin that does not use a background isolate, as it doesn't seem to be required for iOS.

Also, the audio_service project was created with an older iOS template, which was also broken. I created a new repo and moved over the Android code, which fixed it, so I recommend recreating this repo with the latest Flutter plugin template.

Hi @alexelisenko and first thank you so much for trying to get the iOS side working.

I unfortunately don't own a Mac to test this on, however I did do a Google search for the missing type name "FlutterPluginRegistrantCallback" and found that apparently this type was only committed to git very recently: flutter/engine@1ba3295#diff-dc093c279508589dcc672b1f75d69a11

So your two options would be:

  1. Check out an older version of the FlutterGeofencing repo; or
  2. Upgrade to the latest version of the Flutter engine.

It is probably better to do the latter since there's a chance you'll want to use the latest APIs.

First up, of course:

flutter upgrade

If the plugin still doesn't work, then you need to switch to a channel that is updated more regularly. First, you can try the beta channel which is what I personally use:

flutter channel beta

The beta channel is updated once a month, so I'd expect it should have this missing datatype since it was added last month. If it doesn't have the missing data type yet, then you can try:

flutter channel dev

You could then switch back to beta within a month after the missing datatype has had time to make its way into beta.

Regarding the project template, I agree and would love to update it, although would you like me to generate a Swift or Objective C template?

@ryanheise switching to the beta channel resolved the error, thanks.

I think you should go with Objective-C, since it reduces the complexity of setup for the background execution. The Swift template actually uses both Swift and Objective-C by adding bridge headers, which complicates the setup and the examples that are available for background execution.

I am currently fleshing out all the iOS requirements in a standalone Xcode project before integrating them into the plugin.

Waiting for this implementation 👍

@alexelisenko you've become a very popular person :-)

Regarding the iOS template, I'll try to update it tonight.

I watched an introductory video about Objective C recently, and while it's still a bit foreign to me, I might try my hand at filling in the basic plugin structure in a separate branch (it's bound to have syntax errors though since I don't have a Mac / Xcode to test it on). Swift looks a bit easier to me, but I'll trust you that the bridge headers add complexity.

After playing around with the Flutter background Dart execution, I abandoned that approach for iOS to meet my project deadlines. The good news is I do have a working audio service for iOS (for audio and video files), which I will share here soon (I have a few tweaks to make before making it public).

This does mean that I had to resort to using things like if(platform.isIOS){}, but this was honestly the fastest route for my project. Once I publish my version (which does not use background Dart code), anyone who needs this functionality can at least use it. I do hope to have time to merge my work with this plugin; I just didn't have enough time to fiddle with the background Dart implementation.

@ryanheise We could potentially create a wrapper interface for my iOS plugin, without adding the background Dart execution. One of the main reasons I wrote my own plugin for iOS was that the audioplayer plugin typically used with audio_service was not feature complete. The audio_player_service plugin I wrote is basically the service and player rolled into one; since iOS does not require much for background audio playback, the background Dart execution added a lot of code and moving parts that aren't needed.

I will post the link to the plugin here, hopefully in the next few days.

Here is the iOS plugin:
https://github.com/alexelisenko/audio_player_service

I would consider this to be in BETA. While it does work, it is lacking in documentation and thorough testing.

The example project does give you everything you need to use it, but it does not include examples of all the methods available, so some code reading will be required for some use cases.

Please keep in mind that I will be making updates to this repo, which may or may not be breaking changes.

EDIT: I still intend to integrate this into audio_service. It may need a different interface for iOS, or the same interface with some of the methods doing nothing. Once I'm done with my active project I will circle back to this, but for now, I need to get my project done :)

Excellent! I fully expect iOS to have a different, though partially overlapping, set of features, so it will be interesting to get an iOS developer's input to help shape the audio_service API.

Does your plugin handle button clicks on the headset to play/pause media playback, and if so, which part of the code is responsible for it? I'd be interested to know if this is just handled automatically, or whether iOS gives you the flexibility to handle these button clicks with your own callbacks. After a stack overflow search, I found this (https://stackoverflow.com/questions/9797562/iphone-headset-play-button-click) which gives me hope that it's possible.

iOS uses what's called the CommandCenter to display media metadata and provide controls.

The plugin supports the following:

togglePlayPauseCommand
playCommand
pauseCommand
stopCommand
nextTrackCommand
previousTrackCommand
changePlaybackPositionCommand (seeking from command center/ external devices)

The unimplemented commands available are:

[commandCenter.skipForwardCommand setEnabled:NO];
[commandCenter.skipBackwardCommand setEnabled:NO];
[commandCenter.enableLanguageOptionCommand setEnabled:NO];
[commandCenter.disableLanguageOptionCommand setEnabled:NO];
[commandCenter.changeRepeatModeCommand setEnabled:NO];
[commandCenter.seekForwardCommand setEnabled:NO];
[commandCenter.seekBackwardCommand setEnabled:NO];
[commandCenter.changeShuffleModeCommand setEnabled:NO];
        
// Rating Command
[commandCenter.ratingCommand setEnabled:NO];
        
// Feedback Commands
// These are generalized to three distinct actions. Your application can provide
// additional context about these actions with the localizedTitle property in
// MPFeedbackCommand.
[commandCenter.likeCommand setEnabled:NO];
[commandCenter.dislikeCommand setEnabled:NO];
[commandCenter.bookmarkCommand setEnabled:NO];

The above can be added very easily.

I have tested this with various Bluetooth headphones and over Bluetooth audio in several cars.

As far as what code is responsible, look at this file: https://github.com/alexelisenko/audio_player_service/blob/master/ios/Classes/AudioPlayer.m

The init method sets up the CommandCenter with the callbacks.
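Roughly, each registration looks like this (a simplified sketch; the real code is in the file linked above):

MPRemoteCommandCenter *commandCenter = [MPRemoteCommandCenter sharedCommandCenter];
commandCenter.playCommand.enabled = YES;
[commandCenter.playCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent *event) {
    // The plugin decides what the command means; iOS keeps no toggle state of its own.
    [self play]; // hypothetical plugin method
    return MPRemoteCommandHandlerStatusSuccess;
}];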

NOTE: The iOS simulator does not show the command center, so it has to be tested on a device, unfortunately.

So then when you click the headset button, does iOS itself maintain a play/pause toggle state and flip it on a click? That would be unfortunate, since what I'm really hoping for is to be able to just listen to button click events and allow the application to process them and manage its own state.

Each of the commands I listed requires a callback, which is implemented in the plugin. The state is managed by the plugin entirely, meaning it's the plugin's job to decide what to do with the command events.

@adriancmurray Yes, the plugin actually requires that you initialize a queue of items (which can be just one item). The plugin does not automatically play the next item in the queue, although this could be added easily. I'm currently playing the next item in the queue in the actual Flutter project using the plugin, by watching the playback state and running next as needed.

@adriancmurray There is one known bug with this though:

When supplying multiple items in the queue, any Bluetooth devices that display the queue position (i.e. 2/10 etc.) do not display the correct index. I'm still working on a fix.

This is related to not being able to easily go to the previous item, and having to reinit the queue, which screws up the metadata for external devices that rely on this data.

This does not affect devices that do not look for this data, or general playlist playback via the command center.

Ah, nice! So for example, on receiving a togglePlayPauseCommand, the application in theory has the power to decide based on runtime conditions to ignore the click without necessarily running the risk of falling out of sync with iOS's internal toggle state (because there is no iOS internal toggle state).

Well... It sounds like you have all of the features I would need for my own use case! 👍

@ryanheise Yes, if the callback does not actually play or pause the player, the command center will not flip the play/pause button state. The state is provided to the OS via the nowPlayingInfo property (on MPNowPlayingInfoCenter). This property is updated by the plugin whenever the plugin state changes.
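In code, updating it looks roughly like this (sketch; the actual values come from the plugin's state):

MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];
NSMutableDictionary *nowPlayingInfo = [NSMutableDictionary dictionary];
nowPlayingInfo[MPMediaItemPropertyTitle] = title;
nowPlayingInfo[MPMediaItemPropertyArtist] = artist;
nowPlayingInfo[MPMediaItemPropertyPlaybackDuration] = @(durationSeconds);
nowPlayingInfo[MPNowPlayingInfoPropertyElapsedPlaybackTime] = @(positionSeconds);
// A rate of 0.0 makes the OS show the paused state; 1.0 the playing state.
nowPlayingInfo[MPNowPlayingInfoPropertyPlaybackRate] = @(isPlaying ? 1.0 : 0.0);
infoCenter.nowPlayingInfo = nowPlayingInfo;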

I see, that is more or less the way it also works in Android, so it should fit nicely into the current plugin API.

I've just copied the latest iOS template across to audio_service and will look more in depth at your iOS code tomorrow.

@alexelisenko Interesting.
I'm actually going to publish my own audio player plugin soon, which is designed for queuing URLs. Perhaps, when I publish it, you could decouple your media player code and move it into there? (It's a really simple plugin, just all your usual media and queue management functions to implement.)

That way you're one step closer to the way this plugin works.

@hacker1024 What do you mean by queuing URLs?

@alexelisenko The plugin maintains a list of audio URLs to stream and will play through them. It preloads upcoming queue items, which is what I couldn't get any other plugin to do.

EDIT: The Android implementation uses ExoPlayer's ConcatentatingMediaSource.

@hacker1024 So if you pass it a list of URLs, it will download them, then load them from the local filesystem?

@alexelisenko Yes, but to RAM or a cache only. The downloads aren't made to be accessed.

Great Job Guys. Waiting...

For those waiting for background audio plugins... might I suggest just writing it yourself. I was hoping for a plugin to be made, and briefly used one mentioned earlier in the thread (as in, I ran my own code on it for about 5 hours until I realized it wasn't going to offer what I needed). My needs are rather specific, so I decided to code one myself in Swift. It's not that difficult, and I had never coded in Swift up until writing my own plugin. This allows you a whole lot more freedom in how you implement the native APIs (and there are many for audio). I tried to alter some of the existing plugins for iOS, but they all use Objective-C, which kind of feels like reading a book written in the 1800s when you're used to languages like Dart/JavaScript, so I used Swift. This also allows you to have a tighter coupling between your Dart code and the native APIs. Hope this helps. Thanks to everyone who has been working on audio plugins. Cheers!

I don't know how to write a flutter plugin. Can you send some links about starting to code a flutter plugin? Also may you share your background audio plugin?

A question for iOS developers: would it in fact be possible to implement the iOS side of this plugin without using the Flutter background execution API?

On Android, we must use the background execution API to create a Dart isolate that lives within a Service component, separate from the default Dart isolate that lives within the Activity component, because Android allows the Activity (UI) to be stopped and even destroyed and cleared from memory while the background Service lives on playing audio.

My understanding is that in iOS, an application isn't required to be segmented into independent foreground and background components like this, and that the background execution API would only be needed for cases where an external event outside the app needs to trigger Dart code in the app to run(*). So my hope is that we could simplify our efforts for the iOS side by just implementing it using standard techniques that most iOS Flutter plugin developers will already be familiar with. If this is possible, then "theoretically" we wouldn't even need to spawn a separate isolate for the background task, however I think it would still be a good idea to spawn one just to prevent users of this plugin from relying on shared memory on iOS only to find that their code does not work on Android.

(*) Perhaps one potential use case for audio_service could be if we want the play/pause button to start up the app if it's not already running and begin playing audio. audio_service doesn't support this use case yet anyway, and I also don't know if this is a capability of iOS (it is a capability of Android), but sounds like the sort of thing that could require the background execution API.

Thanks for the informative answer, @adriancmurray. I am certainly looking at this from the perspective of how audio_service works, so the goal would be to let users of this plugin write all of their own audio logic in Dart so that they are in complete control of whether they want to play music or play text to speech or even play synthesized audio. The audio_service plugin itself stays out of the business of actually playing the audio but will provide "everything else":

  • The ability to set what is displayed as the currently playing media
  • The ability to respond to inputs from peripherals such as the play/pause button on your headset, or audio controls in your car's audio system, or controls in the system notification or lock screen.
  • Any necessary rituals to allow for background audio (in Android, registering a wakelock, etc. but on iOS perhaps nothing?)

Perhaps then in the iOS implementation, we would not need to interface with AVPlayer, but we would need to interface with CommandCenter. Given these goals, do we still see any need for the background execution API for iOS?

Your idea of splitting the service abstraction and the player abstraction sounds good. In my case MediaPlayer is not enough and I have to use ExoPlayer, and the service abstraction works quite well. But on iOS, background execution is enabled via a checkbox in Xcode, and then you just operate with some static objects.

I'm not a mobile developer at all; I come from web dev. But I contributed an iOS implementation to https://github.com/thyagoluciano/flutter_radio. There is much less functionality there than in audio_service, but I investigated some cases like background audio. Note that you still need to update the command center (lock screen) manually: https://github.com/thyagoluciano/flutter_radio/blob/ff2bd1b85ae1064f4582ec1c1a93f5af3fcf6f11/ios/Classes/FlutterRadioPlugin.m#L312

I'm struggling with Objective-C syntax. Impressed by https://gitlab.com/exitlive/music-player, I want to refactor all my iOS code to Swift. That would also be a chance to split the implementation of the service and the player.

@adriancmurray @alexelisenko @hacker1024 @ryanheise

thank you for contributing to this task,

I just want to know: is someone working on this important task?

It would be great to get the ball rolling, even in a small way, but it will take someone with sufficient motivation and sufficient time to overcome the initial barrier to entry.

Supposing that we had a basic foundation in place with a minimal amount of iOS code required to get Stage 0 working, I think then it would be a lot easier for other iOS contributors to contribute a feature or two here and there, rather than one person doing it all.

So in terms of overcoming the initial barrier to entry, would any iOS developer be interested in working with me to get Stage 0 in place?

I'm interested in implementing Stage 0, but I'm not a mobile developer and am super busy for several months. Currently I can help only by sharing my iOS implementation of background audio in another plugin: https://github.com/thyagoluciano/flutter_radio/blob/ff2bd1b85ae1064f4582ec1c1a93f5af3fcf6f11/ios/Classes/FlutterRadioPlugin.m#L312

Thanks for that. Yes, I think there's now probably enough code shared by you and others above which would be really helpful to someone implementing this in the iOS side of the plugin.

But understandably we all have other priorities at the moment.

While I personally also have other priorities at the moment, and to make matters worse, I don't have a Mac or an iPhone, I do at least now have MacInCloud which brings me one step closer to being able to work on this.

I have two iPhones of different models so I could test

> While I personally also have other priorities at the moment, and to make matters worse, I don't have a Mac or an iPhone, I do at least now have MacInCloud which brings me one step closer to being able to work on this.

I don't either, but I've built a Hackintosh which can run the iOS Simulator. It's fairly straightforward. If you have a Linux computer, there's an easy way to create a VM: foxlet/macOS-Simple-KVM.

I've also managed to put together a fully functional iPhone SE with parts I found at my local E-Waste recycling bin - I recommend looking at one if you can.

Anyway, I'd be very happy to test things on iOS, since I really want to get my Pandora client working on iPhones.

Now that I've got a working Hackintosh, I'm going to learn iOS development and try and implement the iOS side.

@hacker1024 , cool! You might also want to check with @alexelisenko as he might have also at least done some initial sketching of an iOS plan. The key points are that the iOS implementation should not need to do any Flutter background execution. However, to maintain compatibility with the Android side which does need this, the iOS side can just directly use the Isolates API to start the background task within a new isolate. The plugin's "start" method can use the various iOS APIs to initialise things for audio, such as activating the AVAudioSession and setting callbacks on the MPRemoteCommandCenter.
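For the audio session part, the start method would presumably contain something like this (sketch):

AVAudioSession *session = [AVAudioSession sharedInstance];
NSError *error = nil;
// Declare that this app plays audible content, even in the background.
[session setCategory:AVAudioSessionCategoryPlayback error:&error];
[session setActive:YES error:&error];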

We should eventually document the separation of concerns between this plugin and other audio plugins that can interoperate with this one: this plugin is responsible for setting up the AVAudioSession on iOS and the MediaSession on Android. Other plugins which just play audio should not set these things themselves. Happily, this seems to be the current situation anyway with the other audio plugins I've checked, but it could help to make it clear where the boundary between responsibilities is.

@hacker1024 (and @alexelisenko?) I am now also at the stage where I need iOS support, and I would be happy to team up with you on this. I don't have full-time access to a Mac, but I should have sporadic access to one about 3 times a week, which is far better than nothing, and urgency will help push me to make something happen.

(I do have MacInCloud, but the affordable plan I'm on doesn't give me root access, which is needed to get over a Flutter bug that prevents me from seeing the Flutter logs and doing hot reload, rendering it useless for serious development.)

In case you haven't done any work on this yet, I'd be happy to start by laying some structural foundations, setting up an isolate for the iOS side. If you'd like to collaborate, I'll set up a Trello board with a plan.

@ryanheise I'd also be happy to help out on the iOS side if you need more hands/an extra mac. I have limited experience developing for iOS but keen to learn.

Thanks, @alexandergottlieb , for the offer to help! I'll try my hand at implementing Stage 0 over the coming days, and hopefully build a base that others such as yourself can then contribute on top of.

That's awesome! This requirement has come up recently for a project I'm working on, and in the next week or so I should also be available to work on this.
I have a Mac, but no iOS knowledge (yet).

Quick update: I've been hacking away at the iOS side for about a day and a half, and have got the isolates working, and am able to connect and start audio playback. I'm going to need some help with the trickier Objective-C code, so I plan to push a new branch with what I've done so far and see if some of us can work together.

🎁 🎁 🎁

I've just committed an initial iOS implementation to the iosdev branch!

If you think you can help out, please take a look at the TODO comments I've scattered throughout for an idea of what needs to be done. Now that there's actually a place to put the pieces of code, and given the many comments and links above to the pieces of code we need, hopefully this should actually be relatively straightforward.

If you'd like to tackle any of the TODO items, I'd suggest either commenting here or opening a new issue to announce your intention to work on it so that others know not to work on it, and then submit a pull request to the iosdev branch. I'll merge this onto master once enough of the core features are working.

Thanks! I've opened a pull request for initialising AVAudioSession and MPRemoteCommandCenter. Just muddling through docs and tutorials so I hope it's along the right lines!

Great, thanks for the contribution! I've left some comments on the pull request. One of the links from above which should be helpful in relation to using these APIs is Alexander's project: https://github.com/alexelisenko/audio_player_service .

@alexandergottlieb your pull request didn't compile for me, but I've fixed the errors, and also commented out one line which was causing a runtime error. I'm not sure why, but this line:

[commandCenter.changePlaybackPositionCommand addTarget:self action:@selector(changePlaybackPosition)]

caused this error:

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '+[NSInvocation _invocationWithMethodSignature:frame:]: method signature argument cannot be nil'

(edit: I've now fixed the bug)

That said, I'm sadly unable to test much of this on the simulator. Does anyone know how I can access the control center on the iPhone simulator?

@ryanheise support for the control center got dropped with the iOS 11 simulator. I was able to work on & test @alexelisenko's audio_player_service by downloading iOS 10 and running the simulator with that version.

Thanks, @DGempler !

I have just committed some code to update the nowPlayingInfo.

I should also say, this code doesn't actually work ;-) But, it's progress, at least.

I have one more day with this MacBook for the week, and can hopefully make something happen before then, but I would also appreciate some eyes on the code, particularly if you know what I'm missing that's preventing the Control Center from displaying anything.

Committed a fix for the nowPlayingInfo not showing. It turns out that I was calling it too early, and it has no effect unless audio is actually playing.

The iOS implementation is now more or less working in the iosdev branch. Thanks to @alexandergottlieb for contributing initialisation code and many others who provided helpful links to relevant iOS code, in particular @alexelisenko !

Note: If you want to use a flutter channel more recent than stable, you need to fix a bug in flutter_isolate (See pull request rmawatson/flutter_isolate#27). It may be a better option to not have this dependency on flutter_isolate and just use the FlutterEngine API directly.

However, for now, this is all I am able to do, as the time has come to return the borrowed MacBook Pro.

I have implemented enough of what I needed for my own use cases, but it is still missing things like rendering album art and queue management, and there are bound to be some quirks and bugs.

Going forward, I plan to mainly accept pull requests for any remaining features, and I will leave the iOS code in the iosdev branch until the above issue with flutter_isolate is resolved.

That's great news! I can probably have a crack at album art and queue management over the next couple of weeks.

Great! Thank you and look forward to that.

In other news, I contacted the author of flutter_isolate and he added me as a collaborator and uploader on his project so that I could accept and publish the pull request. Therefore, the iosdev branch should now work on the flutter beta channel.

I no longer have the MacBook, but I was able to compile and run things in my MacInCloud account and can at least confirm that there is no longer any error under beta. However, I am unable to hear the audio in that environment, so I would appreciate it if anyone could test the iosdev branch (on Flutter beta) on their iOS device or simulator.

If it works, I'll merge the branch to master and publish, even with the quirks and missing features, and hopefully that will bring in more bug reports and accelerate progress.

@ryanheise I just tested the latest commit on an iPhone 5s. Playing/pausing via control centre is working.

With the AudioPlayer example, the audio pauses when the app is backgrounded, and resumes when the app resumes.

TextToSpeech is the same, except the sound stops working after the app resumes.

Video: https://youtu.be/ewQFY9po-9s

Hmm, I can't test this anymore, but when I tested it on the simulator, it seemed to work fine in the background. Did it also pause with the screen off? Do you see any play/pause logs via NSLog?

I tried running on the Simulator (iPhone 11, iOS 13.2.2) and got an exception (trace below). It seems the command center actions cannot be void callbacks, so I've updated those to match the right signature (for now, always returning the success status). PR: #102

Exception trace:
2019-11-19 08:07:40.280938+0100 Runner[56811:447294] *** Assertion failure in -[MPRemoteCommand addTarget:action:], /BuildRoot/Library/Caches/com.apple.xbs/Sources/MediaPlayer_Sim/MobileMusicPlayer-4017.200.33/SDK/MPRemoteCommand.m:134
2019-11-19 08:07:40.296804+0100 Runner[56811:447294] *** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Unsupported action method signature. Must return MPRemoteCommandHandlerStatus or take a completion handler as the second argument.'
*** First throw call stack:
(
	0   CoreFoundation                      0x00007fff23c4f02e __exceptionPreprocess + 350
	1   libobjc.A.dylib                     0x00007fff50b97b20 objc_exception_throw + 48
	2   CoreFoundation                      0x00007fff23c4eda8 +[NSException raise:format:arguments:] + 88
	3   Foundation                          0x00007fff256c9b61 -[NSAssertionHandler handleFailureInMethod:object:file:lineNumber:description:] + 191
	4   MediaPlayer                         0x00007fff279daf40 -[MPRemoteCommand addTarget:action:] + 1154
	5   audio_service                       0x0000000105c6c5c5 -[AudioServicePlugin handleMethodCall:result:] + 1493
	6   Flutter                             0x0000000103c894fd __45-[FlutterMethodChannel setMethodCallHandler:]_block_invoke + 104
	7   Flutter                             0x0000000103c22ec0 _ZNK7flutter21PlatformMessageRouter21HandlePlatformMessageEN3fml6RefPtrINS_15PlatformMessageEEE + 166
	8   Flutter                             0x0000000103c26780 _ZN7flutter15PlatformViewIOS21HandlePlatformMessageEN3fml6RefPtrINS_15PlatformMessageEEE + 38
	9   Flutter                             0x0000000103c83db3 _ZNSt3__110__function6__funcIZN7flutter5Shell29OnEngineHandlePlatformMessageEN3fml6RefPtrINS2_15PlatformMessageEEEE4$_31NS_9allocatorIS8_EEFvvEEclEv + 57
	10  Flutter                             0x0000000103c353f1 _ZN3fml15MessageLoopImpl10FlushTasksENS_9FlushTypeE + 123
	11  Flutter                             0x0000000103c3a742 _ZN3fml17MessageLoopDarwin11OnTimerFireEP16__CFRunLoopTimerPS0_ + 26
	12  CoreFoundation                      0x00007fff23bb2944 __CFRUNLOOP_IS_CALLING_OUT_TO_A_TIMER_CALLBACK_FUNCTION__ + 20
	13  CoreFoundation                      0x00007fff23bb2632 __CFRunLoopDoTimer + 1026
	14  CoreFoundation                      0x00007fff23bb1c8a __CFRunLoopDoTimers + 266
	15  CoreFoundation                      0x00007fff23bac9fe __CFRunLoopRun + 2238
	16  CoreFoundation                      0x00007fff23babe16 CFRunLoopRunSpecific + 438
	17  GraphicsServices                    0x00007fff38438bb0 GSEventRunModal + 65
	18  UIKitCore                           0x00007fff4784fb48 UIApplicationMain + 1621
	19  Runner                              0x00000001038c2db8 main + 72
	20  libdyld.dylib                       0x00007fff51a1dc25 start + 1
	21  ???                                 0x0000000000000001 0x0 + 1
)
libc++abi.dylib: terminating with uncaught exception of type NSException
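For reference, the target-action form has to look something like this (selector and method body are illustrative):

[commandCenter.playCommand addTarget:self action:@selector(play:)];

// The action must return MPRemoteCommandHandlerStatus
// (or take a completion handler as the second argument).
- (MPRemoteCommandHandlerStatus)play:(MPRemoteCommandEvent *)event {
    // ... trigger playback here ...
    return MPRemoteCommandHandlerStatusSuccess;
}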

With that it does work properly (plays in the background and with the screen off) in the simulator 🎉

... But not on a physical device. Testing on a 5s (12.4.2) and 6s Plus (13.1.3), the audio stops when the app is backgrounded/screen is switched off. Nothing printed to the console.

Thanks for that, it seems to work on the MacInCloud simulator, too. This is a good start, I think.

On a physical device, after starting the audio player and then pressing the home button to put the app into the background, and after as you reported the audio stops by itself, is the play/pause toggle button still available in the control center? And are you able to click that button while the app is in the background and hear the audio resume?

The media controls go back to default after pressing the home button, and tapping play does nothing.

I see. I am still tempted to merge the iOS branch as is, since having it not crash on iOS and work in foreground mode is itself an improvement. I'll start writing some documentation for iOS.

Next week I may be able to get my hands on a Mac and iPhone for some further testing so hopefully that will also help in cracking this mystery.

Hey @ryanheise, if you would like any help with testing I am all in. I have an iPhone 7 and a Mac. If you want me to test anything specific, give me a heads up.

Thanks, @snaeji , much appreciated. Right now, the priority is to figure out why iOS is not allowing the example app to continue running in the background. Can you also test audio_player_service linked above and see if you can spot the difference between the two projects as to what audio_service is missing that would enable background execution on iOS?

Hi, I'm just super happy that this is about to land.
Maybe this can help in keeping audio alive in the background? https://www.reddit.com/r/FlutterDev/comments/dvweng/request_for_community_help_flutter_music_player/f7vli3n?utm_source=share&utm_medium=web2x

@bardiarastin that's it, nice find!

I added the below to Info.plist and it's working perfectly on device. Video

<key>UIBackgroundModes</key>
<array>
	<string>audio</string>
</array>

PR: #103

Hi @bardiarastin , @alexandergottlieb , I was SURE I had put those exact 4 lines into the Info.plist file, and just checking now, indeed I did:

992c451

So I'm not sure where those 4 lines went???

Ahah, I accidentally clobbered it when I regenerated the iOS project to make it Swift compatible! I've accepted your pull request.

I've almost finished writing up the iOS documentation.

By the way, @alexandergottlieb , I'm writing up a list of features and which ones work and don't work for Android and iOS. Do you see any sort of media controls on the lock screen on iOS that allow you to play/pause, etc, and do they work with the plugin?

Yes, looking good. Play/pause works as expected on the lock screen, and we get "received onPause over method channel" printed to the console.

It's running great on an iPhone 7 with iOS 13.2.3! I found one bug, not related to the iOS side I think, so I'll make an issue for it.

Version 0.5.0 has landed with iOS support!!! Thanks everyone who contributed. And thanks to everyone else for your patience :-) Of course there is still a lot more work to be done, but at least we now have a foundation to build on.

BTW, I'm sure 0.5.1 will be the version that actually works, but if there are any issues (e.g. if something happened in the merge), I'll find out tomorrow morning after I wake up ;-)

@ryanheise awesome, I'm going to use this in a music app on both Android and iOS. Our app is going to have a lot of users; I'll let you know if I face any issues.

Hi all, I've just published version 0.5.3, which adds a lot of missing iOS features, including album art, queue management, and updating nowPlayingInfo correctly from all states, as well as fixing bugs in various pass-throughs that weren't working, including onClick. However, I'm not actually able to test onClick without an actual iPhone and headphones with a play/pause button (which I assume should be routed through to onClick).

One other core feature that's not implemented yet is the callback to handle when the user attempts to seek to a position from the control center.
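If anyone would like to pick that up, it presumably hooks into changePlaybackPositionCommand, roughly like this (untested sketch; the forwarding call is hypothetical):

[commandCenter.changePlaybackPositionCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent *event) {
    MPChangePlaybackPositionCommandEvent *seekEvent = (MPChangePlaybackPositionCommandEvent *)event;
    // positionTime is in seconds; forward it to the background task's onSeekTo.
    [backgroundChannel invokeMethod:@"onSeekTo" arguments:@[@((long long)(seekEvent.positionTime * 1000))]];
    return MPRemoteCommandHandlerStatusSuccess;
}];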

Also, in case anyone is interested, I recently published a new audio player plugin called just_audio with iOS and Android support. You can read more about why I created it in #109 but basically:

> I am considering building my own audio player plugin that is guaranteed to maintain compatibility with audio_service while also making it possible to build all the ultimate features we want in an audio player. So if you tell me what features would go into your ideal audio player, I may see what I can do.

@ryanheise, I just finished reading through the API docs, and I am very excited to switch to this package.
There's a bunch of things I love about your API surface.
Two things in particular I like:

  1. There's no play-from-source method. First, set the source. Then, play. It makes keeping track of state MUCH simpler.
  2. The set-source methods return (a future resolving to) the audio duration! Yes!

@yringler I'm glad you agree about the API, and thanks for posting the first feature request over on the new project page.

macOS support for Flutter has been announced. How similar is it to iOS?

@hacker1024 I haven't taken a look at the Flutter plugin architecture for macOS yet, but macOS and iOS share a lot of the foundation framework libraries, so I would expect a lot of a plugin's platform code to be reusable.

Along with all the other Flutter news, I was excited to see this: https://marketplace.visualstudio.com/items?itemName=codemagic.remote-mac

> @hacker1024 I haven't taken a look at the Flutter plugin architecture for macOS yet, but macOS and iOS share a lot of the foundation framework libraries, so I would expect a lot of a plugin's platform code to be reusable.
>
> Along with all the other Flutter news, I was excited to see this: https://marketplace.visualstudio.com/items?itemName=codemagic.remote-mac

That's good to hear.

Have you seen this? It could probably even run in the WSL 2.x on Windows.
https://github.com/foxlet/macOS-Simple-KVM

There's also Darling, which can run macOS compiler tools on Linux/the WSL.
https://github.com/darlinghq/darling

> Also, in case anyone is interested, I recently published a new audio player plugin called just_audio with iOS and Android support.

@ryanheise this looks nice, can it preload URLs further down in the queue so there's no wait when playing the next song?

@hacker1024 You can preload the next song by instantiating another instance of the audio player and calling setUrl in advance of when you need it to play.

I've heard of the KVM solution and it definitely would help me since what I need is a way to run and debug my code. However, I'm not sure if that would meet Apple's Terms of Service.

> However, I'm not sure if that would meet Apple's Terms of Service.

Probably not, but they deliberately turn a blind eye to this kind of thing - they're not going to sue you. In fact, certain Hackintosh kexts are actually whitelisted in macOS itself, so Apple helps it happen.

Hi, it would be pretty good if, via your plugin, we could configure hiding/showing NextTrack, PreviousTrack and SeekToPlaybackPosition in the iOS notification.

@ryanheise Hi, thanks for all the awesome work you're putting into this. I wanted to know how things are going with the remaining iOS work?

Thank you, @bardiarastin

There are some other items on my priority list right now, such as Android v2 support, and also web support for just_audio, although I do also welcome pull requests, especially on the iOS side, to help the project move forward more swiftly (no pun intended!).

Using the example on a physical iPhone with iOS 13, I don't see the command center. Is the example old and not updated to use the command center?

I don't have a physical iPhone, but my understanding was that the command center is always accessible and there is nothing my plugin can do to make it inaccessible. Please correct me if this is not the case, as I don't have a physical device to check. Alternatively, do you mean you can see the command center, but you just don't see any media displayed on it when you call AudioServiceBackground.setMediaItem?

The command center is there but it doesn’t show the current playing media. It is just blank and interactions with the buttons don’t do anything.

I am using the Example app from this code base. I have not altered it in any way. Just installing it on my phone.

I do see from previous comments that the command center does indeed work.

iOS does not show the music control bar. How can I fix this, please?
Sorry for my bad English :v

@ryanheise after some checking of old commits the iOS Command Center broke at commit 7578d7745ee7546302f5f7317575be9740279518

The change that broke it is changing flutter_tts: ^0.7.0 to flutter_tts: ^0.8.5. Reverting to the old version fixes the Example App.

I tested on a Physical iPhone 11 Pro Max iOS 13.3.1 with Flutter:

[✓] Flutter (Channel stable, v1.12.13+hotfix.8, on Mac OS X 10.15.1 19B88, locale en-US)
[✓] Android toolchain - develop for Android devices (Android SDK version 28.0.3)
[✓] Xcode - develop for iOS and macOS (Xcode 11.3.1)
[✓] Android Studio (version 3.5)
[✓] VS Code (version 1.42.0)

@YaarPatandarAA , that's certainly interesting, thanks for isolating the specific plugin and version.

I don't see anything obviously problematic with flutter_tts 0.8.5's code, though. Perhaps it is related to its build/configuration files.

Note: This particular issue is now being tracked in #202 .

I am closing this "iOS Support" issue finally, now that it has for the most part served its original purpose. From now on I would encourage people to open new issues for specific iOS features or bugs. It almost feels sad to close it after such a long journey, and with all of the suggestions and contributions from you all. It was a true community effort. Thank you!

Not sure if this counts as Apple CarPlay support but here is this package working on Apple CarPlay. The shown buttons also work, this screen is under Now Playing app on Apple CarPlay. I know there is way more that can be done with Apple CarPlay than just this, but it's a start. 👍🏻

@YaarPatandarAA can you tell me what this package is? It's just bold not hyperlinked 😄

> @YaarPatandarAA can you tell me what this package is? It's just bold not hyperlinked 😄

"This package" refers to the package we are commenting on, in this package's issue.
https://pub.dev/packages/audio_service

@YaarPatandarAA Thanks for clearing that up! I was reading it in a completely different way 👍

Ah, that makes perfect sense :-) I also had the same misunderstanding as @snaeji

This plugin might now actually have better support for Apple CarPlay than for Android Auto, which is an interesting twist.

Hey there. I am working on a project which requires bookmarking functionality for both iOS and Android. I am wondering, has anyone managed to add "bookmarkCommand" for iOS?
I would be very thankful to anyone who has done that and can share their experience.

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs, or use StackOverflow if you need help with audio_service.