# Tutorial
This tutorial walks you through the steps to create a Flutter app that uses audio_service to play audio in the background. You should already have followed the project setup instructions in the audio_service one-isolate README.
This plugin encapsulates all of your audio code in an object called an audio handler. This object handles requests to perform media actions such as `play`, `pause` and `seek`. These requests are handled in a uniform way from potentially multiple clients such as the Flutter UI, the notification, a smart watch, a headset, or Android Auto. When your Flutter UI is absent, either in the background or with the screen off, your audio handler may still continue to respond to these other clients.
You define an audio handler by creating a subclass of `BaseAudioHandler` and overriding the methods for the media actions that your app wants to handle. Some common methods to override include:
- `play`
- `pause`
- `seek`
- `skipToNext`
- `skipToPrevious`
- `stop`
Your implementations of these methods decide what audio your app will play, whether that is music, a podcast, a text-to-speech reading of an audio book, or even voice instructions from a fitness trainer app.
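As a rough skeleton (a sketch only, with placeholder bodies standing in for your own audio logic), such a subclass might look like this:

```dart
import 'package:audio_service/audio_service.dart';

class MyAudioHandler extends BaseAudioHandler {
  @override
  Future<void> play() async {/* start or resume playback */}

  @override
  Future<void> pause() async {/* pause playback */}

  @override
  Future<void> seek(Duration position) async {/* jump to the given position */}

  @override
  Future<void> skipToNext() async {/* advance to the next item */}

  @override
  Future<void> skipToPrevious() async {/* return to the previous item */}

  @override
  Future<void> stop() async {/* stop playback and release resources */}
}
```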
Your audio handler is also responsible for broadcasting state changes to its clients so that they can update their UI accordingly. For example, your audio handler is responsible for broadcasting whether playback is currently in the `playing` or `paused` state, whether audio is currently buffering, and metadata (title, artist, duration) about the currently playing media item. The state information your audio handler broadcasts will be used by clients such as the notification and lock screen to display correct metadata and playback state to the user, as well as by your own Flutter UI within the app. Your audio handler broadcasts state changes via streams. The most common streams for broadcasting state are:
- `playbackState`
- `queue`
- `mediaItem`
You add an event to a stream (i.e. broadcast an event) using code such as `mediaItem.add(myCurrentMediaItem)`, where `myCurrentMediaItem` is an object containing metadata about the current item being played (title, artist, etc.). You should broadcast to `mediaItem` whenever the currently playing item changes, to `queue` whenever the playlist changes, and to `playbackState` when the state of playback changes (e.g. `playing` vs `paused`). Broadcasting this information allows, for example, the notification to display the correct metadata and the correct state for its buttons.
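As a quick sketch (the `MediaItem` values here are hypothetical, for illustration only), broadcasting to each of these streams from inside your handler looks like this:

```dart
// Hypothetical metadata for illustration only.
final item = MediaItem(id: 'https://exampledomain.com/song.mp3', title: 'Song 1');

mediaItem.add(item); // the current item changed
queue.add([item]);   // the playlist changed
playbackState.add(playbackState.value.copyWith(
  playing: true,     // the playback state changed
));
```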
Let's start with a simple app that plays a single mp3 file with a button to play and another button to pause.
During your app's initialisation, you initialise audio_service with your audio handler:
```dart
import 'package:audio_service/audio_service.dart';
import 'package:flutter/material.dart';

late AudioHandler _audioHandler; // singleton.

Future<void> main() async {
  _audioHandler = await AudioService.init(
    builder: () => AudioPlayerHandler(),
    config: AudioServiceConfig(
      androidNotificationChannelId: 'com.mycompany.myapp.channel.audio',
      androidNotificationChannelName: 'Audio playback',
      androidNotificationOngoing: true,
    ),
  );
  runApp(MyApp());
}
```
This initialisation code provides your audio handler class, which will be `AudioPlayerHandler`, and the resulting instantiated handler object is stored in a global singleton called `_audioHandler` (although in your app you may instead prefer to use a service locator or dependency injection).
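For example, a service-locator approach (a sketch only; this tutorial itself does not use the get_it package) might look like this:

```dart
import 'package:audio_service/audio_service.dart';
import 'package:flutter/material.dart';
import 'package:get_it/get_it.dart';

Future<void> main() async {
  final audioHandler = await AudioService.init(
    builder: () => AudioPlayerHandler(),
    config: AudioServiceConfig(
      androidNotificationChannelId: 'com.mycompany.myapp.channel.audio',
      androidNotificationChannelName: 'Audio playback',
      androidNotificationOngoing: true,
    ),
  );
  // Register the handler with the locator; look it up anywhere with
  // GetIt.instance<AudioHandler>() instead of a global variable.
  GetIt.instance.registerSingleton<AudioHandler>(audioHandler);
  runApp(MyApp());
}
```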
Your audio handler class is defined as a subclass of `BaseAudioHandler`:
```dart
import 'package:audio_service/audio_service.dart';
import 'package:just_audio/just_audio.dart';

class AudioPlayerHandler extends BaseAudioHandler {
  final _player = AudioPlayer();

  AudioPlayerHandler() {
    _player.setUrl("https://exampledomain.com/song.mp3");
  }

  @override
  Future<void> play() => _player.play();

  @override
  Future<void> pause() => _player.pause();
}
```
This simple handler "handles" only the `play` and `pause` media actions, and it does so by asking the just_audio `_player` object to play/pause the audio it has loaded.
With the audio logic out of the way, let's now define our user interface with two buttons for playing and pausing:
```dart
class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(title: 'Example', home: MainScreen());
  }
}

class MainScreen extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: Text("Example")),
      body: Center(
        child: Column(
          mainAxisAlignment: MainAxisAlignment.center,
          children: [
            ElevatedButton(child: Text("Play"), onPressed: _audioHandler.play),
            ElevatedButton(child: Text("Pause"), onPressed: _audioHandler.pause),
          ],
        ),
      ),
    );
  }
}
```
You are free to implement the audio handler methods in any way that is appropriate for your app. For example, the following alternative audio handler will use text-to-speech to speak the numbers 1 to 10:
```dart
import 'package:audio_service/audio_service.dart';
import 'package:flutter_tts/flutter_tts.dart';

class AudioPlayerHandler extends BaseAudioHandler {
  final _tts = FlutterTts();
  bool _finished = false;

  @override
  Future<void> play() async {
    // Speak the numbers 1 to 10, one per second, until stopped.
    for (var n = 1; !_finished && n <= 10; n++) {
      _tts.speak("$n");
      await Future.delayed(Duration(seconds: 1));
    }
  }

  @override
  Future<void> stop() async {
    // Stop speaking the numbers.
    _finished = true;
  }
}
```
Note: `stop` is another media action that stops audio playback completely so that it can't be resumed from the current position.
That's it!
There's one problem with the previous example: the two buttons are always visible. Ideally, the visible buttons should reflect the current state of the audio handler. Let's modify the example to manage state changes.
In the following code, we will update the audio handler to broadcast three different kinds of state:

- `playing` (true or false)
- `processingState` (e.g. loading, ready to play, or idle)
- `controls` (the set of controls visible in the iOS control center and Android notification)
Let's update the audio handler to broadcast appropriate state changes. First, in the constructor, let's broadcast that we're loading the audio:
```dart
class AudioPlayerHandler extends BaseAudioHandler {
  final _player = AudioPlayer();

  AudioPlayerHandler() {
    // Broadcast that we're loading, and what controls are available.
    playbackState.add(playbackState.value.copyWith(
      controls: [MediaControl.play],
      processingState: AudioProcessingState.loading,
    ));
    // Connect to the URL.
    _player.setUrl("https://exampledomain.com/song.mp3").then((_) {
      // Broadcast that we've finished loading.
      playbackState.add(playbackState.value.copyWith(
        processingState: AudioProcessingState.ready,
      ));
    });
  }
```
Note that for the parameter to `playbackState.add()` we use the `playbackState.value.copyWith()` method to create a copy of the current state, specifying only the parts of the state that have changed. At the beginning, the `processingState` will pass through a `loading` state before arriving at `ready`-to-play. We also broadcast that we initially want just a `MediaControl.play` button to be available in the user interface. This request will be respected by the system notification, and we will later update our own Flutter UI to do something similar.
Next, in the `play` and `pause` methods, let's broadcast that the `playing` state has changed. When we're playing, we want only the pause button to be available, and while we're paused, we want only the play button to be available:
```dart
  @override
  Future<void> play() async {
    playbackState.add(playbackState.value.copyWith(
      playing: true,
      controls: [MediaControl.pause],
    ));
    await _player.play();
  }

  @override
  Future<void> pause() async {
    playbackState.add(playbackState.value.copyWith(
      playing: false,
      controls: [MediaControl.play],
    ));
    await _player.pause();
  }
}
```
All clients will now respect these state changes, except for the Flutter UI itself, for which we'll need to write our own code to make that happen. To do that, we will listen to the `playbackState` stream. Flutter provides an easy way to make a widget responsive to the changing values of a stream using `StreamBuilder`:
```dart
class MainScreen extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: Text("Example")),
      body: Center(
        child: StreamBuilder<PlaybackState>(
          stream: _audioHandler.playbackState,
          builder: (context, snapshot) {
            final playing = snapshot.data?.playing ?? false;
            // processingState could similarly drive the UI,
            // e.g. showing a spinner while loading.
            final processingState = snapshot.data?.processingState
                ?? AudioProcessingState.idle;
            return Column(
              mainAxisAlignment: MainAxisAlignment.center,
              children: [
                if (playing)
                  ElevatedButton(child: Text("Pause"), onPressed: _audioHandler.pause)
                else
                  ElevatedButton(child: Text("Play"), onPressed: _audioHandler.play),
              ],
            );
          },
        ),
      ),
    );
  }
}
```
In the above example, we show either the "play" or the "pause" button depending on the current state, but never both at the same time.
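As an aside, if you want the builder to run only when the `playing` flag itself flips (rather than on every broadcast state change), you could derive a narrower stream first. This is a sketch, not part of the original example:

```dart
StreamBuilder<bool>(
  // Map the full PlaybackState down to the `playing` flag and drop
  // consecutive duplicates, so the button only rebuilds when it flips.
  stream: _audioHandler.playbackState
      .map((state) => state.playing)
      .distinct(),
  builder: (context, snapshot) {
    final playing = snapshot.data ?? false;
    return playing
        ? ElevatedButton(child: Text("Pause"), onPressed: _audioHandler.pause)
        : ElevatedButton(child: Text("Play"), onPressed: _audioHandler.play);
  },
)
```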
When the app has finished playing audio, it is good practice to release any system resources held for audio playback, and you may also want to deactivate the system notification. The most appropriate media action to handle this case is `stop`, which can be implemented in your audio handler as follows:
```dart
class AudioPlayerHandler extends BaseAudioHandler {
  ...

  @override
  Future<void> stop() async {
    // Release any audio decoders back to the system.
    await _player.stop();
    // Set the audio_service state to `idle` to deactivate the notification.
    playbackState.add(playbackState.value.copyWith(
      processingState: AudioProcessingState.idle,
    ));
  }
}
```
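To expose this from your Flutter UI, you might add a third button alongside the play and pause buttons (a small sketch):

```dart
ElevatedButton(child: Text("Stop"), onPressed: _audioHandler.stop),
```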
We have now come full circle. In the Flutter UI, we define buttons that send requests to the audio handler. The audio handler plays audio and modifies playback according to the requests it receives. The audio handler also broadcasts state changes back to the Flutter UI, where the UI updates itself according to the new state.
By making your audio handler the single source of truth for state and the single responsible party for audio logic, your Flutter UI will also update correctly in response to pressing the play/pause buttons on any client: on your headset, your smart watch, and other compatible UIs.
For a more complete example with a queue, skip buttons and seeking, you are encouraged to refer to audio_service/example.