Instantly Send Audio Messages With Stream Chat and Flutter

Gordon H.
Deven J.
Published September 8, 2021 · Updated April 2, 2024
Sending audio messages with Stream Chat and Flutter

Many chat applications today allow users to send voice notes as messages. In this tutorial, you’ll learn how to send voice notes, or audio attachments, in your Stream Chat Flutter app. By the end, your app will feature a chat experience similar to the one shown here.

This tutorial will cover the following sections in detail:

  • Set Up Your Stream Account
  • Create Demo User Accounts
  • Disable Authentication for Development
  • Set Up Your Flutter App
  • Create a Page to List All Channels
  • Create a Channel Page to View Messages
  • Add a Custom Action Widget to Record a Voice Note
  • Add a Custom Attachment Builder

Set Up Your Stream Account

To get started, you’ll need a Stream account to access the Stream Chat API. If you don’t have a Stream account already, you can sign up for a free 30-day trial.

If you’re working on a personal project or own a small business, you can register for a Stream Maker Account and access Stream Chat for free indefinitely.

Stream registration page

After creating your account, create an app and name it:

  1. Go to your Stream dashboard.
  2. Select Create App.
  3. Enter an App Name (like Audio Attachment Demo).
  4. Set your Server Location.
  5. Set the Environment to Development.
  6. Select Create App.
Create an app form in the Stream dashboard

After creating your app, you should see it listed in your Stream dashboard with your app’s respective API KEY and Secret.

Audio Attachment Demo with API key and Secret listed in Stream dashboard

Your API Key is only an app identifier and safe to share publicly. Your Secret helps generate authenticated user tokens and should be kept private.

From this dashboard, you can edit the app, access data, and create new apps.

Create Demo User Accounts

Stream offers many methods to create user accounts. In a production environment, you’d ideally manage user account creation and token generation server side.

However, for demo purposes, it’s easier to create accounts on your Stream dashboard.

To create demo accounts for your Stream app:

  1. Go to your Audio Attachment Demo app.
  2. Select Options.
  3. Select Open in Chat Explorer. (This will direct you to the Explorer dashboard, where you can create channels and users.)
Opening Chat Explorer in the Stream dashboard
  4. Select Users, then Create New User.
  5. In the Create New User window, enter a User Name and User Id.
  6. In the User Application Role dropdown menu, select user.
Creating a new user in the Chat Explorer window

You can create as many users as you’d like for your demo app.

Disable Authentication for Development

Any user accounts you create will require authentication tokens to access the Stream API. For demo purposes, you should disable these authentication checks and use developer tokens instead.

To disable authentication for your app:

  1. Go to your Audio Attachment Demo app.
  2. In the dashboard nav menu, select Chat.
  3. From the Chat dropdown, select Overview.
  4. Scroll to the Authentication section.
  5. Select the Disable Auth Checks toggle button.
Disabling Auth Checks for Stream Chat app users

If you don’t want to disable authentication, you can easily generate tokens using Stream’s User JWT Generator.

Generating JWTs with Stream's JWT Generator

⚠️Note: In a production scenario, you must generate a token using your server and one of Stream's server SDKs. You should never hardcode user tokens in a production application.
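To make the note above concrete: a Stream user token is a JWT signed with your app Secret using HS256, whose payload carries the user's id. In production you'd use one of Stream's server SDKs, but a minimal standard-library sketch of the idea (Python; `create_user_token` is a hypothetical helper name, not part of any SDK) looks like this:

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    # JWTs use unpadded URL-safe base64.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def create_user_token(user_id: str, api_secret: str) -> str:
    # Header and payload are base64url-encoded JSON segments.
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64url(json.dumps({"user_id": user_id}).encode())
    # The signature is an HMAC-SHA256 over "header.payload" keyed with
    # your app Secret, which is why the Secret must stay server-side.
    signing_input = f"{header}.{payload}".encode()
    signature = _b64url(
        hmac.new(api_secret.encode(), signing_input, hashlib.sha256).digest()
    )
    return f"{header}.{payload}.{signature}"

# A server endpoint would return this token to the authenticated client,
# which then passes it to connectUser instead of a dev token.
token = create_user_token("gordon", "YOUR_APP_SECRET")
```

The client never sees the Secret; it only receives the finished token.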

Set Up Your Flutter App

If you’re new to Stream, see the Stream Flutter Chat tutorial for a more thorough introduction to all the basic components available to you.

Otherwise, go ahead and create a new Flutter application from your terminal or preferred IDE with the following command:

```shell
flutter create audio_attachment_tutorial
```

Note: This tutorial was tested using version 3.19.4 of Flutter.

Open the project and add the following code to your pubspec.yaml file:

```yaml
dependencies:
  just_audio: ^0.9.36
  record: ^4.4.3
  stream_chat_flutter: ^7.1.0
```

Note: Future versions may have breaking changes. To follow this tutorial we recommend using these versions.

Create a config.dart file with the following code:

```dart
// Stream config
import 'package:flutter/material.dart';

const streamKey = 'YOUR_KEY'; // TODO: Enter your Stream Application key here

const userGordon = DemoUser(
  id: 'gordon',
  name: 'Gordon Hayes',
  image: 'https://avatars.githubusercontent.com/u/13705472?v=4',
);

const userSalvatore = DemoUser(
  id: 'salvatore',
  name: 'Salvatore Giordano',
  image: 'https://avatars.githubusercontent.com/u/20601437?v=4',
);

class DemoUser {
  final String id;
  final String name;
  final String image;

  const DemoUser({
    required this.id,
    required this.name,
    required this.image,
  });
}
```

In the snippet above, you:

  1. Set your unique App Key to streamKey. (You can get this unique key from your app's dashboard on Stream.)
  2. Created a DemoUser model to store user information.
  3. Created two demo users. These should use the same ids that you set on your Stream dashboard. (Note that you’re hardcoding a name and image; ideally, these values should be set using your server and one of Stream's server SDKs.)

Replace the code inside main.dart with the following:

```dart
import 'package:flutter/material.dart';
import 'package:stream_chat_flutter/stream_chat_flutter.dart';

import 'channel_list_page.dart';
import 'config.dart';

void main() {
  WidgetsFlutterBinding.ensureInitialized();
  final client = StreamChatClient(streamKey);
  runApp(MyApp(client: client));
}

class MyApp extends StatelessWidget {
  const MyApp({Key? key, required this.client}) : super(key: key);

  final StreamChatClient client;

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      builder: (context, widget) {
        return StreamChat(
          child: widget!,
          client: client,
        );
      },
      debugShowCheckedModeBanner: false,
      home: const SelectUserPage(),
    );
  }
}

class SelectUserPage extends StatelessWidget {
  const SelectUserPage({Key? key}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        body: Center(
          child: Column(
            mainAxisAlignment: MainAxisAlignment.center,
            children: const [
              Padding(
                padding: EdgeInsets.all(8.0),
                child: Text(
                  'Select a user',
                  style: TextStyle(fontSize: 24),
                ),
              ),
              SelectUserButton(user: userGordon),
              SelectUserButton(user: userSalvatore),
            ],
          ),
        ),
      ),
    );
  }
}

class SelectUserButton extends StatelessWidget {
  const SelectUserButton({
    Key? key,
    required this.user,
  }) : super(key: key);

  final DemoUser user;

  @override
  Widget build(BuildContext context) {
    return ElevatedButton(
      onPressed: () async {
        final client = StreamChat.of(context).client;
        await client.connectUser(
          User(
            id: user.id,
            extraData: {
              'name': user.name,
              'image': user.image,
            },
          ),
          client.devToken(user.id).rawValue,
        );
        Navigator.of(context).pushReplacement(
          MaterialPageRoute(builder: (context) => const ChannelListPage()),
        );
      },
      child: Text(user.name),
    );
  }
}
```

In the above snippet, you:

  1. Instantiated a StreamChatClient using Stream’s Flutter SDK.
  2. Created a MyApp widget with the home attribute set to SelectUserPage, and wrapped the application with a builder that creates a StreamChat widget, which handles a lot of the chat logic out of the box.
  3. Created the SelectUserPage and SelectUserButton widgets, which show two demo accounts to select from.
  4. Created an onPressed handler that connects the user using the Stream client. (Calling client.devToken(user.id) only works because you disabled auth checks earlier; in production, you would pass a server-generated token instead.)
  5. Navigated the user to the ChannelListPage after connecting them. The ChannelListPage lists all the channels for your Stream app.
Audio Attachment app profile page

Create a Page to List All Channels


Next, you’ll display a list of channels where the current user is a member. It’s better to separate the code into multiple files so that it’s easier to maintain as the code grows.

Create a new file called channel_list_page.dart and add the following code:

```dart
import 'package:audio_attachment_tutorial/config.dart';
import 'package:audio_attachment_tutorial/main.dart';
import 'package:flutter/material.dart';
import 'package:stream_chat_flutter/stream_chat_flutter.dart';

import 'channel_page.dart';

class ChannelListPage extends StatefulWidget {
  const ChannelListPage({
    Key? key,
  }) : super(key: key);

  @override
  State<ChannelListPage> createState() => _ChannelListPageState();
}

class _ChannelListPageState extends State<ChannelListPage> {
  late final _controller = StreamChannelListController(
    client: StreamChat.of(context).client,
    filter: Filter.in_('members', [StreamChat.of(context).currentUser!.id]),
    channelStateSort: const [SortOption('last_message_at')],
    limit: 30,
  );

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: const Text('Stream Chat'),
        actions: [
          GestureDetector(
            onTap: () async {
              await StreamChat.of(context).client.disconnectUser();
              Navigator.of(context).pushReplacement(
                MaterialPageRoute(builder: (context) => const SelectUserPage()),
              );
            },
            child: const Padding(
              padding: EdgeInsets.all(8.0),
              child: Center(
                child: Text('Switch user'),
              ),
            ),
          )
        ],
      ),
      body: StreamChannelListView(
        controller: _controller,
        emptyBuilder: (context) {
          return Center(
            child: ElevatedButton(
              onPressed: () async {
                final channel = StreamChat.of(context).client.channel(
                  'messaging',
                  id: 'test-gordon',
                  extraData: {
                    'name': 'Flutter Chat',
                    'image':
                        'https://flutter.dev/assets/images/shared/brand/flutter/logo/flutter-lockup.png',
                    'members': [userGordon.id, userSalvatore.id],
                  },
                );
                await channel.create();
              },
              child: const Text('Create channel'),
            ),
          );
        },
        onChannelTap: (channel) => Navigator.push(
          context,
          MaterialPageRoute(
            builder: (_) => StreamChannel(
              channel: channel,
              child: const ChannelPage(),
            ),
          ),
        ),
      ),
    );
  }
}
```

In the snippet above, you:

  1. Created a Scaffold with the body set to the list of channels and a button in the actions attribute to disconnect the current user and navigate back to the SelectUserPage.
  2. Created a StreamChannelListView widget (provided by the Stream package) that displays all channels for your Stream app. This widget requires a StreamChannelListController, which is created and disposed in the stateful widget.
  3. Created an emptyBuilder, which renders when there are no channels in your app. In the emptyBuilder, you return a button that creates a new group channel and sets the members to the users you created.
  4. Provided the onChannelTap behaviour, which will open the ChannelPage (a custom widget you’ll create next).
  5. Added a filter to only show channels where the current user is a member.
  6. Added sort and limit, which you can customize as needed.
Channels list page

Create a Channel Page to View Messages

To display the message list view, create a new file called channel_page.dart and add the following code:

```dart
import 'package:flutter/material.dart';
import 'package:stream_chat_flutter/stream_chat_flutter.dart';

class ChannelPage extends StatefulWidget {
  const ChannelPage({
    Key? key,
  }) : super(key: key);

  @override
  _ChannelPageState createState() => _ChannelPageState();
}

class _ChannelPageState extends State<ChannelPage> {
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: const StreamChannelHeader(),
      body: Column(
        children: <Widget>[
          Expanded(
            child: StreamMessageListView(),
          ),
          StreamMessageInput(),
        ],
      ),
    );
  }
}
```

In this snippet, you:

  1. Created a Scaffold for the new page.
  2. Set the appBar to be a StreamChannelHeader (this shows the channel name and image).
  3. Created a Column with an expanded StreamMessageListView (a list that displays all channel messages, images, and custom attachments).
  4. Created a StreamMessageInput at the bottom of the Column, which is used to send new messages and attachments to the channel.

If you run the app now, you’ll find that with only a small amount of code you have a pretty robust messaging app, complete with all the necessary functionality.

Now, you’re at the point where you can add the functionality to support audio messaging.

Add a Custom Action Widget to Record a Voice Note

To support audio messaging, you’ll add a custom action widget so users can record a voice note and send it as a message. At the end of this section, your widget should look like this:

Custom action widget

In your StreamMessageInput, provide the following custom action:

```dart
StreamMessageInput(
  actionsBuilder: (context, list) {
    return [
      RecordButton(
        recordingFinishedCallback: _recordingFinishedCallback,
      ),
    ];
  },
),
```

You’ll create the _recordingFinishedCallback method later. First, create a new file called record_button.dart and add the following code:

```dart
import 'package:flutter/material.dart';
import 'package:record/record.dart';
import 'package:stream_chat_flutter/stream_chat_flutter.dart';

typedef RecordCallback = void Function(String);

class RecordButton extends StatefulWidget {
  const RecordButton({
    Key? key,
    required this.recordingFinishedCallback,
  }) : super(key: key);

  final RecordCallback recordingFinishedCallback;

  @override
  _RecordButtonState createState() => _RecordButtonState();
}

class _RecordButtonState extends State<RecordButton> {
  bool _isRecording = false;
  final _audioRecorder = Record();

  Future<void> _start() async {
    try {
      if (await _audioRecorder.hasPermission()) {
        await _audioRecorder.start();
        bool isRecording = await _audioRecorder.isRecording();
        setState(() {
          _isRecording = isRecording;
        });
      }
    } catch (e) {
      print(e);
    }
  }

  Future<void> _stop() async {
    final path = await _audioRecorder.stop();
    widget.recordingFinishedCallback(path!);
    setState(() => _isRecording = false);
  }

  @override
  Widget build(BuildContext context) {
    late final IconData icon;
    late final Color? color;

    if (_isRecording) {
      icon = Icons.stop;
      color = Colors.red.withOpacity(0.3);
    } else {
      color = StreamChatTheme.of(context).primaryIconTheme.color;
      icon = Icons.mic;
    }

    return GestureDetector(
      onTap: () {
        _isRecording ? _stop() : _start();
      },
      child: Icon(
        icon,
        color: color,
      ),
    );
  }
}
```

In this widget, you:

  1. Created an instance of Record called _audioRecorder, which uses the Record package to make it easy to capture audio recordings in Flutter. (The record package requires some minimal iOS and Android set up to use; read the Flutter Package docs for more information.)
  2. Created _start and _stop methods to control the audio recording.
  3. Created a build method that uses a GestureDetector to start or stop a recording.
  4. Used a RecordCallback type definition to send back the string path of the recorded file (called in the _stop method).
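The platform setup the record package needs is small. As a sketch (the exact requirements can change between versions, so check the package docs), you declare a microphone usage description on iOS and the record-audio permission on Android:

```xml
<!-- ios/Runner/Info.plist -->
<key>NSMicrophoneUsageDescription</key>
<string>This app records voice notes to send in chat.</string>

<!-- android/app/src/main/AndroidManifest.xml -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
```

Without these, _audioRecorder.hasPermission() will fail or the app will crash when recording starts.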

Go back to the channel_page.dart and create the _recordingFinishedCallback method in the _ChannelPageState class:

```dart
void _recordingFinishedCallback(String path) {
  final uri = Uri.parse(path);
  File file = File(uri.path);
  file.length().then(
    (fileSize) {
      StreamChannel.of(context).channel.sendMessage(
        Message(
          attachments: [
            Attachment(
              type: 'voicenote',
              file: AttachmentFile(
                size: fileSize,
                path: uri.path,
              ),
            )
          ],
        ),
      );
    },
  );
}
```

When the recording is finished, the _recordingFinishedCallback will be called. It does the following:

  1. Parses the path to the URI.
  2. Creates a new File from the uri.path.
  3. Uses the then callback on file.length to handle the file length as it’s retrieved (the file length is needed to upload the attachment to Stream).
  4. Gets the current channel using StreamChannel.of(context).channel.
  5. Calls sendMessage on the channel and provides a Message with the Attachment.
  6. Sets the type to voicenote (this can be any identifier) and creates an AttachmentFile with the path and size of the file.

Now, you have the necessary functionality to record an audio file and upload it to Stream.

Add a Custom Attachment Builder

With the current code, the StreamMessageListView still doesn’t know how to render attachments with the type: ‘voicenote’. You must tell it how with the messageBuilder argument.

In channel_page.dart, change your StreamMessageListView to the following:

```dart
StreamMessageListView(
  messageBuilder: (context, details, messages, defaultMessage) {
    return defaultMessage.copyWith(
      attachmentBuilders: [
        AudioAttachmentBuilder(),
        ...StreamAttachmentWidgetBuilder.defaultBuilders(
          message: details.message,
        ),
      ],
    );
  },
),
```

The messageBuilder gets the defaultMessage, which has a copyWith method to override the attachmentBuilders for the list view. You're creating a custom builder named AudioAttachmentBuilder for the voicenote type (a type you specified). To do this, you can extend the StreamAttachmentWidgetBuilder class:

```dart
class AudioAttachmentBuilder extends StreamAttachmentWidgetBuilder {
  @override
  Widget build(
    BuildContext context,
    Message message,
    Map<String, List<Attachment>> attachments,
  ) {
    final url = attachments['voicenote']?.first.assetUrl;
    late final Widget widget;
    if (url == null) {
      widget = const AudioLoadingMessage();
    } else {
      widget = AudioPlayerMessage(
        source: AudioSource.uri(Uri.parse(url)),
        id: message.id,
      );
    }
    return SizedBox(
      width: 250,
      height: 50,
      child: widget,
    );
  }

  @override
  bool canHandle(Message message, Map<String, List<Attachment>> attachments) {
    final audioAttachments = attachments['voicenote'];
    return audioAttachments != null && audioAttachments.length == 1;
  }
}
```

In the builder, you check whether the first attachment's assetUrl is null. If it is, you return AudioLoadingMessage. Otherwise, you return AudioPlayerMessage and specify the AudioSource. (The AudioSource is a class that comes from the just_audio package and uses the attachment URL to load the audio.)

Next, create a new file called audio_loading_message.dart and add the following:

```dart
import 'package:flutter/material.dart';

class AudioLoadingMessage extends StatelessWidget {
  const AudioLoadingMessage({
    Key? key,
  }) : super(key: key);

  @override
  Widget build(BuildContext context) {
    return Padding(
      padding: const EdgeInsets.all(8.0),
      child: Row(
        mainAxisSize: MainAxisSize.min,
        crossAxisAlignment: CrossAxisAlignment.center,
        children: const [
          SizedBox(
            height: 20,
            width: 20,
            child: CircularProgressIndicator(
              strokeWidth: 3,
            ),
          ),
          Padding(
            padding: EdgeInsets.only(left: 16.0),
            child: Icon(Icons.mic),
          ),
        ],
      ),
    );
  }
}
```

This code shows a loading indicator while the audio attachment uploads.

Finally, create a file called audio_player_message.dart and add the following code:

```dart
import 'dart:async';

import 'package:audio_attachment_tutorial/audio_loading_message.dart';
import 'package:flutter/cupertino.dart';
import 'package:flutter/material.dart';
import 'package:just_audio/just_audio.dart';

class AudioPlayerMessage extends StatefulWidget {
  const AudioPlayerMessage({
    Key? key,
    required this.source,
    required this.id,
  }) : super(key: key);

  final AudioSource source;
  final String id;

  @override
  AudioPlayerMessageState createState() => AudioPlayerMessageState();
}

class AudioPlayerMessageState extends State<AudioPlayerMessage> {
  final _audioPlayer = AudioPlayer();

  late StreamSubscription<PlayerState> _playerStateChangedSubscription;

  late Future<Duration?> futureDuration;

  @override
  void initState() {
    super.initState();
    _playerStateChangedSubscription =
        _audioPlayer.playerStateStream.listen(playerStateListener);
    futureDuration = _audioPlayer.setAudioSource(widget.source);
  }

  void playerStateListener(PlayerState state) async {
    if (state.processingState == ProcessingState.completed) {
      await reset();
    }
  }

  @override
  void dispose() {
    _playerStateChangedSubscription.cancel();
    _audioPlayer.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return FutureBuilder<Duration?>(
      future: futureDuration,
      builder: (context, snapshot) {
        if (snapshot.hasData) {
          return Row(
            mainAxisSize: MainAxisSize.min,
            children: <Widget>[
              _controlButtons(),
              _slider(snapshot.data),
            ],
          );
        }
        return const AudioLoadingMessage();
      },
    );
  }

  Widget _controlButtons() {
    return StreamBuilder<bool>(
      stream: _audioPlayer.playingStream,
      builder: (context, _) {
        final color =
            _audioPlayer.playerState.playing ? Colors.red : Colors.blue;
        final icon =
            _audioPlayer.playerState.playing ? Icons.pause : Icons.play_arrow;
        return Padding(
          padding: const EdgeInsets.all(4.0),
          child: GestureDetector(
            onTap: () {
              if (_audioPlayer.playerState.playing) {
                pause();
              } else {
                play();
              }
            },
            child: SizedBox(
              width: 40,
              height: 40,
              child: Icon(icon, color: color, size: 30),
            ),
          ),
        );
      },
    );
  }

  Widget _slider(Duration? duration) {
    return StreamBuilder<Duration>(
      stream: _audioPlayer.positionStream,
      builder: (context, snapshot) {
        if (snapshot.hasData && duration != null) {
          return CupertinoSlider(
            value: snapshot.data!.inMicroseconds / duration.inMicroseconds,
            onChanged: (val) {
              _audioPlayer.seek(duration * val);
            },
          );
        } else {
          return const SizedBox.shrink();
        }
      },
    );
  }

  Future<void> play() {
    return _audioPlayer.play();
  }

  Future<void> pause() {
    return _audioPlayer.pause();
  }

  Future<void> reset() async {
    await _audioPlayer.stop();
    return _audioPlayer.seek(const Duration(milliseconds: 0));
  }
}
```

This widget controls the audio playback for a voice note, allowing you to play, pause, and skip to different parts of the audio file with a slider.

For more information on how to use this package, see the just_audio package documentation.

By the end, your channel_page.dart imports should look similar to this:

```dart
import 'dart:io';

import 'package:audio_attachment_tutorial/audio_loading_message.dart';
import 'package:flutter/material.dart';
import 'package:just_audio/just_audio.dart';
import 'package:stream_chat_flutter/stream_chat_flutter.dart';

import 'audio_player_message.dart';
import 'record_button.dart';
```

Wrapping Up

That's it 🎉! You should see a channel page like the one below that allows you to record and send voice notes:

Channel page

To see the full source code for this Stream Chat Flutter app, see the Stream Audio Attachment Tutorial repository on GitHub.

There are several other Stream Flutter packages that provide various levels of UI and low-level chat control, including offline support and localization. See the Stream Chat Flutter GitHub for more information.

Lastly, you can also subscribe to the Stream Developers YouTube channel for more exclusive dev content.

Happy coding!
