WebRTC Video Calling with Flutter
Introduction
In this course, we have covered the concepts and architecture of WebRTC and how it standardizes video calling across devices and frameworks. Now, it’s our turn to implement it on each framework.
WebRTC on Flutter is usually implemented through the flutter_webrtc library, which has the requisite WebRTC code for all platforms that Flutter supports. The plugin abstracts away several difficult-to-implement parts of WebRTC, and the app built in this tutorial is based on the example code that ships with the plugin.
In this tutorial, we’ll add a WebRTC-based calling solution to a Flutter application.
Setting Up the flutter_webrtc Plugin
Various components must be set up to facilitate a full video-calling experience. The first one is adding the base WebRTC plugin to your Flutter app. In this lesson, we focus only on Android and iOS; note that other platforms may require additional setup.
First up, add the plugin to your pubspec.yaml file:
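The version below is illustrative; check the flutter_webrtc page on pub.dev for the latest release:

```yaml
dependencies:
  flutter:
    sdk: flutter
  # Use the latest version from pub.dev.
  flutter_webrtc: ^0.9.0
```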
For iOS, we need to let the platform know that we will use the microphone and camera for calls through the Info.plist file:
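The usage-description strings are up to you; something like the following works:

```xml
<key>NSCameraUsageDescription</key>
<string>$(PRODUCT_NAME) uses your camera for video calls</string>
<key>NSMicrophoneUsageDescription</key>
<string>$(PRODUCT_NAME) uses your microphone for calls</string>
```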
Similarly, for Android, we declare the same in the AndroidManifest.xml file:
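A typical set of permissions for a calling app looks like this (adjust to your app's needs):

```xml
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
```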
You can also add extra permissions in order to use Bluetooth devices:
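For example, on Android 12 (API 31) and above, BLUETOOTH_CONNECT replaces the legacy BLUETOOTH permission:

```xml
<uses-permission android:name="android.permission.BLUETOOTH" android:maxSdkVersion="30" />
<uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
```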
Additionally, you must set your app to target Java 8 through the app-level build.gradle file for Android:
Adding a WebSocket
Any WebRTC-based application needs to communicate with several servers and components. To do this, we must first set up a WebSocket connection that allows us to talk to these servers.
Here is a class that establishes a WebSocket connection to a server via the URL provided:
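The following is a minimal sketch of such a wrapper; the class and callback names are illustrative, simplified from the plugin's example project:

```dart
import 'dart:io';

/// A thin wrapper around dart:io's WebSocket, exposing simple callbacks.
class SimpleWebSocket {
  SimpleWebSocket(this.url);

  final String url;
  WebSocket? _socket;

  Function()? onOpen;
  Function(dynamic msg)? onMessage;
  Function(int? code, String? reason)? onClose;

  Future<void> connect() async {
    try {
      _socket = await WebSocket.connect(url);
      onOpen?.call();
      // Forward every incoming frame and report when the socket closes.
      _socket!.listen(
        (data) => onMessage?.call(data),
        onDone: () => onClose?.call(_socket!.closeCode, _socket!.closeReason),
      );
    } catch (e) {
      onClose?.call(500, e.toString());
    }
  }

  void send(String data) => _socket?.add(data);

  void close() => _socket?.close();
}
```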
Signaling Server
While the overall WebRTC framework standardizes several aspects of creating a video calling experience, the one thing it leaves out is the signaling server. When creating a calling platform, you need to set up your own signaling server to connect devices. We went through the process of creating your own signaling server in a previous lesson; in this lesson, for the sake of brevity, we will use the signaling server that the flutter_webrtc plugin provides.
The signaling server project deals with several connection aspects in WebRTC, not just basic signaling. The class in our project dealing with the connection to the signaling server must create a WebSocket and listen for changes as well as any messages sent over the socket:
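Below is a hedged sketch of that connection logic, building on the SimpleWebSocket wrapper above; the enum and callback names are illustrative:

```dart
import 'dart:convert';

// Signaling connection states; names are illustrative.
enum SignalingState { connectionOpen, connectionClosed, connectionError }

class Signaling {
  Signaling(this._host);

  final String _host;
  SimpleWebSocket? _socket;
  Function(SignalingState state)? onSignalingStateChange;

  Future<void> connect() async {
    _socket = SimpleWebSocket('wss://$_host/ws');

    _socket!.onOpen = () {
      onSignalingStateChange?.call(SignalingState.connectionOpen);
    };

    // Every signaling message arrives as a JSON string over the socket.
    _socket!.onMessage = (message) {
      onMessage(jsonDecode(message as String) as Map<String, dynamic>);
    };

    _socket!.onClose = (code, reason) {
      onSignalingStateChange?.call(SignalingState.connectionClosed);
    };

    await _socket!.connect();
  }

  void onMessage(Map<String, dynamic> message) {
    // Dispatch on the message type (offer, answer, candidate, ...),
    // as shown in the next section.
  }
}
```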
The entire class that connects to and manages the signaling server connection is larger and has been simplified here, but you can view it in full in the project repository.
Transferring Data Between Peers
When a peer (a device on the network) wants to establish a call with another peer, it needs to create and send an offer to it. It also needs to specify the session description, which defines several details about the potential call. See the lesson on the Session Description Protocol (SDP) for more information.
Once the other peer receives the offer, it needs to accept or reject it. To do this, it creates an answer and sends it back to the first peer. If the offer is accepted, the two peers can initiate a call session and start exchanging data.
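As a rough sketch of the first half of this exchange (the `send` parameter is a hypothetical helper that serializes a message over the signaling socket):

```dart
import 'package:flutter_webrtc/flutter_webrtc.dart';

/// Creates an offer and forwards it to the remote peer via `send`.
Future<void> createAndSendOffer(
    RTCPeerConnection pc, void Function(Map<String, dynamic>) send) async {
  // The offer carries our session description (SDP).
  final offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  send({
    'type': 'offer',
    'data': {'sdp': offer.sdp, 'type': offer.type},
  });
}
```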
We must monitor messages arriving on the socket, such as offers, answers, peers, and ICE candidates. Based on the data received, we can update local information on each device and initiate calls:
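Here is a hedged sketch of such a handler, assuming each decoded message carries a 'type' field and a 'data' payload (the exact shape depends on your signaling protocol):

```dart
import 'package:flutter_webrtc/flutter_webrtc.dart';

/// Dispatches a decoded signaling message against a peer connection.
Future<void> handleMessage(
    RTCPeerConnection pc, Map<String, dynamic> message) async {
  final data = message['data'];
  switch (message['type']) {
    case 'offer':
      // A remote peer is calling us: store its description and answer.
      await pc.setRemoteDescription(
          RTCSessionDescription(data['sdp'], data['type']));
      final answer = await pc.createAnswer();
      await pc.setLocalDescription(answer);
      // Sending the answer back over the socket is elided here.
      break;
    case 'answer':
      // The remote peer accepted our offer.
      await pc.setRemoteDescription(
          RTCSessionDescription(data['sdp'], data['type']));
      break;
    case 'candidate':
      // An ICE candidate describing how to reach the remote peer.
      await pc.addCandidate(RTCIceCandidate(
          data['candidate'], data['sdpMid'], data['sdpMLineIndex']));
      break;
  }
}
```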
Building the Video Renderer
Once we receive the data stream from other peers, we can start displaying each user's video. To do this, we create an RTCVideoRenderer, which represents a single user's video stream. For a call between two users, we create a local renderer and a remote renderer representing the two participants.
We can then listen to remote and local stream changes using onAddRemoteStream and onLocalStream respectively (part of the signaling class in the project). These renderers can then be passed along to the RTCVideoView widget in order to be displayed:
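Putting this together, here is a minimal sketch of a call screen; the `signaling` object is assumed to expose the onLocalStream and onAddRemoteStream callbacks mentioned above:

```dart
import 'package:flutter/material.dart';
import 'package:flutter_webrtc/flutter_webrtc.dart';

class CallView extends StatefulWidget {
  const CallView({super.key, required this.signaling});

  // Assumed to expose onLocalStream/onAddRemoteStream, as in the project.
  final dynamic signaling;

  @override
  State<CallView> createState() => _CallViewState();
}

class _CallViewState extends State<CallView> {
  final _localRenderer = RTCVideoRenderer();
  final _remoteRenderer = RTCVideoRenderer();

  @override
  void initState() {
    super.initState();
    _init();
  }

  Future<void> _init() async {
    // Renderers must be initialized before a stream can be attached.
    await _localRenderer.initialize();
    await _remoteRenderer.initialize();

    widget.signaling.onLocalStream = (MediaStream stream) {
      setState(() => _localRenderer.srcObject = stream);
    };
    widget.signaling.onAddRemoteStream = (MediaStream stream) {
      setState(() => _remoteRenderer.srcObject = stream);
    };
  }

  @override
  Widget build(BuildContext context) {
    return Column(children: [
      // The remote participant on top, our own (mirrored) preview below.
      Expanded(child: RTCVideoView(_remoteRenderer)),
      Expanded(child: RTCVideoView(_localRenderer, mirror: true)),
    ]);
  }

  @override
  void dispose() {
    _localRenderer.dispose();
    _remoteRenderer.dispose();
    super.dispose();
  }
}
```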
Adding Controls
Simply creating a video feed isn’t enough. We need to add controls that let users turn their input devices, such as the camera and microphone, on or off. The flutter_webrtc plugin helps us via the Helper class, which allows us to easily switch cameras. To enable or disable the microphone, we can toggle the audio track directly.
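For example (a sketch assuming `localStream` is the MediaStream obtained from getUserMedia):

```dart
import 'package:flutter_webrtc/flutter_webrtc.dart';

void switchCamera(MediaStream localStream) {
  // Helper.switchCamera flips between the front and back cameras.
  Helper.switchCamera(localStream.getVideoTracks().first);
}

void setMicrophoneEnabled(MediaStream localStream, bool enabled) {
  // Disabling the audio track mutes the microphone without ending the call.
  localStream.getAudioTracks().first.enabled = enabled;
}
```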
The Helper class also enables you to do several other things, such as selecting an audio input/output device, enabling/disabling speakerphone, and more:
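As a rough sketch (exact availability of these helpers varies by platform and plugin version; `selectAudioOutput` is assumed to exist in your version of the plugin):

```dart
import 'package:flutter_webrtc/flutter_webrtc.dart';

Future<void> configureAudio() async {
  // Route call audio through the loudspeaker instead of the earpiece.
  await Helper.setSpeakerphoneOn(true);

  // Enumerate media devices and select a specific audio output.
  final devices = await navigator.mediaDevices.enumerateDevices();
  final outputs = devices.where((d) => d.kind == 'audiooutput').toList();
  if (outputs.isNotEmpty) {
    await Helper.selectAudioOutput(outputs.first.deviceId);
  }
}
```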
And that’s it! Combining these aspects creates a simple, minimal video-calling experience with WebRTC and Flutter.
The Stream Video Flutter SDK
While this guide provides a hands-on approach to building video streaming capabilities from scratch using WebRTC and Flutter, we understand the complexity and time investment required to create a robust real-time communication solution. For developers seeking a more streamlined, ready-to-use option that minimizes integration efforts, Stream's Video SDK for Flutter might be a suitable option.
To kickstart your development with Stream's Video SDK for Flutter, we recommend exploring the following tutorials tailored to specific use cases:
- Video-Calling App Tutorial: Ideal for developers aiming to integrate a video-calling experience into their Flutter applications.
- Audio-Room Tutorial: Perfect for those interested in creating audio-only experiences such as audio rooms.
- Livestreaming Tutorial: Demonstrates creating your own live-streaming experience.
Conclusion
You can find the full project for this tutorial on the Stream GitHub.
In this tutorial, we've taken a deep dive into implementing real-time video streaming in Flutter using WebRTC, focusing on the practical steps necessary to set up a peer-to-peer video-sharing connection. With this knowledge, you're now equipped to explore the vast potential of interactive media in your Flutter projects.