Building a Full-Stack FaceTime Clone with SwiftUI

This article guides you through building a FaceTime clone with SwiftUI and Stream's iOS Video SDK so you can have face-to-face chats with friends and family.

Amos G.
Published July 11, 2023
FaceTime clone header image

Like many others, I enjoy using FaceTime to chat with family and friends. The app makes it easy to have real-time one-to-one or group audio and video conversations on any of Apple's devices.

As a user, it's often easy to forget the sheer complexity that lives under the surface of a seemingly common application like FaceTime.

To explore some underlying complexities of building a calling application, let's attempt to recreate Apple's magical FaceTime experience using SwiftUI and Stream's Video API.

To get started, you will need a free account on Stream, a new Xcode project, and a fresh cup of your favorite coffee ☕️!

Overview of Stream Video iOS

Components of the SDK

Stream's iOS Video SDK consists of three separate SDKs:

  • UIKit SDK: A UIKit wrapper for SwiftUI components
  • Low-level client: A WebRTC implementation responsible for establishing calls
  • SwiftUI SDK: SwiftUI components for building call flows such as live streaming, drop-in audio, and voice/video calling.

The separation into multiple layers allows developers to choose the flavor of the SDK that best fits their use case. For example, if you are building a simple calling or meetings application with little need for customization, the high-level UIKit SDK makes the integration process as simple as copying a few lines of code and rebuilding your application.

For use cases that require bespoke designs and behavior, the low-level client provides APIs that give developers more granular control over the look and feel of their experience.
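To give a sense of what working with the low-level client looks like, here is a minimal, hedged sketch of creating and joining a call without any of the prebuilt UI. The method names client.call(callType:callId:) and call.join(create:) are assumptions based on the SDK at the time of writing and may differ in your version, so verify them against the current documentation.

```swift
import StreamVideo

// A minimal sketch, assuming `client` is a configured StreamVideo instance
// (see the setup in Step 3 below). Method names may vary across SDK releases.
func startBareCall(client: StreamVideo, callId: String) async throws {
    // Obtain a call object for the given call type and id...
    let call = client.call(callType: "default", callId: callId)
    // ...and join it, creating it on the backend if it does not exist yet.
    try await call.join(create: true)
}
```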

Project Setup

Image showing group calling

Let's begin by cloning the sample project from GitHub. We will use this project as a starting point for our integration. Since FaceTime is a relatively complex application, we have already built some of the basic UI and app scaffolding for you.

Next, if you have not done so already, sign up for a free account to obtain a Stream API key. We will use the API key at later stages of the tutorial to initialize the SDK and make our first API call. For detailed instructions on how to use the Dashboard, consider checking out our companion guide.

At the end of this blog post, you will be able to create and join a call from our app, similar to the video below. With the boring stuff out of the way, let's dive into some code 🛠️!

Create a New SwiftUI Project

Launch Xcode and create a new SwiftUI application. Name the project as you want. This demo uses Face2Face as the project name.

Step 1: Fetch the SwiftUI Video SDK

Once you have a blank SwiftUI project, you can install the video SDK. Select File -> Add Packages (Note: Add Package Dependencies if you are using Xcode 15+) in Xcode’s toolbar. Copy and paste the URL https://github.com/GetStream/stream-video-swift.git into the search bar and click Add Package. Follow the remaining prompts to complete the installation.
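If you manage dependencies through a Package.swift manifest instead of the Xcode UI (for example, in a separate module), the same dependency can be declared there. The snippet below is only a sketch: the product names StreamVideo and StreamVideoSwiftUI mirror the modules imported later in this tutorial, and you should pin the version or branch that matches your setup rather than the placeholder used here.

```swift
// swift-tools-version: 5.9
import PackageDescription

let package = Package(
    name: "Face2Face",
    platforms: [.iOS(.v16)],
    dependencies: [
        // Pin the version or branch that matches the release you installed through Xcode.
        .package(url: "https://github.com/GetStream/stream-video-swift.git", branch: "main")
    ],
    targets: [
        .target(
            name: "Face2Face",
            dependencies: [
                .product(name: "StreamVideo", package: "stream-video-swift"),
                .product(name: "StreamVideoSwiftUI", package: "stream-video-swift")
            ]
        )
    ]
)
```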

Understanding the Package Dependencies

The video SDK has the following packages as dependencies:

  • Nuke: It provides an efficient way to download and display images in the app
  • StreamVideo: The Stream Video SDK
  • SwiftProtobuf: A Swift implementation of Google's Protocol Buffers ("protobuf") serialization technology
  • WebRTC: The WebRTC SDK offers support for establishing audio and video calls

Step 2: Set Privacies - Camera and Microphone Usage Descriptions

Since you are building an iOS calling app that accesses the user's protected resources, such as the microphone and camera, you are required to provide camera and microphone usage descriptions. Read Setting Background Modes and Device Capability Privacies in iOS Apps to learn more.

To configure access to the user's camera and microphone:

  1. Select your project in the Xcode Project Navigator.
  2. Under Targets, click your app's target and open the Info tab. The target in this demo is Face2Face. However, yours may be different.
  3. Under the Key column, click the + button on the right side of any of the key items and scroll to the privacy section.
  4. Click Privacy - Camera Usage Description (the raw key NSCameraUsageDescription) and add a string that explains why your app needs camera access. This demo uses Face2FaceApp would like to access your camera as the string.
  5. Repeat step 4 above to add Privacy - Microphone Usage Description (NSMicrophoneUsageDescription), as shown in the image below.
Set camera and microphone usage privacies

Step 3: Configure the Video SDK

  1. Create a User Object
    To access the video SDK and its SwiftUI components, you need a user to connect to the SDK's backend. The user can be authenticated, anonymous, or a guest. The user's credentials are used to initialize the Stream Video client. In this demo, we provide the user's credentials for you.
```swift
let user = User(
    id: userId,
    name: "Martin", // name and imageURL are used in the UI
    imageURL: .init(string: "https://getstream.io/static/2796a305dd07651fcceb4721a94f4505/a3911/martin-mitrevski.webp")
)
```
  2. Initialize the Stream Video Client
    Here, you initialize the video client with the user, API key, and token. You can find the API key in your Stream dashboard. If you are new to Stream, you can sign up for a new account.
```swift
// Initialize Stream Video client
self.client = StreamVideo(
    apiKey: apiKey,
    user: user,
    token: .init(stringLiteral: token)
)
```
  3. Putting It All Together in Face2FaceApp.swift
    When integrating Stream Video with a production iOS app, always configure the StreamVideo object as early as possible in your application's lifecycle. For UIKit-based applications, you can initialize it in AppDelegate's application(_:didFinishLaunchingWithOptions:) method. SwiftUI apps do not implement an AppDelegate by default, so you should initialize the StreamVideo object within the declaration of your App conformer (https://developer.apple.com/documentation/swiftui/app/main()). This makes Stream Video available as soon as the app launches.

Since SwiftUI does not implement an AppDelegate, we will put all of the SDK's initialization and configuration in the app's conformer file, Face2FaceApp.swift (that is, YourAppNameApp.swift in your own project).

```swift
import SwiftUI
import StreamVideo
import StreamVideoSwiftUI

@main
struct VideoCallApp: App {
    @ObservedObject var viewModel: CallViewModel

    private var client: StreamVideo

    private let apiKey: String = "mmhfdzb5evj2" // The API key can be found in the Credentials section
    private let userId: String = "Jacen_Solo" // The User Id can be found in the Credentials section
    private let token: String = "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyX2lkIjoiSmFjZW5fU29sbyIsImlzcyI6InByb250byIsInN1YiI6InVzZXIvSmFjZW5fU29sbyIsImlhdCI6MTY5MTEzMTkzMSwiZXhwIjoxNjkxNzM2NzM2fQ.g9OYWrBFfNPYhuVIEvmyyZFNxl7qApsAD2WixxmZhCg" // The Token can be found in the Credentials section
    let callId: String = "imirKfxKjuXC" // The CallId can be found in the Credentials section

    init() {
        let user = User(
            id: userId,
            name: "Martin", // name and imageURL are used in the UI
            imageURL: .init(string: "https://getstream.io/static/2796a305dd07651fcceb4721a94f4505/a3911/martin-mitrevski.webp")
        )

        // Initialize Stream Video client
        self.client = StreamVideo(
            apiKey: apiKey,
            user: user,
            token: .init(stringLiteral: token)
        )

        self.viewModel = .init()
    }

    var body: some Scene {
        WindowGroup {
            NavigationStack {
                ZStack {
                    VStack {
                        if viewModel.call != nil {
                            CallContainer(viewFactory: DefaultViewFactory.shared, viewModel: viewModel)
                        } else {
                            NewFace2Face(viewModel: viewModel)
                        }
                    }
                }
            }
        }
    }
}
```

Step 4: Add UI to Initiate a Call

  1. Create and Join a New Call
    After creating a user, you can create and join a call with the sample code below. If you want the call to start immediately after the app launches, you can add this implementation to your app's conformer file, Face2FaceApp.swift, described in the previous section. In this demo, however, we want the call to start with a tap gesture, so we will add the create-and-join functionality in the next step.
```swift
Task {
    guard viewModel.call == nil else { return }
    viewModel.joinCall(callType: .default, callId: callId)
}
```
  2. Creating the Start Call UI
    For the sake of simplicity, the following home screen is all you need to establish the calling functionality for this demo.
Home screen interface

The home screen consists of a button that initiates the call and a blurred background that displays the live camera feed from the iOS device. The live camera feed on the home screen's background is not part of the video SDK; however, you can find the code for it in the Xcode project under the LiveCameraView folder.
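If you are not starting from the sample project, a wrapper like the HostedViewController used in the view below can be built with a UIViewControllerRepresentable around an AVCaptureSession. The sketch that follows is only an illustration of that approach, not the sample project's exact implementation:

```swift
import SwiftUI
import UIKit
import AVFoundation

// Illustrative only: the demo project ships its own camera code in the
// LiveCameraView folder. This sketch shows one way such a wrapper could work.
final class CameraPreviewController: UIViewController {
    private let session = AVCaptureSession()
    private let previewLayer = AVCaptureVideoPreviewLayer()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Use the front camera, mirroring FaceTime's home screen.
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        previewLayer.session = session
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)

        // Start the session off the main thread so the UI stays responsive.
        DispatchQueue.global(qos: .userInitiated).async { [session] in
            session.startRunning()
        }
    }

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        previewLayer.frame = view.bounds
    }
}

// SwiftUI wrapper used as the blurred background of the home screen.
struct HostedViewController: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> CameraPreviewController {
        CameraPreviewController()
    }

    func updateUIViewController(_ uiViewController: CameraPreviewController, context: Context) {}
}
```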

Add a new file in the Project navigator to contain the composition of the home screen. In the demo Xcode project, you will find NewFace2Face.swift, but you can name yours as you want. Replace the content of the file you created with the code below.

```swift
import SwiftUI
import StreamVideoSwiftUI

struct NewFace2Face: View {
    @ObservedObject var viewModel: CallViewModel
    private let callId: String = "imirKfxKjuXC"

    var body: some View {
        NavigationStack {
            ZStack {
                HostedViewController()
                    .ignoresSafeArea()
                    .blur(radius: 8)
                    .blendMode(.plusLighter)

                VStack {
                    HStack {
                        NavigationLink {
                        } label: {
                            VStack {
                                Image(systemName: "link")
                                Text("Create Link")
                                    .lineLimit(1)
                            }
                            .padding(EdgeInsets(top: 7, leading: 42, bottom: 7, trailing: 42))
                        }
                        .buttonStyle(.plain)
                        .background(.ultraThinMaterial)
                        .cornerRadius(12)

                        NavigationLink {
                            //
                        } label: {
                            VStack {
                                Image(systemName: "video.fill")
                                Text("New Face2Face")
                                    .lineLimit(1)
                            }
                            .padding(.horizontal)
                            .accessibilityAddTraits(.isButton)
                            .onTapGesture {
                                Task {
                                    guard viewModel.call == nil else { return }
                                    viewModel.joinCall(callType: .default, callId: callId)
                                }
                            }
                        }
                        .buttonStyle(.borderedProminent)
                    }
                    .padding(.bottom, 44)

                    List {
                        Section {
                        } header: {
                            Text("Today")
                        }

                        NavigationLink {
                        } label: {
                            HStack {
                                Image(systemName: "h.circle.fill")
                                    .font(.largeTitle)
                                VStack(alignment: .leading) {
                                    Text("Harrison")
                                    HStack {
                                        Image(systemName: "video.fill")
                                        Text("Face2Face Video")
                                    }
                                }
                                Spacer()
                                Text("12:03")
                            }
                        }
                    }
                    .scrollContentBackground(.hidden)
                }
                .padding()
                .navigationTitle("Face2Face")
                .toolbar {
                    ToolbarItem(placement: .navigationBarLeading) {
                        Button {
                        } label: {
                            Text("Edit")
                        }
                    }
                }
            }
        }
    }
}
```
  3. Add a Button to Make an Outgoing Call
    In the code above, we attach the join-call action, viewModel.joinCall(callType: .default, callId: callId), to the button with the camera icon to initiate the call.
Outgoing call overview

An outgoing call begins with a call intent from the user, using the call recipient's information. The video SDK's CallViewModel then creates a start-call event with the joinCall method and publishes it to the system. The system delivers the call event to the call object, which exposes information about the call and its participants.

```swift
.onTapGesture {
    Task {
        guard viewModel.call == nil else { return }
        viewModel.joinCall(callType: .default, callId: callId)
    }
}
```
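Because CallViewModel publishes its state, your UI can react to the joining flow without extra wiring. The sketch below is hedged: the callingState property and its case names reflect the SwiftUI SDK at the time of writing and are assumptions to verify against the version of StreamVideoSwiftUI you have installed.

```swift
import SwiftUI
import StreamVideoSwiftUI

// Illustrative sketch: a label that follows the view model's published
// calling state. Case names are assumptions; check them against your SDK version.
struct CallStatusLabel: View {
    @ObservedObject var viewModel: CallViewModel

    var body: some View {
        switch viewModel.callingState {
        case .inCall:
            Text("On a Face2Face call")
        case .outgoing:
            Text("Calling…")
        default:
            Text("No active call")
        }
    }
}
```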

Step 5: Run the App on Two iPhones

To establish or test the call on two iOS devices, you need to enable Developer Mode on each device.

  1. Launch the Settings app on your iOS device. This demo uses two iPhones.
  2. Tap Privacy & Security.
  3. Scroll to the bottom to find Security. Then, select Developer Mode and toggle the switch on.
Enable developer mode in settings on iPhone

Now, connect your device and select its name from the available device options in Xcode, then run the app on it. Build and run the app on the second device as well. Enter the same call ID on both devices to establish the call; for example, add abcd as the call ID on both devices and initiate the call. To avoid typing out the call ID each time, you can quickly fill the recipient text field by tapping a recipient under the Called Recently section.

Bravo! You now have a feature-rich, fully functional iOS/SwiftUI audio/video calling app that supports group calls, fullscreen mode, and picture-in-picture (PiP).

Two people in a video call

Test the App Using an iPhone and Our Web App

As the saying goes, seeing is believing. Let's take the app you built in this tutorial, run it on an iPhone, and let other call participants join from the web. The video SDK provides a seamless testing experience by allowing you to preview your app across iOS devices and the web. Follow the steps below to add multiple participants to a call with the app you just built.

  1. First, copy the user credentials from the video call tutorial in our documentation. Run the app in Xcode on your iPhone.
  2. To join other call participants from the web, click the Join Call button below the user credentials in step 1 above.
User credentials

You can join as many call participants as you want using the web app. This works because the call ID is the same when you run the app on an iPhone and on the web.


How to Start a New Group Call

During an active call, you can add more participants. To start a group call for the purposes of this demo, tap the back button < Face2Face in the leading toolbar and pick the same call recipient you initially called. Alternatively, you can extend the app so that tapping the person.2 icon at the top right of the screen invites several people to join the action.

Where Do I Go From Here?

Great work! Together, we just created our first FaceTime clone using SwiftUI! To learn more about the SDK or some of the other applications you can build with Stream, consider checking out some of our other resources.
