How does Final Cut Camera synchronize videos

I have an application that enables recording video from multiple iPhones through an iPad. It uses Multipeer Connectivity for all the device communication. When the user presses record on the iPad, it sends a command to each device in parallel and they start capturing video. But since network latency varies, I cannot guarantee that the recording start and stop times are consistent among all the iPhones. I need the frames to be exactly in sync.
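
For reference, here is a simplified sketch of the "send a command to each device" step over Multipeer Connectivity (the RecordCommand payload is illustrative, not my exact protocol):

import MultipeerConnectivity
import Foundation

// Hypothetical command sent from the iPad to every connected iPhone;
// the type and its fields are illustrative only.
struct RecordCommand: Codable {
    let action: String          // "start" or "stop"
    let issuedAt: TimeInterval  // iPad clock time when the command was sent
}

func broadcastStart(over session: MCSession) throws {
    let command = RecordCommand(action: "start",
                                issuedAt: Date().timeIntervalSince1970)
    let data = try JSONEncoder().encode(command)
    // .reliable guarantees delivery, but says nothing about *when* each
    // peer receives it, which is exactly why the start times drift apart.
    try session.send(data, toPeers: session.connectedPeers, with: .reliable)
}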

I tried using the system clock on each device to synchronize the videos. If all the device system clocks agreed to within one frame at 30 frames per second (about 33 ms), that would be good enough. But when I tested, the clocks varied quite a bit, by multiple seconds, so that won't work.

I ultimately solved the problem by having a countdown timer on the iPad. The user puts the iPad in view of each phone with the countdown. Later, a Python script cuts all the videos at the frame where the countdown reaches 0. But that's more work for the end user and manual work on our end. With a little ML text recognition, this step could be automated.
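
For the ML idea, something like Apple's Vision framework could find the frame where the countdown reads 0. A rough sketch, assuming the countdown is rendered as a plain digit that text recognition can read (the function and the single-character check are illustrative):

import Vision
import CoreVideo
import Foundation

// Returns true if the recognised text in a frame contains "0", i.e. the
// moment the on-screen countdown hits zero. A real version would restrict
// the region of interest and track the countdown across frames.
func frameShowsZero(_ pixelBuffer: CVPixelBuffer) throws -> Bool {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .fast
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try handler.perform([request])
    let strings = (request.results ?? []).compactMap {
        $0.topCandidates(1).first?.string
    }
    return strings.contains { $0.trimmingCharacters(in: .whitespaces) == "0" }
}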

Some people have suggested using a time server and syncing the clocks that way. I haven't tried this yet, and I'm not sure whether it's even possible to run an NTP server on an iPad, or whether NTP can get the clocks to agree within a frame.
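
For what it's worth, a full NTP server might not be necessary: an NTP-style round-trip estimate over the existing Multipeer link may be enough to bring the clock offsets well under a frame on a local network. The arithmetic for one round trip, with illustrative names:

import Foundation

// One round trip of an NTP-style offset estimate over any message channel
// (the existing Multipeer session, a socket, and so on).
struct ClockSample {
    let roundTripTime: TimeInterval
    let offset: TimeInterval    // estimated peerClock - localClock
}

// t0: local time when the request was sent
// t1: the peer's local time when it replied
// t2: local time when the reply arrived
func estimateOffset(t0: TimeInterval, t1: TimeInterval, t2: TimeInterval) -> ClockSample {
    let rtt = t2 - t0
    // Assume the network delay is symmetric, so the peer generated its
    // reply halfway through the round trip.
    let offset = t1 - (t0 + rtt / 2)
    return ClockSample(roundTripTime: rtt, offset: offset)
}

// Taking many samples and keeping the one with the smallest round-trip time
// filters out most of the Wi-Fi latency jitter.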

I tried out Final Cut Camera and it has solved the synchronization problem: each frame is in sync. The phones don't start and stop at exactly the same time, and Final Cut Camera compensates for this by adding black frames to the front and/or back of each video.
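
The padding itself is presumably simple bookkeeping once every clip's start time is known on a common clock. A speculative sketch of that bookkeeping, not Apple's actual implementation:

import Foundation

// Given each clip's capture start time on a shared clock, compute how many
// black frames to prepend so every clip lines up with the earliest one.
func blackFramePadding(startTimes: [TimeInterval], frameRate: Double = 30) -> [Int] {
    guard let earliest = startTimes.min() else { return [] }
    return startTimes.map { start in
        // A clip that started later needs black frames at its head so its
        // first real frame lands in the right place on the shared timeline.
        Int(((start - earliest) * frameRate).rounded())
    }
}

// Example: clips starting 0 ms, 40 ms and 100 ms after the earliest one get
// 0, 1 and 3 black frames respectively at 30 fps.
// blackFramePadding(startTimes: [10.000, 10.040, 10.100])   // [0, 1, 3]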

I've searched online and other people have the same problem. I'd love to know how Apple was able to solve the synchronization issue when recording video from multiple iPhones via an iPad over what I assume is Multipeer Connectivity.

This is not an easy problem to solve. Syncing clocks across a distributed system is an ongoing source of active academic research. The Fount of All Knowledge™ has some useful background.

I ultimately solved the problem by having a countdown timer on the iPad. The user puts the iPad in view of each phone with the countdown.

Right. There are good reasons why this works:

  • The signal latency is the same for all iPhones, because the signal travels at the speed of light in air.

  • That speed is fast enough that the latency won't vary significantly when you pan the iPhone from the iPad to the real subject (there's some quick arithmetic after this list).

  • The latency of the iOS camera processing system doesn’t matter. You don’t care how long it takes for the frame to traverse the camera, because you sync things based on the frame’s content.
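
To put rough numbers on that, assuming an arbitrary 10 m from iPad to iPhone:

// Back-of-the-envelope check on why propagation delay is negligible.
let speedOfLight = 299_792_458.0                  // metres per second
let distance = 10.0                               // metres, iPad to iPhone
let propagationDelay = distance / speedOfLight    // ~33 nanoseconds
let frameDuration = 1.0 / 30.0                    // ~33 milliseconds
// The delay is about a millionth of a frame, so it simply doesn't matter.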

Honestly, this sounds like a great solution to me, once you automate the sync step. There’s a reason the movie industry still uses clapperboards (-:


In terms of how to do this with networking code, I’d like to clarify one point. Do you only care about syncing the iPhones? Or is the iPad also recording video?

Share and Enjoy

Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"
