Hello, I'm working on an app that consumes multiple HLS streams (each a different camera angle), but I can't get the streams to play in sync. My current procedure works flawlessly when playing back a recorded session (mp4), but live sessions (m3u8) don't even attempt to sync.
Current Procedure
1. Create a clock to synchronize to
// master clock that every player will be slaved to
CMClockRef syncClock;
CMAudioClockCreate(kCFAllocatorDefault, &syncClock);
2. Initialize players
// example HLS stream
NSURL *url = [NSURL URLWithString:@"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"];
NSMutableArray *players = [NSMutableArray array];
// create NUM_STREAMS AVPlayers
for (NSUInteger i = 0; i < NUM_STREAMS; i++) {
    // initialize player with example url, and set master clock to our sync clock
    AVPlayer *player = [AVPlayer playerWithURL:url];
    player.masterClock = syncClock;
    // add the player to the players array
    [players addObject:player];
}
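One thing I'm unsure about is whether the synced start in step 3 needs every AVPlayerItem to already be AVPlayerItemStatusReadyToPlay. In case it's relevant, here is a minimal sketch of how that readiness gating could look via KVO (the observer plumbing, the players property, and startSyncedPlayback are illustrative names, not my actual code):

// Illustrative sketch: observe each item's status and only start once all are ready.
static void *ItemStatusContext = &ItemStatusContext;

- (void)observeReadiness {
    for (AVPlayer *player in self.players) {
        [player.currentItem addObserver:self
                             forKeyPath:@"status"
                                options:NSKeyValueObservingOptionNew
                                context:ItemStatusContext];
    }
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if (context != ItemStatusContext) {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
        return;
    }
    // count the items that have reached ready-to-play
    NSUInteger readyCount = 0;
    for (AVPlayer *player in self.players) {
        if (player.currentItem.status == AVPlayerItemStatusReadyToPlay) {
            readyCount++;
        }
    }
    // once every item is ready, kick off the synced start (step 3 below);
    // a real implementation would also guard against starting twice
    if (readyCount == NUM_STREAMS) {
        [self startSyncedPlayback];
    }
}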
3. Play in sync
for (AVPlayer *player in players) {
    // start every player at the same host time against the shared clock
    [player setRate:1.0 time:kCMTimeInvalid atHostTime:CMClockGetTime(syncClock)];
}
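And, in case it helps to see it spelled out, here's roughly how I'd expect prerolling to fit around the synced start in step 3 (a sketch of what I have in mind rather than my shipping code; the dispatch_group plumbing is an assumption on my part):

// Illustrative sketch: preroll every player first, then issue one synced start.
dispatch_group_t prerollGroup = dispatch_group_create();
for (AVPlayer *player in players) {
    dispatch_group_enter(prerollGroup);
    [player prerollAtRate:1.0 completionHandler:^(BOOL finished) {
        // finished == NO means the preroll was interrupted or could not complete
        dispatch_group_leave(prerollGroup);
    }];
}
// once every player has prerolled, start them all against the same host time
dispatch_group_notify(prerollGroup, dispatch_get_main_queue(), ^{
    CMTime hostTime = CMClockGetTime(syncClock);
    for (AVPlayer *player in players) {
        [player setRate:1.0 time:kCMTimeInvalid atHostTime:hostTime];
    }
});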
If anyone has any suggestions, please let me know! 😀