Difference in Siri/Simulator Remote

Along with the issue I filed on the video shuttle problem, I have run into an issue where the Siri Remote and the simulated remote don't behave the same. I have an AVPlayerViewController as a child of another view controller; the parent also presents a table with a movie jump list, which could be chapters or, in our case, a table built from an uploaded JSON file. The table is not initially shown and sits on top of the container view for the video.


The intent is to use a UILongPressGestureRecognizer, attached to the AVPlayerViewController's view, to unhide and focus the jump table, and another UILongPressGestureRecognizer, attached to the table's containing view, to re-hide it. This works in the simulator, but not on the device. If I change the code to initially show the table, the long press will hide it, but I can't get it back. I then paired the Siri Remote to my computer and found that, when driving the simulator with it, the behavior matched the device. So the difference appears to come down to the Siri Remote: a long press works to dismiss the table, but not to show it, unless done on the simulated remote. The problem is not the remote itself, but somewhere in the interpretation of the command.
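For anyone trying to reproduce the setup, here is a minimal sketch of what I mean. Names like ContainerViewController, playerVC, and jumpTableContainer are placeholders, not our actual code, and this is written against the tvOS 9/10-era API names.

import UIKit
import AVKit

class ContainerViewController: UIViewController {

    let playerVC = AVPlayerViewController()
    let jumpTableContainer = UIView()   // overlay holding the jump table; hidden at first

    override func viewDidLoad() {
        super.viewDidLoad()

        // Embed the player as a child view controller.
        addChildViewController(playerVC)
        view.addSubview(playerVC.view)
        playerVC.view.frame = view.bounds
        playerVC.didMove(toParentViewController: self)

        // The jump table overlay sits on top of the video and starts hidden.
        jumpTableContainer.isHidden = true
        view.addSubview(jumpTableContainer)

        // Long press on the player's view should reveal the jump table...
        let showPress = UILongPressGestureRecognizer(target: self,
                                                     action: #selector(showJumpTable(_:)))
        playerVC.view.addGestureRecognizer(showPress)

        // ...and a long press on the table's containing view should hide it again.
        let hidePress = UILongPressGestureRecognizer(target: self,
                                                     action: #selector(hideJumpTable(_:)))
        jumpTableContainer.addGestureRecognizer(hidePress)
    }

    @objc func showJumpTable(_ gesture: UIGestureRecognizer) {
        guard gesture.state == .began else { return }
        jumpTableContainer.isHidden = false
        setNeedsFocusUpdate()
    }

    @objc func hideJumpTable(_ gesture: UIGestureRecognizer) {
        guard gesture.state == .began else { return }
        jumpTableContainer.isHidden = true
        setNeedsFocusUpdate()
    }
}

With the Siri Remote, only the hide path ever fires; with the simulated remote, both do.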


I suppose I could open a view with the jump table, then go to a new scene with the player, using the Menu button to return to the jump-table scene. However, that defeats the whole purpose of letting the user immediately see the result of a selection and confirm that it is really where he or she wanted to go. The table only occupies part of the screen and is semi-transparent.


Is anyone using chapter-list navigation or something similar? How do you handle this? There has to be some way to bring up the list, and subclassing AVPlayerViewController is verboten.

25467847: Gestures don't work with Siri Remote in AVPlayerViewController


Filed the above. The tvOS videos even show an example of doing this. We tried their approach, and it still didn't work.


It appears that Apple had tried to cover all eventualities with this controller, including chapter lists, etc. However, one thing I've learned is that no matter how you think the user is going to use your tools, they will want something different. In our case, chapter lists are not adequate. Because of the nature of our videos (educational courses), we prefer a two- or three-level outline instead, to make particular subject topics easy to locate. In fact, this is a feature I would like to see Apple adopt in their WWDC videos. When you need a quick look or review, it would be nice to be able to go directly to the topic or subtopic of interest in the video.

We did manage to get the UITapGestureRecognizer, described in the Apple TV developer movie, to work when the movie was playing, but not when it was paused. My (correct) theory was that the controls are in their own focused view, preventing the gesture from working while paused. Gestures do not participate in the responder or focus chains; they attach to and operate on their assigned view only.
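For reference, this is roughly what that straightforward approach looks like, added inside the hypothetical ContainerViewController from the earlier sketch (again, these are placeholder names, not our exact code):

// Attach a select-button tap to the player's view, per the session video.
func addTapRecognizer() {
    let tap = UITapGestureRecognizer(target: self, action: #selector(jumpTableTapped(_:)))
    tap.allowedPressTypes = [NSNumber(value: UIPressType.select.rawValue)]
    playerVC.view.addGestureRecognizer(tap)
}

// In our testing this fired only while the movie was playing; while paused,
// the transport controls' own focused view swallowed the press.
@objc func jumpTableTapped(_ gesture: UITapGestureRecognizer) {
    jumpTableContainer.isHidden = false
    setNeedsFocusUpdate()
}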


We never got the long-press gestures working with the Siri Remote, even though they work with the simulator. But we did find a way to make the tap gesture work. This may not be kosher and may not pass muster with the review team, but here is what we did. We looked at the subviews of AVPlayerViewController's view and realized that there was a child view controller for the shuttle/trick-play controls. So we accessed that child view controller, removed its existing gesture recognizer (which apparently was a menu/select tap that restarted play), then added another tap gesture to its view. Now the tap gesture brings up our jump outline view whether playing or paused.


This actually uses only public interfaces, i.e. "playerViewController.childViewControllers.first.view", where we attach the additional gesture recognizer, so we are hoping it will get through review. We added this information to the above bug report.
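As a rough sketch (not our exact code), the workaround looks something like this. It leans on the undocumented assumption that the first child view controller hosts the transport/trick-play controls, which could change in any tvOS update; playerVC and jumpTableTapped(_:) are the placeholder names from the earlier sketches.

// Reach into AVPlayerViewController's first child view controller, which at
// the time hosted the shuttle/trick-play controls (undocumented, may change).
if let controlsView = playerVC.childViewControllers.first?.view {
    // Remove the existing recognizers (one appeared to be a menu/select tap
    // that restarted playback) so they stop swallowing the press while paused.
    controlsView.gestureRecognizers?.forEach { controlsView.removeGestureRecognizer($0) }

    // Attach our own tap so the jump outline can be shown whether the movie
    // is playing or paused.
    let tap = UITapGestureRecognizer(target: self, action: #selector(jumpTableTapped(_:)))
    controlsView.addGestureRecognizer(tap)
}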


Additional info for anyone else trying to implement something like this:


When your main view controller gets the gesture callback to show the overlay view, you need to call setNeedsFocusUpdate(). In preferredFocusedView you need to conditionally return the overlay view when it is not hidden, so it will get focus and let you navigate the table.
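A sketch of that focus handoff, reusing the hypothetical jumpTableContainer and playerVC names from above. preferredFocusedView is the tvOS 9/10-era property; newer SDKs use preferredFocusEnvironments instead.

override var preferredFocusedView: UIView? {
    // Hand focus to the overlay only while it is visible; otherwise leave it
    // with the player so the transport controls keep working.
    return jumpTableContainer.isHidden ? playerVC.view : jumpTableContainer
}

// In the gesture callback that reveals the overlay, ask the focus engine to
// re-evaluate so the table can actually be navigated:
//
//     jumpTableContainer.isHidden = false
//     setNeedsFocusUpdate()
//     updateFocusIfNeeded()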
