The Photos app on iOS is capable of playing Live Photos and showing images on another screen. My SwiftUI app is able to AirPlay videos and I can make the button appear or not using AVPlayer's isExternalPlaybackActive boolean. However, that boolean is not available for images and Live Photos.
I've tried using an AVRoutePickerView(), which gives me a button for the user to start/stop AirPlay, but I have not found a way to associate it with SwiftUI's Image or my LivePhotoView:
import SwiftUI
import PhotosUI
import AVFAudio

struct LivePhotoView: UIViewRepresentable {
    var livephoto: PHLivePhoto
    @Binding var playback: Bool
    @AppStorage("playAudio") var playAudio = UserDefaults.standard.bool(forKey: "playAudio")

    func makeUIView(context: Context) -> PHLivePhotoView {
        let view = PHLivePhotoView()
        try? AVAudioSession.sharedInstance().setCategory(playAudio ? .playback : .soloAmbient)
        view.contentMode = .scaleAspectFit
        return view
    }

    func updateUIView(_ lpView: PHLivePhotoView, context: Context) {
        lpView.livePhoto = livephoto
        if playback {
            try? AVAudioSession.sharedInstance().setCategory(playAudio ? .playback : .soloAmbient)
            try? AVAudioSession.sharedInstance().setActive(true)
            lpView.isMuted = false
            lpView.startPlayback(with: .full)
        } else {
            lpView.stopPlayback()
        }
    }
}
import AVKit

struct RouteButtonView: UIViewRepresentable {
    func makeUIView(context: Context) -> AVRoutePickerView {
        let routePickerView = AVRoutePickerView()
        routePickerView.tintColor = .gray
        routePickerView.prioritizesVideoDevices = true
        return routePickerView
    }

    func updateUIView(_ uiView: AVRoutePickerView, context: Context) {
        // No update needed
    }
}
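For still images, one workaround that needs no private API is to render content on the external screen the system attaches during AirPlay screen mirroring. A minimal sketch, assuming the iOS 13+ scene lifecycle and an Info.plist entry registering the delegate for the external-display session role; `ExternalSceneDelegate` and the "photo" asset name are illustrative, not part of any Apple sample:

```swift
import UIKit

// Sketch: when the user starts AirPlay screen mirroring, the system can hand
// the app a second UIWindowScene. A window placed on that scene can host a
// UIImageView (or a PHLivePhotoView) showing app-chosen content instead of
// plain mirroring. All names here are hypothetical.
final class ExternalSceneDelegate: NSObject, UIWindowSceneDelegate {
    var window: UIWindow?

    func scene(_ scene: UIScene,
               willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        guard let windowScene = scene as? UIWindowScene else { return }
        let imageView = UIImageView(image: UIImage(named: "photo")) // placeholder asset
        imageView.contentMode = .scaleAspectFit
        let controller = UIViewController()
        controller.view = imageView
        let externalWindow = UIWindow(windowScene: windowScene)
        externalWindow.rootViewController = controller
        externalWindow.isHidden = false
        window = externalWindow
    }
}
```

This gives the app its own canvas on the external screen rather than associating the route picker with a view, which may or may not match what Photos does internally.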
Am I missing something? If this is a system-internal API available only to the Photos app, why?
Mike
AirPlay 2
AirPlay 2 allows users to wirelessly send content from any Apple device to a device enabled with AirPlay.
Posts under AirPlay 2 tag
16 Posts
I'm trying to make screen mirroring work with my own DNS server by creating static PTR, SRV, and TXT records.
I created all the necessary records:
b._dns-sd._udp, lb._dns-sd._udp, _services._dns-sd._udp, and the corresponding _airplay._tcp and _raop._tcp records.
That seems to work, as I can see the records in dns-sd -Z _airplay._tcp and dns-sd -Z _raop._tcp, or in the Discovery Browser app.
But for some reason I cannot see my displays in the Screen Mirroring menu.
If anyone has any suggestions, I am all ears.
Hi guys, I would like to ask if anyone knows the FPS of screen recording and AirPlay on Vision Pro. AirPlay here refers to mirroring the Vision Pro view to a MacBook/iPhone/iPad. Also, is there any way to record the screen at the native FPS of Vision Pro (i.e., 90)?
Hi,
I am using AVPlayer and want to stream my encrypted HLS (m3u8) URL over AirPlay. Here is my code snippet:
let contentUrl = URL(string: videoUrl)
let headers = ["token": token]
let asset = AVURLAsset(url: contentUrl!, options: ["AVURLAssetHTTPHeaderFieldsKey": headers])
let playerItem = AVPlayerItem(asset: asset)
self.avPlayer?.replaceCurrentItem(with: playerItem)
self.avPlayer?.play()
AirPlay is not working on my TV when I start the stream; my encrypted URL won't play.
Is there any way to stream over AirPlay with an encrypted URL? I'm stuck here.
I have an avplayer with an encrypted m3u8 url and this is my code snippet
let contentUrl = URL(string: videoUrl)
let headers = ["token": token]
let asset = AVURLAsset(url: contentUrl!, options: ["AVURLAssetHTTPHeaderFieldsKey": headers])
let playerItem = AVPlayerItem(asset: asset)
self.avPlayer?.replaceCurrentItem(with: playerItem)
self.avPlayer?.play()
When I try to stream over AirPlay, the content doesn't display. AirPlay is not working with an encrypted URL. How can I stream it? Is there any way?
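One thing worth checking in cases like these two posts: when content is AirPlayed, the receiver (the TV) typically fetches the manifest and segments itself, so per-request headers attached via the asset options dictionary generally don't reach the server. If the backend can accept the token as a query parameter instead (an assumption about your service), a sketch like this survives the hand-off; the helper name and the "token" parameter name are hypothetical:

```swift
import Foundation

// Hypothetical helper: move the auth token from an HTTP header into the URL
// itself, so it is still present when the AirPlay receiver fetches the
// manifest directly. Existing query items on the URL are preserved.
func urlByAppendingToken(_ token: String, to url: URL) -> URL? {
    guard var components = URLComponents(url: url, resolvingAgainstBaseURL: false) else {
        return nil
    }
    var items = components.queryItems ?? []
    items.append(URLQueryItem(name: "token", value: token))
    components.queryItems = items
    return components.url
}

// Usage sketch:
// let asset = AVURLAsset(url: urlByAppendingToken(token, to: contentUrl)!)
```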
We are facing a weird behaviour when implementing the AirPlay functionality of our iOS app.
When we test our app on Apple TV devices, everything works fine. On some smart TVs with a specific AirPlay receiver version (more details below), the stream gets stuck in the buffering state immediately after switching to AirPlay mode. On other smart TVs, with a different AirPlay receiver version, everything works as expected.
The interesting part is that other free or DRM-protected streams work fine on all devices.
Smart TVs on which AirPlay works fine:
AirPlay Version -> 25.06 (19.9.9)
Smart TVs on which AirPlay gets stuck in the buffering state:
AirPlayReceiverSDKVersion -> 3.3.0.54
AirPlayReceiverAppVersion -> 53.122.0
You can reproduce this issue using the following stream url:
https://tr.vod.cdn.cosmotetvott.gr/v1/310/668/1674288197219/1674288197219.ism/.m3u8?qual=a&ios=1&hdnts=st=1713194669\~exp=1713237899\~acl=\*/310/668/1674288197219/1674288197219.ism/\*\~id=cab757e3-9922-48a5-988b-3a4f5da368b6\~data=de9bbd0100a8926c0311b7dbe5389f7d91e94a199d73b6dc75ea46a4579769d7~hmac=77b648539b8f3a823a7d398d69e5dc7060632c29
If this link expires, notify me to send a new one for testing.
Could you please give us any specific suggestions as to what causes this issue with these particular streams?
Hello everyone, has anyone managed to successfully deploy an "AirPlay discovery broker"?
LG documented the process via a simple diagram and referred me to the AirPlay APIs without further details...
If anyone has already used this type of architecture, I would be happy to see an example!
Hi,
Subtitles are not rendered when content is streamed via AirPlay. Can you let us know what is expected here in order to have subtitles rendered when content is streamed via AirPlay?
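In case it helps narrow this down: subtitles drawn by the app itself (custom overlay views) are never sent over AirPlay; only legible renditions that are part of the HLS media selection can be rendered by the receiver. A sketch of selecting the first legible track explicitly, assuming the iOS 15+ async loading API:

```swift
import AVFoundation

// Sketch: explicitly select a legible (subtitle) media option on the player
// item. Whether the AirPlay receiver then renders it depends on the stream
// carrying subtitles as an HLS rendition, not as app-drawn overlays.
func enableFirstSubtitleTrack(on item: AVPlayerItem) async throws {
    guard let group = try await item.asset.loadMediaSelectionGroup(for: .legible),
          let option = group.options.first else { return }
    item.select(option, in: group)
}
```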
I want to present content from my iOS app on a display (e.g., a smart TV) via AirPlay. I've searched the Apple documentation and done Google searches, but I can't find any decent examples of how to get started with connecting to a remote display. Everything seems to be either out of date or too small a snippet to be of use (no context as to where the snippet would go in your code).
Can someone please show me how to connect my SwiftUI app to a remote display? It would be greatly appreciated.
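A common starting point, sketched under the assumption that you want app-chosen content (not just mirroring) on the second screen: watch for an external UIScreen and host a SwiftUI view on it. UIScreen.didConnectNotification is deprecated in iOS 16 in favor of the scene-based APIs but still illustrates the flow; the type name here is illustrative:

```swift
import SwiftUI
import UIKit

// Sketch: once the user starts AirPlay screen mirroring from Control Center,
// a second UIScreen appears; a UIWindow on that screen can host any SwiftUI
// view via UIHostingController. `ExternalDisplayController` is hypothetical.
final class ExternalDisplayController {
    private var externalWindow: UIWindow?

    func start() {
        NotificationCenter.default.addObserver(
            forName: UIScreen.didConnectNotification,
            object: nil, queue: .main) { [weak self] note in
                guard let screen = note.object as? UIScreen else { return }
                let window = UIWindow(frame: screen.bounds)
                window.screen = screen
                window.rootViewController = UIHostingController(
                    rootView: Text("External content")) // swap in your own view
                window.isHidden = false
                self?.externalWindow = window
        }
        NotificationCenter.default.addObserver(
            forName: UIScreen.didDisconnectNotification,
            object: nil, queue: .main) { [weak self] _ in
                self?.externalWindow = nil
        }
    }
}
```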
I have an AVPlayerViewController in my app to play custom audio+image or video streamed from an online service. If I set the following, I am able to add info to the nowPlayingInfo dictionary:
avplayerController.updatesNowPlayingInfoCenter = false
This works for the iOS Control Center; it correctly displays my custom album name, artwork, etc. But when using AirPlay for audio, only the track name is displayed while audio plays, not the artwork or the album. However, if I set
avplayerController.player?.allowsExternalPlayback = false
it does correctly display the artwork, title, album, etc. However, this disables the AirPlay button that is built into the player. I would like this button to remain, but I need the artwork to be displayed while AirPlaying. How do I achieve this?
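For reference, this is the shape of the manual route: with updatesNowPlayingInfoCenter = false the app owns the now-playing dictionary, so artwork and album go in alongside the title. Whether a given AirPlay receiver shows artwork for externally played video is ultimately up to the receiver; this sketch only shows the app side (`albumArt` and the function name are placeholders):

```swift
import MediaPlayer
import UIKit

// Sketch: publish title, album, and artwork to the system now-playing info.
// The request handler is called by the system with a suggested size; here it
// simply returns the image unchanged.
func publishNowPlaying(title: String, album: String, albumArt: UIImage) {
    let artwork = MPMediaItemArtwork(boundsSize: albumArt.size) { _ in albumArt }
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: title,
        MPMediaItemPropertyAlbumTitle: album,
        MPMediaItemPropertyArtwork: artwork
    ]
}
```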
In iOS 17 Beta, a new AVSampleBufferVideoRenderer class has been added:
https://developer.apple.com/documentation/avfoundation/avsamplebuffervideorenderer
I'm wondering if this could somehow be used together with AirPlay in order to manually enqueue video sample buffers, just like you already can for AirPlay Audio with AVSampleBufferAudioRenderer (see: https://developer.apple.com/documentation/avfaudio/audio_engine/playing_custom_audio_with_your_own_player).
I want to be able to stream AirPlay Video without HLS.
If I try to add the video renderer to the existing sample project for audio, I get an exception with the message "... video target must be added to the AVSampleBufferVideoRenderer prior to enqueueing sample buffers.", which I guess makes sense. But since there is no documentation on this yet, I can't know how to add a video target, nor what kinds of video targets are supported.
I'm using the following:
mDNSResponder 1790.80.10
Bonjour Conformance Test (BCT) 1.5.2
Linux 6.1.y kernel
I'm testing an AirPlay 2 speaker as part of our self-certification.
When BCT gets to the mDNS tests, mDNSResponder fails the subsequent-conflict test with this message:
ERROR 2023-06-12 10:37:29.398711-0500 _sub_conflict 03570: Device did not complete its probing sequence for a new name after a subsequent conflict arose for its previously acquired name.
BCT then retries three times with each retry failing with the same message.
Am I missing something in my software that interacts with the mDNS daemon? Is this a known issue with the POSIX build of mDNSResponder? What can I do to get this test to pass?
Any help would be appreciated.
Ethan
During video playback with AVPlayer, our iOS and Apple TV apps set the AVPlayerItem externalMetadata. Additionally, the iOS app also sets the default MPNowPlayingInfoCenter nowPlayingInfo. The apps also register for MPRemoteCommands.
What works: The iOS NowPlayingInfo center is properly filled with the metadata, the remote commands work well and on AppleTV, using the AVPlayerViewController, the info tab shows the content metadata.
Problem: when AirPlaying the content to an Apple TV, the stream plays properly but no metadata is displayed on the Apple TV.
I've tried to set nowPlayingInfo on the avPlayerItem or using an MPNowPlayingSession (new with iOS/tvOS 16) but with no luck.
Can someone help me display the playerItem metadata on the AirPlay device?
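For comparison, here is a minimal sketch of building externalMetadata items the way the post describes; the identifiers used and the string values are illustrative:

```swift
import AVFoundation

// Sketch: build AVMetadataItems for AVPlayerItem.externalMetadata, which is
// what AVPlayerViewController's info panel reads on tvOS and during AirPlay.
func makeExternalMetadata(title: String, description: String) -> [AVMetadataItem] {
    func item(_ identifier: AVMetadataIdentifier, _ value: String) -> AVMetadataItem {
        let item = AVMutableMetadataItem()
        item.identifier = identifier
        item.value = value as NSString
        item.extendedLanguageTag = "und" // undetermined language
        return item
    }
    return [
        item(.commonIdentifierTitle, title),
        item(.commonIdentifierDescription, description)
    ]
}

// Usage sketch:
// playerItem.externalMetadata = makeExternalMetadata(title: myTitle, description: myDescription)
```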
When playing HLS/FairPlay VOD content, the playback position sometimes jumps to 0 during the transition between local and external playback (and the other way around). The app does nothing during the transition, apart from responding to AVContentKeySession key requests.
It is not systematic but occurs quite often. When the issue occurs, playback sometimes resumes at the position it was at before the transition, but usually it does not.
Most of the time, when the issue occurs, the iOS app's periodic time observer fires and AVPlayerItem.currentTime() returns the specific value 0.001 s.
Here is an extract of our iOS app logs with the value of AVPlayerItem.currentTime() when the AVPlayer PeriodicTimeObserver is triggered:
📘 [8:21:54.520] Did update playback time: 4512.001719539
📘 [8:21:55.472] Did update playback time: 4512.958677777778
📗 [8:21:55.497] Player external playback active changed to: true
📘 [8:21:57.674] Did update playback time: 4512.001897709
📘 [8:21:57.779] Did update playback time: 4511.974062125
📘 [8:21:57.800] Did update playback time: 4511.995523418
📘 [8:21:57.805] Did update playback time: 4512.001181626
📘 [8:21:58.806] Did update playback time: 4513.001841876
📘 [8:21:59.794] Did update playback time: 4514.001132625
📘 [8:22:00.795] Did update playback time: 4515.001653707
📘 [8:22:01.562] Did update playback time: 4515.766148708
📗 [8:22:01.679] Player external playback active changed to: false
📘 [8:22:01.683] Did update playback time: 0.001
📘 [8:22:01.700] Did update playback time: 4510.0
📘 [8:22:01.737] Did update playback time: 4510.0
📘 [8:22:01.988] Did update playback time: 4509.956132376
📘 [8:22:01.990] Did update playback time: 4509.958216834
📘 [8:22:03.033] Did update playback time: 4511.0015079
📘 [8:22:04.033] Did update playback time: 4512.001688753
📘 [8:22:05.033] Did update playback time: 4513.001998495
📘 [8:22:06.033] Did update playback time: 4514.001205557
📘 [8:22:06.045] Did update playback time: 4514.0325555555555
📗 [8:22:06.080] Player external playback active changed to: true
📘 [8:22:06.800] Did update playback time: 0.0
📘 [8:22:06.814] Did update playback time: 0.0
📘 [8:22:08.168] Did update playback time: 0.002258708
📘 [8:22:08.218] Did update playback time: -0.075460416
📘 [8:22:08.237] Did update playback time: -0.063310916
📘 [8:22:09.298] Did update playback time: 1.001932292
📘 [8:22:10.295] Did update playback time: 2.003054584
📘 [8:22:11.302] Did update playback time: 3.001831125
📘 [8:22:12.301] Did update playback time: 4.001488001
Local -> AirPlay: no issue
AirPlay -> Local: the issue occurs temporarily and the playback approximately returns to its position before the transition
Local -> AirPlay: the issue occurs permanently and the playback continues from the beginning of the stream.
I've filed a feedback for this issue with both iOS device and AppleTV sysdiagnoses: https://feedbackassistant.apple.com/feedback/11990309
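Until this is fixed, one possible client-side mitigation (a sketch with guessed thresholds, not anything Apple recommends) is to remember the last credible position and seek back whenever a transition resets the player to roughly zero:

```swift
import AVFoundation
import Combine

// Hypothetical mitigation: track the last credible playback position and,
// when the external-playback transition resets currentTime to ~0, seek back.
// The 1-second thresholds are guesses tuned to logs like the ones above.
final class PositionGuard {
    private var lastGoodTime: CMTime = .zero
    private var timeObserver: Any?
    private var cancellable: AnyCancellable?

    func attach(to player: AVPlayer) {
        timeObserver = player.addPeriodicTimeObserver(
            forInterval: CMTime(seconds: 1, preferredTimescale: 600),
            queue: .main) { [weak self] time in
                if time.seconds > 1 { self?.lastGoodTime = time }
        }
        cancellable = player.publisher(for: \.isExternalPlaybackActive)
            .removeDuplicates()
            .dropFirst()
            .sink { [weak self] _ in
                guard let self, self.lastGoodTime.seconds > 1 else { return }
                if player.currentTime().seconds < 1 {
                    player.seek(to: self.lastGoodTime)
                }
            }
    }
}
```

This papers over the symptom rather than addressing the underlying transition bug, so it is best kept behind a flag while the feedback is investigated.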
I'm developing a media player for Mac (AppKit, not Catalyst) that plays local and remote content. AirPlay works with AVPlayer (with an AVRoutePickerView assigning the route), but while the AirPlay device (a TV in this case) receives the metadata I set on MPNowPlayingInfoCenter, it doesn't show any album art. I do get the art in the macOS Now Playing menu bar/Control Center applet. It looks like this (Imgur link because I can't get it to upload on the forum): https://i.imgur.com/2JBIYCw.jpg
My code for setting the metadata:
NSImage *artwork = [currentTrack coverImage];
CGSize artworkSize = [artwork size];
MPMediaItemArtwork *mpArtwork = [[MPMediaItemArtwork alloc] initWithBoundsSize:artworkSize
                                                                requestHandler:^NSImage * _Nonnull(CGSize size) {
    return artwork;
}];
[songInfo setObject:mpArtwork forKey:MPMediaItemPropertyArtwork];
I noticed that the handler doesn't resize the image, but at least macOS doesn't seem to care. I tried modifying the code to resize the artwork in the callback, but that doesn't change anything either.
I noticed in the logs that I get a message about a missing entitlement:
2023-01-29 14:00:37.889346-0400 Submariner[42682:9794531] [Entitlements] MSVEntitlementUtilities - Process Submariner PID[42682] - Group: (null) - Entitlement: com.apple.mediaremote.external-artwork-validation - Entitled: NO - Error: (null)
...however, this seems to be a private entitlement, and the only reference I can find to it is in WebKit. Using it makes LaunchServices very angry at me, and I presume it's a red herring.
Hello, I set up an AVRoutePickerView in my app, but when I press it, nothing shows, and I don't know what's wrong with my project. By the way, when I press an MPVolumeView, it does show the system route picker alert. Can anyone help?