We have a universal iOS/tvOS app that also supports iOS App on Mac.
In our AVPlayer-based video player we support AirPlay with AVRouteDetector and AVRoutePickerView. We play HLS streams.
When we try to AirPlay from an iOS device to an Apple TV or a Mac that has our app installed, it doesn't work. The receiver is marked as active in the route picker UI but the video doesn't show up on the receiver and playback stops.
When our app isn't installed on the receiver device, everything works as expected.
Has anyone encountered the same issue? Are there any solutions available for this?
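For reference, here's a minimal sketch of the kind of setup described above: AVRouteDetector to report route availability, AVRoutePickerView for the picker button, and external playback enabled on the player. The class name and the HLS URL are illustrative, not our actual code.

import AVFoundation
import AVKit
import UIKit

final class PlayerViewController: UIViewController {
    // Illustrative HLS URL, not the real stream.
    private let streamURL = URL(string: "https://example.com/stream/master.m3u8")!
    private let player = AVPlayer()
    private let routeDetector = AVRouteDetector()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Let AVPlayer hand the HLS stream off to the AirPlay receiver.
        player.allowsExternalPlayback = true
        player.replaceCurrentItem(with: AVPlayerItem(url: streamURL))

        // Report whether any AirPlay routes are available (e.g. to show/hide UI).
        routeDetector.isRouteDetectionEnabled = true

        // Standard system AirPlay picker button.
        let picker = AVRoutePickerView()
        picker.prioritizesVideoDevices = true
        view.addSubview(picker)

        player.play()
    }
}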
AirPlay 2
AirPlay 2 allows users to wirelessly send content from any Apple device to a device enabled with AirPlay.
Posts under AirPlay 2 tag
13 Posts
Context: I'm not an app developer, but I'm doing some research to gain a high-level understanding of an app that I want some developers to build for me.
Basically I need a navigation app built (integrated with Google Maps) that works pretty much like Google Maps. This app will connect to and stream live navigation data to a car HUD (heads-up display) device using Wi-Fi Direct (to support high-bandwidth streaming). The purpose of streaming from the mobile app to the HUD is so that the driver can see the live map without having to look at their phone.
This leads me to my QUESTION: this functionality (streaming from app to HUD) is similar to what AirPlay does, and I've read that Apple rejects apps that replicate AirPlay's screen-mirroring function. I've also read that, to work around this, my app should limit the information that is sent to and displayed by the HUD device (basically, it shouldn't mirror the whole screen). So, would Apple still reject my app if it only streamed the live map to the HUD device and left out all the other information displayed in the app (ETA, turn signals, distances, etc.), thus refraining from streaming the entire screen?
Description:
An HLS VOD stream contains several audio tracks that are marked with the same LANGUAGE tag but different NAME tags.
https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/bipbop_16x9_variant.m3u8
e.g.
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="bipbop_audio",LANGUAGE="eng",NAME="BipBop Audio 1",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="bipbop_audio",LANGUAGE="eng",NAME="BipBop Audio 2",AUTOSELECT=NO,DEFAULT=NO,URI="alternate_audio_aac/prog_index.m3u8"
You set up AirPlay from, e.g., an iPhone, iPad, or Mac to an Apple TV or a Mac.
Expected behavior:
The audio track dropdown in AVPlayer and QuickTime shows both the LANGUAGE and NAME information on the AirPlay receiver, just as it does on the AirPlay sender; the user interface is consistent between playing the stream locally and playing it over AirPlay.
Current status:
The player UI on the AirPlay receiver shows only the LANGUAGE information.
Question:
Do you have an idea whether this is a missing feature of AirPlay itself or a bug?
Background:
We'd like to offer an additional audio track with enhanced audio characteristics for better comprehension of spoken words: "Klare Sprache" ("clear speech").
Technically, "Klare Sprache" works by using an AI-based algorithm that separates speech from other audio elements in the broadcast. This algorithm enhances the clarity of the dialogue by amplifying the speech and diminishing the volume of background sounds like music or environmental noise. The technology was introduced by ARD and ZDF in Germany and is available on select programs, primarily via HD broadcasts and digital platforms like HbbTV.
Users can enable this feature directly from their television's audio settings, where it may be labeled as "deu (qks)" or "Klare Sprache" depending on the device. The feature is available on a growing number of channels and is part of a broader effort to make television more accessible to viewers with hearing difficulties.
It can be correctly signaled in HLS via:
e.g.
https://ccavmedia-amd.akamaized.net/test/bento4multicodec/airplay1.m3u8
# Audio
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="stereo-aac",LANGUAGE="de",NAME="Deutsch",DEFAULT=YES,AUTOSELECT=YES,CHANNELS="2",URI="ST.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="stereo-aac",LANGUAGE="de",NAME="Deutsch (Klare Sprache)",DEFAULT=NO,AUTOSELECT=YES,CHARACTERISTICS="public.accessibility.enhances-speech-intelligibility",CHANNELS="2",URI="KS.m3u8"
Still, the problem remains that over an AirPlay stream you don't get this extra information, only the LANGUAGE tag.
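For what it's worth, on the sender side the NAME and CHARACTERISTICS attributes are exposed through AVFoundation's media selection API. A minimal sketch (assuming iOS 15+ async asset loading; the characteristic string is the one from the manifest above, and the helper name is made up):

import AVFoundation

// Hypothetical helper: list the audio renditions of an HLS item and select the
// "Klare Sprache" rendition by its accessibility characteristic, if present.
func selectKlareSprache(for playerItem: AVPlayerItem) async throws {
    guard let group = try await playerItem.asset.loadMediaSelectionGroup(for: .audible) else {
        return
    }

    let speechCharacteristic = AVMediaCharacteristic(
        rawValue: "public.accessibility.enhances-speech-intelligibility")

    for option in group.options {
        // displayName reflects the NAME attribute of #EXT-X-MEDIA.
        print(option.displayName, option.extendedLanguageTag ?? "-")
    }

    if let klareSprache = group.options.first(where: {
        $0.hasMediaCharacteristic(speechCharacteristic)
    }) {
        playerItem.select(klareSprache, in: group)
    }
}

This shows what the sender can see; whether the receiver UI surfaces NAME over AirPlay is exactly the open question above.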
Hello,
We have a TV app, based on react-native-video, which was tweaked to suit our requirements.
There is a problem with AirPlay streaming.
An asset can be streamed to an Apple TV just fine, but when we try to stream it to any third-party TV over AirPlay and choose a language different from the default in the manifest, there is a problem.
Seeking freezes the picture and nothing happens. The funny thing is that if we seek back to within roughly ±20 seconds of the starting point, the video resumes.
The one obvious difference from Apple TV we were able to identify is that when seeking with an Apple TV an isPlaybackBufferEmpty event is observed, while with third-party TVs only isPlaybackLikelyToKeepUp events fire.
Is there perhaps a solution to this issue? Or at least a way to forcefully empty the buffer when a seek is performed?
Thank you
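There is no documented API for flushing AVPlayer's buffer directly. One workaround sometimes tried, purely as a sketch and under the assumption that rebuilding the item is acceptable in your player, is to recreate the AVPlayerItem at the seek target when a seek over AirPlay stalls, re-applying the audio selection on the fresh item:

import AVFoundation

// Hypothetical recovery path: if a seek over AirPlay stalls, rebuild the player
// item at the target time instead of waiting for the buffer to drain.
// `asset` and `audioSelection` are whatever the player was already using.
func recoverFromStalledSeek(player: AVPlayer,
                            asset: AVURLAsset,
                            audioSelection: (group: AVMediaSelectionGroup, option: AVMediaSelectionOption)?,
                            targetTime: CMTime) {
    let freshItem = AVPlayerItem(asset: asset)

    // Media selections don't carry over to a new item, so re-apply the
    // non-default audio language explicitly.
    if let selection = audioSelection {
        freshItem.select(selection.option, in: selection.group)
    }

    player.replaceCurrentItem(with: freshItem)
    freshItem.seek(to: targetTime, toleranceBefore: .zero, toleranceAfter: .zero) { finished in
        if finished {
            player.play()
        }
    }
}

Whether this actually avoids the freeze on third-party receivers would have to be tested; it is not a documented fix.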
The Photos app on iOS is capable of playing Live Photos and showing images on another screen. My SwiftUI app is able to AirPlay videos and I can make the button appear or not using AVPlayer's isExternalPlaybackActive boolean. However, that boolean is not available for images and Live Photos.
I've tried using an AVRoutePickerView(), which gives me a button for the user to start/stop AirPlay, but I have not found a way to associate it with SwiftUI's Image or my LivePhotoView:
import SwiftUI
import Photos        // PHLivePhoto
import PhotosUI      // PHLivePhotoView
import AVKit         // AVRoutePickerView
import AVFoundation  // AVAudioSession

struct LivePhotoView: UIViewRepresentable {
    var livephoto: PHLivePhoto
    @Binding var playback: Bool
    @AppStorage("playAudio") var playAudio = false

    func makeUIView(context: Context) -> PHLivePhotoView {
        let view = PHLivePhotoView()
        // Configure the audio session up front so Live Photo audio behaves as expected.
        try? AVAudioSession.sharedInstance().setCategory(playAudio ? .playback : .soloAmbient)
        view.contentMode = .scaleAspectFit
        return view
    }

    func updateUIView(_ lpView: PHLivePhotoView, context: Context) {
        lpView.livePhoto = livephoto
        if playback {
            try? AVAudioSession.sharedInstance().setCategory(playAudio ? .playback : .soloAmbient)
            try? AVAudioSession.sharedInstance().setActive(true)
            lpView.isMuted = false
            lpView.startPlayback(with: .full)
        } else {
            lpView.stopPlayback()
        }
    }
}

struct RouteButtonView: UIViewRepresentable {
    func makeUIView(context: Context) -> AVRoutePickerView {
        // System AirPlay picker button; there is no obvious way to tie it to
        // an Image or to the PHLivePhotoView above.
        let routePickerView = AVRoutePickerView()
        routePickerView.tintColor = .gray
        routePickerView.prioritizesVideoDevices = true
        return routePickerView
    }

    func updateUIView(_ uiView: AVRoutePickerView, context: Context) {
        // No update needed.
    }
}
Am I missing something? If this is a system-internal API that's only available to the Photos app, why?
Mike
I'm trying to make screen mirroring work with my own DNS server by creating the static PTR, SRV, and TXT records.
I create all the necessary b._dns-sd._udp, lb._dns-sd._udp, and _services._dns-sd._udp records and the corresponding _airplay._tcp and _raop._tcp records.
That seems to work, as I can see the records with dns-sd -Z _airplay._tcp and dns-sd -Z _raop._tcp, or in the Discovery Browser app.
But for some reason I cannot see my displays in the Screen Mirroring menu.
If anyone has any suggestions, I am all ears.
Hi guys, does anyone know the FPS of screen recording and AirPlay on Vision Pro? By AirPlay I mean mirroring the Vision Pro view to a MacBook/iPhone/iPad. Also, is there any way to record the screen at the native FPS of Vision Pro (i.e., 90)?
Hi,
I am using AVPlayer and want to stream over AirPlay with my encrypted HLS (m3u8) URL. Here is my code snippet:
let contentUrl = URL(string: videoUrl)
let headers = ["token": token]
// Note: "AVURLAssetHTTPHeaderFieldsKey" is not a documented AVFoundation option key.
let asset = AVURLAsset(url: contentUrl!,
                       options: ["AVURLAssetHTTPHeaderFieldsKey": headers])
let playerItem = AVPlayerItem(asset: asset)
self.avPlayer?.replaceCurrentItem(with: playerItem)
self.avPlayer?.play()
AirPlay is not working on my TV when I start the stream; my encrypted URL won't play. Is there any way to stream over AirPlay with an encrypted URL? I'm stuck here.
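One thing to keep in mind: during AirPlay video playback the receiver generally fetches the HLS URL itself, so request headers attached on the sender side may never reach your CDN. A sketch of one alternative sometimes tried, under the assumption (and it is only an assumption) that your backend also accepts the token as a query parameter:

import Foundation

// Hypothetical alternative: embed the token in the URL instead of a header,
// so the AirPlay receiver can fetch the stream with it. Assumes the CDN
// accepts a "token" query parameter, which may not be true for your setup.
func streamURLWithToken(_ videoUrl: String, token: String) -> URL? {
    guard var components = URLComponents(string: videoUrl) else { return nil }
    var queryItems = components.queryItems ?? []
    queryItems.append(URLQueryItem(name: "token", value: token))
    components.queryItems = queryItems
    return components.url
}

// Usage (illustrative):
// if let url = streamURLWithToken(videoUrl, token: token) {
//     avPlayer?.replaceCurrentItem(with: AVPlayerItem(url: url))
//     avPlayer?.play()
// }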
I have an AVPlayer with an encrypted m3u8 URL, and this is my code snippet:
let contentUrl = URL(string: videoUrl)
let headers = ["token": token]
let asset = AVURLAsset(url: contentUrl!,
                       options: ["AVURLAssetHTTPHeaderFieldsKey": headers])
let playerItem = AVPlayerItem(asset: asset)
self.avPlayer?.replaceCurrentItem(with: playerItem)
self.avPlayer?.play()
When I try to stream over AirPlay, the content is not displayed; AirPlay is not working with an encrypted URL. How can I stream it? Is there any way?
We are facing a weird behaviour when implementing the AirPlay functionality of our iOS app.
When we test our app with Apple TV devices, everything works fine. On some smart TVs with a specific AirPlay receiver version (more details below), the stream gets stuck in the buffering state immediately after switching to AirPlay mode. On other smart TVs, with a different AirPlay receiver version, everything works as expected.
The interesting part is that other free or DRM-protected streams work fine on all devices.
Smart TVs on which AirPlay works fine:
AirPlay Version -> 25.06 (19.9.9)
Smart TVs on which AirPlay gets stuck in the buffering state:
AirPlayReceiverSDKVersion -> 3.3.0.54
AirPlayReceiverAppVersion -> 53.122.0
You can reproduce this issue using the following stream url:
https://tr.vod.cdn.cosmotetvott.gr/v1/310/668/1674288197219/1674288197219.ism/.m3u8?qual=a&ios=1&hdnts=st=1713194669~exp=1713237899~acl=*/310/668/1674288197219/1674288197219.ism/*~id=cab757e3-9922-48a5-988b-3a4f5da368b6~data=de9bbd0100a8926c0311b7dbe5389f7d91e94a199d73b6dc75ea46a4579769d7~hmac=77b648539b8f3a823a7d398d69e5dc7060632c29
If this link expires, let me know and I'll send a new one for testing.
Could you please give us any specific suggestions as to what might be causing this issue with these particular streams?
Hello everyone, has anyone already managed to successfully deploy an "AirPlay discovery broker"?
LG documented the process with a simple diagram and referred me to the AirPlay APIs without further details...
If anyone has already used this type of architecture, I would be happy to see an example!
Hi,
Subtitles are not rendered when content is streamed via AirPlay. Can you let us know what is expected here in order to render subtitles when content is streamed via AirPlay?
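For context, here is a minimal sketch of how subtitles are typically selected on the sender with AVFoundation's legible media selection group (assuming the subtitles are delivered as renditions in the HLS manifest; the helper name and language handling are illustrative). Whether that selection is honored on the AirPlay receiver is exactly the question:

import AVFoundation

// Hypothetical helper: select the first subtitle (legible) option matching the
// given language on the sender's player item.
func enableSubtitles(on playerItem: AVPlayerItem, languageIdentifier: String) async throws {
    guard let group = try await playerItem.asset.loadMediaSelectionGroup(for: .legible) else {
        return
    }
    let matching = AVMediaSelectionGroup.mediaSelectionOptions(
        from: group.options,
        with: Locale(identifier: languageIdentifier))
    if let option = matching.first {
        playerItem.select(option, in: group)
    }
}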