I have an audio app that can play audio on an AirPlay device.
On non-Apple TV devices, the AirPlay app (on Roku, Samsung, etc.) shows the now playing metadata: title, artist, and album art.
However, on tvOS 18.1, no metadata is shown. The Apple TV device plays the audio, but there is no now playing information shown, nor any other indicators.
Other media apps show the "Now Playing" controls on the upper right of the tvOS home screen.
Can someone point me in the direction of how to solve this issue? I think I am missing something in the tvOS now-playing metadata implementation.
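In case it helps, here is a minimal sketch of the now-playing setup tvOS generally expects before it shows anything on the home screen; the function names and the way player state is passed in are illustrative, not taken from the app above:

import AVFoundation
import MediaPlayer
import UIKit

// Sketch: publish now-playing metadata so tvOS (and AirPlay receivers) can
// show title, artist, and artwork. `title`, `artist`, `image`, `duration`,
// and `position` are assumed to come from your own player state.
func publishNowPlaying(title: String, artist: String, image: UIImage,
                       duration: TimeInterval, position: TimeInterval) {
    let artwork = MPMediaItemArtwork(boundsSize: image.size) { _ in image }
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: title,
        MPMediaItemPropertyArtist: artist,
        MPMediaItemPropertyArtwork: artwork,
        MPMediaItemPropertyPlaybackDuration: duration,
        MPNowPlayingInfoPropertyElapsedPlaybackTime: position,
        MPNowPlayingInfoPropertyPlaybackRate: 1.0
    ]
}

// tvOS generally also wants remote command handlers registered before it
// treats the app as the active "Now Playing" app.
func registerRemoteCommands(for player: AVPlayer) {
    let center = MPRemoteCommandCenter.shared()
    _ = center.playCommand.addTarget { _ in player.play(); return .success }
    _ = center.pauseCommand.addTarget { _ in player.pause(); return .success }
}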
We have a universal iOS/tvOS app that also supports iOS App on Mac.
In our AVPlayer-based video player we support AirPlay with AVRouteDetector and AVRoutePickerView. We play HLS streams.
When we try to AirPlay from an iOS device to an Apple TV or a Mac that has our app installed, it doesn't work. The receiver is marked as active in the route picker UI but the video doesn't show up on the receiver and playback stops.
When our app isn't installed on the receiver device, everything works as expected.
Has anyone encountered the same issue? Any solutions available for this?
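For reference, our route picker setup is essentially the standard one; a simplified sketch (the class and property names here are illustrative, not our production code):

import AVKit
import UIKit

final class PlayerViewController: UIViewController {
    let player = AVPlayer()
    let routeDetector = AVRouteDetector()
    let routePicker = AVRoutePickerView()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Detect whether any AirPlay routes are currently available.
        routeDetector.isRouteDetectionEnabled = true
        // System-provided button that lets the user pick the AirPlay receiver.
        view.addSubview(routePicker)
        // Allow the HLS stream to be sent to the external (AirPlay) device.
        player.allowsExternalPlayback = true
    }
}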
Context: I’m not an app developer, but I’m doing some research in order to gain a high level understanding of an app that I want some developers to build for me.
Basically I need a navigation app built (integrated with Google Maps) that works pretty much like Google Maps. This app will connect to and stream live navigation data to a car HUD (heads-up-display) device using WiFi direct (to facilitate high bandwidth streaming). The purpose of the streaming from the mobile app to the HUD is so that the driver can see the live map without having to look at their phone.
This leads me to my QUESTION: this functionality (streaming from app to HUD) is similar to what AirPlay does, and I’ve read that Apple rejects apps that replicate AirPlay’s screen mirroring function. I’ve also read that, to work around this, my app should limit the information that is sent to and displayed by the HUD device (basically, it shouldn’t mirror the whole screen). So, would Apple still reject my app if it streamed only the live map to the HUD device and left out all the other information displayed in the app (ETA, turn signals, distances, etc.), thus refraining from streaming the entire screen?
I cannot mirror or extend my screen from my Mac mini M2 to my iPad (10th gen). Whenever I click "mirror or extend screen", my Mac's external display shows "no signal", refreshes, and comes back on; meanwhile my iPad locks and the mirroring or extending fails. But I can mirror my iPad screen to the Mac mini M2. Everything was working earlier; suddenly it is not.
I can’t start a Fitness video from my iPhone on my TV since the 18.1 update. AirPlay connects, I see the countdown and then the dots rolling, but nothing happens and I end up with a « can’t read video » message.
I’m using an iPhone 16 Pro Max, an Apple Watch Ultra (1st gen), and an Apple TV HD (A1625 model).
I have almost the same issue with a more recent Apple TV, but on that one it’s just painfully slow before the video starts; it does start eventually (just taking much longer than it used to). That second Apple TV is an A2843 model (Apple TV 4K, 3rd gen).
Has anyone else had this issue?
Description:
An HLS VOD stream contains several audio tracks marked with the same LANGUAGE tag but different NAME tags.
https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/bipbop_16x9_variant.m3u8
e.g.
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="bipbop_audio",LANGUAGE="eng",NAME="BipBop Audio 1",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="bipbop_audio",LANGUAGE="eng",NAME="BipBop Audio 2",AUTOSELECT=NO,DEFAULT=NO,URI="alternate_audio_aac/prog_index.m3u8"
You set up AirPlay from, e.g., an iPhone, iPad, or Mac to an Apple TV or Mac.
Expected behavior:
In AVPlayer and QuickTime, the audio track dropdown shows the LANGUAGE and NAME information on the AirPlay receiver just as it does on the AirPlay sender; the user interface is consistent between playing back a local stream and an AirPlay stream.
Current status:
The player UI on the AirPlay receiver shows only the LANGUAGE tag information.
Question:
=> Do you have an idea whether this is a missing feature of AirPlay itself or a bug?
Background:
We'd like to offer an additional audio track with enhanced audio characteristics for better intelligibility of spoken words: "Klare Sprache".
Technically, "Klare Sprache" works by using an AI-based algorithm that separates speech from other audio elements in the broadcast. This algorithm enhances the clarity of the dialogue by amplifying the speech and diminishing the volume of background sounds like music or environmental noise. The technology was introduced by ARD and ZDF in Germany and is available on select programs, primarily via HD broadcasts and digital platforms like HbbTV.
Users can enable this feature directly from their television's audio settings, where it may be labeled as "deu (qks)" or "Klare Sprache" depending on the device. The feature is available on a growing number of channels and is part of a broader effort to make television more accessible to viewers with hearing difficulties.
It can be correctly signaled in HLS via:
e.g.
https://ccavmedia-amd.akamaized.net/test/bento4multicodec/airplay1.m3u8
# Audio
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="stereo-aac",LANGUAGE="de",NAME="Deutsch",DEFAULT=YES,AUTOSELECT=YES,CHANNELS="2",URI="ST.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="stereo-aac",LANGUAGE="de",NAME="Deutsch (Klare Sprache)",DEFAULT=NO,AUTOSELECT=YES,CHARACTERISTICS="public.accessibility.enhances-speech-intelligibility",CHANNELS="2",URI="KS.m3u8"
Still, the problem remains that with an AirPlay stream you don't get this extra information, only the LANGUAGE tag.
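For completeness, this is roughly how the sender-side information can be inspected; a minimal sketch using the async loading API (the raw characteristic string is the one from the manifest above):

import AVFoundation

// Sketch: list the NAME / LANGUAGE info the sender sees for the audible group.
func dumpAudioOptions(of asset: AVURLAsset) async throws {
    guard let group = try await asset.loadMediaSelectionGroup(for: .audible) else { return }
    let klareSprache = AVMediaCharacteristic(rawValue: "public.accessibility.enhances-speech-intelligibility")
    for option in group.options {
        // displayName reflects the NAME attribute, extendedLanguageTag the LANGUAGE attribute.
        print(option.displayName,
              option.extendedLanguageTag ?? "-",
              option.hasMediaCharacteristic(klareSprache))
    }
}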
Hello,
We have a TV app, based on react-native-video, which was tweaked to suit our requirements.
There is a problem with AirPlay streaming.
An asset can be streamed to an Apple TV, but when we try to stream it to any TV via AirPlay and choose a language different from the default in the manifest, there is a problem.
Seeking freezes the picture and nothing happens. The funny thing is that if we seek back to the starting point +/-20 sec, the video resumes.
The obvious difference from the Apple TV, which we were able to identify, is that with the Apple TV an isPlaybackBufferEmpty event is observed on seek, while with third-party TVs only isPlaybackLikelyToKeepUp events fire.
Maybe there is a solution to this issue? Or at least a way to forcefully empty the buffer when seek is called? See the sketch below for how we observe those flags.
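For reference, this is roughly the native equivalent of what our player wrapper observes (a sketch, not the exact react-native-video code):

import AVFoundation

// Sketch: observe the buffering flags we compared between Apple TV and
// third-party AirPlay receivers. Keep the observations alive while logging.
var bufferingObservations: [NSKeyValueObservation] = []

func observeBuffering(of item: AVPlayerItem) {
    bufferingObservations.append(item.observe(\.isPlaybackBufferEmpty, options: [.new]) { item, _ in
        print("isPlaybackBufferEmpty:", item.isPlaybackBufferEmpty)
    })
    bufferingObservations.append(item.observe(\.isPlaybackLikelyToKeepUp, options: [.new]) { item, _ in
        print("isPlaybackLikelyToKeepUp:", item.isPlaybackLikelyToKeepUp)
    })
}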
Thank you
Dear All,
Since installing the iOS 18 public beta, I can't send music from my iPhone to my old AirPort Express Gen 1 (unable to connect). Is this a general problem?
Thanks for your feedback,
Patrick
I'm trying to make screen mirroring work with my own DNS by creating the static PTR, SRV, and TXT records.
I create all the necessary
b._dns-sd._udp, lb._dns-sd._udp, _services._dns-sd._udp and corresponding _airplay._tcp, _raop._tcp records.
That seems to work, as I can see the records in dns-sd -Z _airplay._tcp and dns-sd -Z _raop._tcp, or in the Discovery Browser app.
But for some reason I cannot see my displays in the Screen Mirroring menu.
If anyone has any suggestions, I am all ears.
Hi Guys, I would like to ask if anyone knows the FPS of screen recording and airplay on Vision Pro. Airplay refers to mirroring the Vision Pro view to MacBook/iPhone/iPad. Also, is there any way to record the screen with the raw FPS of Vision Pro (i.e., 90)?
Hi,
I am using AVPlayer and want to stream via AirPlay with my encrypted HLS m3u8 URL. I am sharing my code snippet.
let contentUrl = URL(string: String(format: videoUrl))
let headers = ["token": token]
// Pass the auth token as a custom HTTP header on the asset.
let asset = AVURLAsset(url: contentUrl!, options: ["AVURLAssetHTTPHeaderFieldsKey": headers])
let playerItem = AVPlayerItem(asset: asset)
self.avPlayer?.replaceCurrentItem(with: playerItem)
self.avPlayer?.play()
AirPlay is not working on my TV when I start the stream; my encrypted URL won't play.
Is there any way to stream via AirPlay with an encrypted URL? Stuck here.
I have an AVPlayer with an encrypted m3u8 URL, and this is my code snippet:
let contentUrl = URL(string: String(format: videoUrl))
let headers = ["token": token]
let asset = AVURLAsset(url: contentUrl!, options: ["AVURLAssetHTTPHeaderFieldsKey": headers])
let playerItem = AVPlayerItem(asset: asset)
self.avPlayer?.replaceCurrentItem(with: playerItem)
self.avPlayer?.play()
When I try to stream via AirPlay, the content is not displayed. AirPlay is not working with an encrypted URL. How can I stream it? Is there any way?
The device-listing Core Audio API kAudioHardwarePropertyDevices does not list AirPlay devices when a virtual audio driver is selected as the sound output device in System Settings. This virtual audio driver is developed by us and is named BoomAudio.
We need to select BoomAudio as the sound output in System Settings so that we can capture system audio and apply Boom effects/enhancements. But whenever BoomAudio is selected as the sound output, the AirPlay device does not appear in the device-list API, so we cannot play through to the AirPlay output device.
Steps:
Select BoomAudio as the sound output in System Settings. (The issue also occurs if any other sound output device, such as Headphones or Internal Speakers, is selected.)
If an Apple TV is connected, do not AirPlay the system display; only the system's sound output should be AirPlayed.
Build and run the sample project that we have attached, "SampleAirplayAudio".
Click the button "Sound Output Device List".
Output: in the Xcode console, the AirPlay device is not listed.
BoomAudio can be installed from the following path:
https://d3jbf8nvvpx3fh.cloudfront.net/gdassets/airplaydts/Boom+2+Installer.zip
The sample project 'SampleAirplayAudio' is available at this path:
https://d3jbf8nvvpx3fh.cloudfront.net/gdassets/airplaydts/SampleAirplayAudio.zip
We have already raised a bug report in Feedback Assistant; the bug ID is:
FB7543204
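For reference, the relevant part of the sample boils down to the standard kAudioHardwarePropertyDevices query; a minimal sketch (not the exact sample code):

import CoreAudio

// Sketch: ask the HAL for the list of audio device IDs. When BoomAudio is the
// current output, the AirPlay device is missing from this list.
func listAudioDeviceIDs() -> [AudioObjectID] {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyDevices,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain)

    var dataSize: UInt32 = 0
    guard AudioObjectGetPropertyDataSize(AudioObjectID(kAudioObjectSystemObject),
                                         &address, 0, nil, &dataSize) == noErr else { return [] }

    var deviceIDs = [AudioObjectID](repeating: 0,
                                    count: Int(dataSize) / MemoryLayout<AudioObjectID>.size)
    guard AudioObjectGetPropertyData(AudioObjectID(kAudioObjectSystemObject),
                                     &address, 0, nil, &dataSize, &deviceIDs) == noErr else { return [] }
    return deviceIDs
}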
For whoever needs to hear this...
Say you have an AVURLAsset:
let asset = AVURLAsset(url: URL(string: "https://www.example.com/playlist.m3u8")!)
Then say you load that asset into an AVPlayerItem, and would like it to automatically load certain asset keys you're interested in ahead of time:
let playerItem = AVPlayerItem(
    asset: asset,
    automaticallyLoadedAssetKeys: [
        "metadata",
        "commonMetadata",
        "availableMetadataFormats",
        "allMediaSelections",
        "hasProtectedContent",
        "overallDurationHint"])
Among those keys, do not use "tracks" even though it's one of the available options. That will break AirPlay across all platforms (the user chooses an AirPlay destination and the AVPlayerItem's status instantly switches to failed).
Took me far too long to track this down, just wanted to get it out there to save anybody else some time if they ever run into it.
We are facing a weird behaviour when implementing the AirPlay functionality of our iOS app.
When we test our app on Apple TV devices, everything works fine. On some smart TVs with a specific AirPlay receiver version (more details below), the stream gets stuck in the buffering state immediately after switching to AirPlay mode. On other smart TVs, with a different AirPlay receiver version, everything works as expected.
The interesting part is that other free or DRM protected streams, work fine on all devices.
Smart TVs where AirPlay works fine:
AirPlay Version -> 25.06 (19.9.9)
Smart TVs where AirPlay gets stuck in the buffering state:
AirPlayReceiverSDKVersion -> 3.3.0.54
AirPlayReceiverAppVersion -> 53.122.0
You can reproduce this issue using the following stream url:
https://tr.vod.cdn.cosmotetvott.gr/v1/310/668/1674288197219/1674288197219.ism/.m3u8?qual=a&ios=1&hdnts=st=1713194669\~exp=1713237899\~acl=\*/310/668/1674288197219/1674288197219.ism/\*\~id=cab757e3-9922-48a5-988b-3a4f5da368b6\~data=de9bbd0100a8926c0311b7dbe5389f7d91e94a199d73b6dc75ea46a4579769d7~hmac=77b648539b8f3a823a7d398d69e5dc7060632c29
If this link expires, notify me to send a new one for testing.
Could you please provide us with any specific suggestions as to what causes this issue with these particular streams?
Hello everyone, has anyone already managed to successfully deploy an "Airplay discovery broker"?
LG documented the process via a simple diagram and referred me to the airplay APIs without further details...
If anyone has already used this type of architecture, I would be happy to see an example!
Hi,
Subtitles are not rendered when content is streamed via AirPlay. Can you let us know what the expected behavior is for rendering subtitles when content is streamed via AirPlay?
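For context, on the sender side the legible option is selected in the usual way; a minimal sketch of that selection (the function name is illustrative, and it assumes an already-created AVPlayerItem):

import AVFoundation

// Sketch: explicitly select the first subtitle (legible) option on an item.
func enableFirstSubtitle(on item: AVPlayerItem) async throws {
    guard let group = try await item.asset.loadMediaSelectionGroup(for: .legible),
          let option = group.options.first else { return }
    item.select(option, in: group)
}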
I’m using the new ApplicationMusicPlayer support on macOS 14 and playing items from my Apple Music library. I wanted to play this music from my app to an AirPlay destination, so I added an AVRoutePickerView. However, selecting any destination via this view doesn’t make a difference to playback. It continues to play through my Mac's speakers no matter which AirPlay destination I choose.
Also submitted as FB13521393.
On an Apple TV, if an app starts playing a video in AVPlayer while another device is AirPlaying to that Apple TV, the device gets disconnected from AirPlay and the video plays instead. There doesn't seem to be a way to detect programmatically from the Apple TV side whether AirPlay is active, nor a way to prevent this behavior. Is this intentional? It would seem to make more sense to have AirPlay take priority or to push the app to the background.