I'm developing a tutorial style tvOS app with multiple videos. The examples I've seen so far deal with only one video.
Defining the player and source(url) before body view
let avPlayer = AVPlayer(url: URL(string: "https://domain.com/.../.../video.mp4")!)
and then in the body view the video is displayed
VideoPlayer(player: avPlayer)
This allows options such as stop/start etc.
When I try something similar with a video title passed into this view I can't define the player with this title variable.
var vTitle: String
var avPlayer = AVPlayer(url: URL(string: "https://domain.com/.../.../" + vTitle + ".mp4")!)
var body: some View {
I get an error saying vTitle can't be used in the URL above the body view.
Any thoughts or suggestions? Thanks
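One way around this, since stored-property initializers run before self is available and therefore can't read vTitle, is to build the player in the view's initializer (or as a computed property). A minimal sketch, with an illustrative URL path standing in for the real one:

import SwiftUI
import AVKit

struct TutorialVideoView: View {
    let vTitle: String
    private let avPlayer: AVPlayer

    init(vTitle: String) {
        self.vTitle = vTitle
        // "path/to" is a placeholder for the real base URL; the force-unwrap
        // is kept only to mirror the original snippet.
        self.avPlayer = AVPlayer(url: URL(string: "https://domain.com/path/to/\(vTitle).mp4")!)
    }

    var body: some View {
        VideoPlayer(player: avPlayer)
    }
}

Creating the player once in init also avoids rebuilding an AVPlayer every time body is evaluated.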
Posts under tvOS tag
86 Posts
I have a TVML style app on the app store that no longer seems to work. I'm working on converting it to SwiftUI after seeing the WWDC video "Migrate your TVML app to SwiftUI".
I've got most of the code working, up until trying to display video from a remote source (my website). It looks like the network connection may be blocked.
In a macOS app I see App Sandbox capabilities that include network access, but I don't see that option for the tvOS app. Am I missing something, or is it not needed and I should look elsewhere?
Thanks, David
Hi, a little context: by environment variables I mean things like $HOME on Linux. I know that there are some standard and some app-specific environment variables on the platforms mentioned above.
Is it possible for the user to set environment variables? If so, in what ways can users set environment variables on these platforms, either system-wide or per app, as they would on macOS/Linux/Windows?
And is it possible for developers of the app to do the same? If so, how?
What I have found so far is that it cannot be done by users, and there is one place in Xcode where I, as a developer, can set environment variables for my app from the scheme. This isn't available when I ship just the installation binary to the end user; it only applies when the app is run through Xcode. I just need validation that my understanding is correct and that there aren't other ways to do this as a user or developer without jailbreaking.
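For completeness, an app can at least read whatever environment it was launched with via ProcessInfo; a minimal sketch (the loop is just illustrative):

import Foundation

// Prints the environment the process was launched with. On iOS/tvOS this normally
// contains only system-provided values, plus anything injected by an Xcode scheme
// during a debug run; there is no supported way for end users to add to it.
let environment = ProcessInfo.processInfo.environment
for (key, value) in environment.sorted(by: { $0.key < $1.key }) {
    print("\(key) = \(value)")
}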
I've seen the Multiview feature on tvOS that displays a small grid icon when available. However, I can only find this functionality in visionOS, via AVMultiviewManager. Does this feature go by a different name on tvOS?
Relevant Links:
https://www.reddit.com/r/appletv/comments/12opy5f/handson_with_the_new_multiview_split_screen/
https://www.pocket-lint.com/how-to-use-multiview-apple-tv/#:~:text=You'll%20see%20a%20grid,running%20at%20the%20same%20time.
When you launch Xcode and then open Devices and Simulators, and connect your Apple TV 4K, you have a new menu item in Settings named Developer. If you close Xcode, the item stays, but if you restart the Apple TV 4K, it's gone until the next time you open Xcode and pair it again.
Is there a way to leave it there permanently? I'm not a developer, but it's still useful to me because it has the playback HUD with lots of information about the codecs, streaming bitrate and so on, and since I'm an A/V nerd, that is quite useful to me.
Hi guys,
I'm trying to add accessibility labels to static text and custom SwiftUI views. Example:
MyView {
...
}
//.accessibilityElement()
.accessibilityElement(children: .combine)
//.accessibilityRemoveTraits(.isStaticText)
//.accessibilityAddTraits(.isButton)
.accessibilityLabel("ACCESSIBILITY LABEL")
.accessibilityHint("ACCESSIBILITY HINT")
When using the 'voiceover' or 'hover text' accessibility features, focus moves only between active elements and not to static elements.
When I add .focusable() it works, but I don't want to make those elements focusable when all accessibility features are off.
I suppose I could do something like this:
.focusable(UIApplication.shared.accessibility.voiceOver.isOn || UIApplication.shared.accessibility.hoverText.isOn)
Note: this is just pseudocode, because I don't remember exactly how to detect current accessibility settings.
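In real API terms, that check could look roughly like the following, using SwiftUI's accessibilityVoiceOverEnabled environment value (I'm not aware of a public flag for Hover Text, so that part is left out; the view and strings are illustrative):

import SwiftUI

struct ChannelRow: View {
    // True while VoiceOver is running; updates automatically.
    @Environment(\.accessibilityVoiceOverEnabled) private var voiceOverOn

    var body: some View {
        Text("Sci-Fi * 8pm - 10pm * Remaining 40 min. * Live")
            // Only become focusable when VoiceOver is active, so normal
            // remote navigation is unaffected.
            .focusable(voiceOverOn)
            .accessibilityLabel("Sci-Fi, 8 to 10pm, Remaining 40 min. Broadcasting Live")
    }
}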
However, using focusable() with conditions on hundreds of static texts in an app seems like overkill. Also, accessibility focus is needed on some control containers where we already have slightly more complex focus handling, with conditions in focusable(...) on parent and child elements, so extending it for accessibility seems too complicated.
Is there a simple way to tell accessibility that an element is focusable specifically for 'hover text' and for 'voiceover'?
Example of what I want to accomplish for TV content:
VStack {
    HStack {
        Text("Terminator")
        if parentalLock {
            Image(named: .lock)
        }
    }
    .accessibilityLabel(for: hover, "Terminator - parental lock")

    Text("Sci-Fi * 8pm - 10pm * Remaining 40 min. * Live")
        .accessibilityLabel(for: hover, "Sci-Fi, 8 to 10pm, Remaining 40 min. Broadcasting Live")
}
.accessibilityLabel(for: voiceover, "Terminator, Sci-Fi, 8 to 10pm, Remaining 40 min. Broadcasting Live, parental lock")
I watched all the Accessibility WWDC videos from 2016, 2022, and 2024 and googled for several hours, but I couldn't find any solution for static texts and custom views. From those videos it appears .accessibilityLabel() should be enough, but it clearly works only on active elements and does not work for other SwiftUI views on tvOS without focusable().
Can this be done without using focusable() with conditions for detection which accessibility feature is on?
The problem with focusable() would be that for accessibility I may need to read a text on the parent view, while focus needs to be placed on a child element. I remember problems where, when focusable() was set on the parent view, the child was not focusable, or something like that; simply put, complications in the focus logic.
Thanks.
Our tvOS app makes use of top shelf Carousel style slides to promote our content.
We would like to detect when tvOS transitions between individual top shelf slides, regardless of whether the slide transition is made by a user (via the Siri remote), or by the system idle auto-transition.
Has anyone achieved this? Maybe there are undocumented system hooks or events we can listen to?
I am encountering an issue when making an API call using URLSession with DispatchQueue.global(qos: .background).async on a real device running tvOS 18. The code works as expected on tvOS 17 and in the simulator for tvOS 18, but on a real device, outside of debug mode, the data takes several minutes (5 to 10) to load after the API call.
Code: Here’s the code I am using for the API call:
appconfig.getFeedURLData(feedUrl: feedUrl, timeOut: kRequestTimeOut, apiMethod: ApiMethod.POST.rawValue) { (result) in
    self.EpisodeItems = Utilities.sharedInstance.getEpisodeArray(data: result)
}
func getFeedURLData(feedUrl: String, timeOut: Int, apiMethod: String, completion: @escaping (_ result: Data?) -> ()) {
    guard let validUrl = URL(string: feedUrl) else { return }
    var request = URLRequest(url: validUrl, cachePolicy: .useProtocolCachePolicy, timeoutInterval: TimeInterval(timeOut))
    let userPasswordString = "\(KappSecret):\(KappPassword)"
    let userPasswordData = userPasswordString.data(using: .utf8)
    let base64EncodedCredential = userPasswordData!.base64EncodedString(options: .lineLength64Characters)
    let authString = "Basic \(base64EncodedCredential)"
    let headers = [
        "authorization": authString,
        "cache-control": "no-cache",
        "user-agent": "TN-CTV-\(kPlateForm)-\(kAppVersion)"
    ]
    request.addValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpMethod = apiMethod
    request.allHTTPHeaderFields = headers
    let response = URLSession.requestSynchronousData(request as URLRequest)
    if response.1 != nil {
        do {
            guard let parsedData = try JSONSerialization.jsonObject(with: response.1!, options: .mutableContainers) as? AnyObject else {
                print("Error parsing data")
                completion(nil)
                return
            }
            print(parsedData)
            completion(response.1)
            return
        } catch let error {
            print("Error: \(error.localizedDescription)")
            completion(response.1)
            return
        }
    }
    completion(response.1)
}
import Foundation

public extension URLSession {
    static func requestSynchronousData(_ request: URLRequest) -> (URLResponse?, Data?) {
        var data: Data? = nil
        var responseData: URLResponse? = nil
        let semaphore = DispatchSemaphore(value: 0)
        let task = URLSession.shared.dataTask(with: request) { taskData, response, error in
            data = taskData
            responseData = response
            if data == nil, let error = error {
                print(error)
            }
            semaphore.signal()
        }
        task.resume()
        _ = semaphore.wait(timeout: .distantFuture)
        return (responseData, data)
    }

    static func requestSynchronousDataWithURLString(_ requestString: String) -> (URLResponse?, Data?) {
        guard let url = URL(string: requestString.checkValidUrl()) else { return (nil, nil) }
        let request = URLRequest(url: url)
        return URLSession.requestSynchronousData(request)
    }
}
Issue description: Working scenario: the API call works fine on tvOS 17 and in the simulator for tvOS 18. Problem: when running on a real device with tvOS 18, the API call is slow when debug mode is disabled but works fine when debug mode is enabled; the data only loads after a few minutes.
Error message: Error Domain=WKErrorDomain Code=11 "Timed out while loading attributed string content" UserInfo={NSLocalizedDescription=Timed out while loading attributed string content} NSURLConnection finished with error - code -1001 nw_read_request_report [C4] Receive failed with error "Socket is not connected" Snapshot request 0x30089b3c0 complete with error: <NSError: 0x3009373f0; domain: BSActionErrorDomain; code: 1 ("response-not-possible")> tcp_input [C7.1.1.1:3] flags=[R] seq=817957096, ack=0, win=0 state=CLOSE_WAIT rcv_nxt=817957096, snd_una=275546887
Environment: Xcode version: 16.1 Real device: Model A1625 (32GB) tvOS version: 18.1
Debugging steps I’ve taken: I’ve verified that the issue does not occur in debug mode. I’ve confirmed that the API call works fine on tvOS 17 and in the simulator (tvOS 18). The error suggests a network timeout (-1001) and a socket connection issue ("Socket is not connected").
Questions:
Is this a known issue with tvOS 18 on real devices? Are there any specific settings or configurations in tvOS 18 that could be causing the timeout error in non-debug mode? Could this be related to how URLSession or networking behaves differently in release mode? I would appreciate any help or insights into this issue!
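One variable worth eliminating is the semaphore-based synchronous wrapper itself: an async/await version avoids blocking a background-QoS thread while waiting for the response. A minimal sketch, with names that are illustrative rather than taken from the project:

import Foundation

// Hypothetical async replacement for requestSynchronousData; the caller awaits
// instead of blocking a DispatchQueue.global(qos: .background) thread.
func fetchFeedData(for request: URLRequest) async throws -> Data {
    let (data, response) = try await URLSession.shared.data(for: request)
    if let http = response as? HTTPURLResponse {
        print("Status code: \(http.statusCode)")
    }
    return data
}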
In our application, we store user information (Username, Password, accessToken, Refresh token, etc.) in the keychain. However, after performing a hard reboot (unplugging and plugging back in), when we attempt to retrieve the ‘refresh token’ or ‘access token’ from the keychain, we receive the old token instead of the newly saved one.
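For comparison, a typical generic-password write that replaces any existing item looks roughly like this (a sketch assuming raw SecItem APIs; the helper and account names are illustrative and this is not presented as the app's actual code):

import Foundation
import Security

// Illustrative helper: deletes any existing item for the account before adding the
// new value, so a stale copy can't shadow the freshly saved token.
func saveToken(_ token: String, account: String) -> OSStatus {
    let base: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrAccount as String: account
    ]
    SecItemDelete(base as CFDictionary)

    var add = base
    add[kSecValueData as String] = Data(token.utf8)
    add[kSecAttrAccessible as String] = kSecAttrAccessibleAfterFirstUnlock
    return SecItemAdd(add as CFDictionary, nil)
}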
I have an audio app that can play audio on an AirPlay device.
On non-Apple TV devices, the AirPlay app (on Roku, Samsung, etc.) shows the now playing metadata: title, artist, and album art.
However, on tvOS 18.1, no metadata is shown. The Apple TV device plays the audio, but there is no now playing information shown, nor any other indicators.
Other media apps show the "Now Playing" controls on the upper right of the tvOS home screen.
Can someone point me in the direction of how to solve this issue? I think I am missing something somewhere in regards to the tvOS metadata implementation.
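For reference, the "Now Playing" display on tvOS (and, in my experience, on AirPlay receivers generally) is typically fed from MPNowPlayingInfoCenter on the sending app, with MPRemoteCommandCenter handlers needed before playback controls appear. A minimal sketch with placeholder values:

import MediaPlayer

// Illustrative setup; the function name and metadata values are placeholders.
func configureNowPlaying(title: String, artist: String, duration: TimeInterval, elapsed: TimeInterval) {
    // Metadata that the system "Now Playing" display reads.
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: title,
        MPMediaItemPropertyArtist: artist,
        MPMediaItemPropertyPlaybackDuration: duration,
        MPNowPlayingInfoPropertyElapsedPlaybackTime: elapsed,
        MPNowPlayingInfoPropertyPlaybackRate: 1.0
    ]

    // Controls generally appear only once at least one remote command has a handler.
    _ = MPRemoteCommandCenter.shared().playCommand.addTarget { _ in .success }
    _ = MPRemoteCommandCenter.shared().pauseCommand.addTarget { _ in .success }
}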
Please help! I have a subscription IAP failing on tvOS 18.2 at:
func makePurchase(_ product: Product) async throws {
    let result = try await product.purchase() // ERROR OCCURS HERE (see error message below)
    ...
Xcode Console message: "Could not get confirmation scene ID for [insert my IAP id here]"
The IAP subscription was working fine on 18.1 and earlier, and the same IAP and code is also running fine on iOS 18.2. The tvOS error on 18.2 happens both in production and sandbox.
Are there any changes to StoreKit 2 which might cause this error?
I am a developer working on tvOS apps for Enterprise.
I would like to report an issue occurring in tvOS 18.1.
Workflow:
Export the tvOS build from Xcode (Enterprise).
We have developed another Mac app to white-label the tvOS app.
We change the app's assets and bundle identifier during the white-labeling process.
Resign the app.
Distribute through JAMF.
Problem:
We have been facing this problem for a while.
The top shelf is not working in our white-labeled app, and the app icon is not appearing.
The app itself opens and runs fine.
I'm just getting the following lines in the console (Apple TV device console):
MDMProvisoningProfileTrust could not find record of managed app 'com.company.connecttv.events' with error: Error Domain=NSOSStatusErrorDomain Code=-10814 "(null)" UserInfo={_LSLine=1734, _LSFunction=runEvaluator}
Unable to create LSApplicationRecord for application: com.company.connecttv.events. error=Error Domain=NSOSStatusErrorDomain Code=-10814 UserInfo={_LSLine=1734, _LSFunction=<private>}
Top shelf content doesn't conform to TVTopShelfContentPrivate. It will not be ignored
[xpcservice<com.company.connecttv.events.Top-Shelf([app<com.apple.HeadBoard(E4023AC5-8955-4943-8AAF-9C8246A9F003)>:156])>{vt hash: 0}:328] Memorystatus failed with unexpected error: Invalid argument (22)
<0x302e2eee0-com.company.connecttv.events>: Error returned on remoteObjectProxy for loadTopShelfContentAndDelegateFlags message. error=Error Domain=NSCocoaErrorDomain Code=4097 UserInfo={NSDebugDescription=<private>}
Process state is unknown AppStateTracker, pid 328, uuid 809A68F1-76D2-396E-9CE0-5C16058FE354 display identifier com.company.connecttv.events.Top-Shelf foreground 0
System Information:
Machine: MacBook Pro / M1 Pro / 16 GB / 16-inch / 2021
macOS: Version 15.0 (15.0)
Xcode: Version 16.1 (16B40)
Apple TV Information:
tvOS: 18.1 (22J580)
Model: Apple TV 4K (3rd generation)
Capacity: 128 GB
I'm looking to develop an iOS application that functions as a remote for Apple TV, including discovering Apple TV devices over Wi-Fi. If anyone has experience building similar applications, could you share insights on available frameworks or protocols to discover Apple TVs? Additionally, if there are reference apps on the App Store that work like Apple's default remote app, I would greatly appreciate recommendations.
Any guidance from developers who have worked on similar projects would be very helpful!
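For the discovery part specifically, one common approach is Bonjour browsing with the Network framework. A minimal sketch, where the service type (_airplay._tcp, one of the services Apple TVs advertise) is used purely for illustration and a real remote would target whatever protocol it actually speaks:

import Network

// Sketch of Bonjour discovery on the local network.
final class AppleTVFinder {
    private let browser = NWBrowser(for: .bonjour(type: "_airplay._tcp", domain: nil), using: .tcp)

    func start() {
        browser.browseResultsChangedHandler = { results, _ in
            for result in results {
                if case let .service(name, type, domain, _) = result.endpoint {
                    print("Found \(name) (\(type) in domain \(domain))")
                }
            }
        }
        browser.start(queue: .main)
    }
}

Note that on recent iOS versions the app also needs NSLocalNetworkUsageDescription and an NSBonjourServices entry in its Info.plist before Bonjour browsing returns results.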
After updating to tvOS 18 I've noticed the following change in focus behavior:
When a UICollectionView is located next to any other view (even one with user interaction disabled), attempting to move focus within the collection view, in that view's direction, to the next collection view item that is currently outside the visible area has no effect.
It's as if the focus engine preferred to move focus to that other view (despite it not being focusable) over a currently off-screen collection view cell.
This didn't happen in tvOS 17 and earlier.
A minimal reproducible project can be found here.
Steps to reproduce:
Launch the app on a tvOS 18 device.
Swipe right and observe that focus moves without issues.
Swipe left and observe that in most cases focus doesn't move to items beyond the visible area, due to an empty grey view located to the left of the collection view.
Compare to the same app running on tvOS 17 and observe that focus moves both sides without issues.
The issue is reproducible on Xcode versions 16.0 (16A242d) and 15.4 (15F31d).
Did anyone face the same issue and/or has suggestions on a possible fix?
When the native info panel (which displays the title, subtitle, description, and custom buttons) opens, the focus immediately shifts to the first button. As a result, VoiceOver skips the description, which is crucial for users relying on accessibility features.
I haven’t found a way to detect when it opens. Knowing this would allow me to trigger custom VoiceOver announcements or adjust the focus order dynamically.
Are any other people experiencing this issue, and how do we solve it?
Hello everyone,
Recently, I encountered an issue in my tvOS app where setting one UITextField property, isSecureTextEntry, to true was preventing another property, keyboardType, from functioning correctly. In my case, keyboardType is set to the numberPad option.
The problem is that on the first tap on the text field, the default keyboard with numbers, letters, and some special characters opens. However, after the second tap, the correct keyboard type with only numbers appears, as I want. Removing isSecureTextEntry, or setting it to false, solves the problem.
import UIKit

class ViewController: UIViewController {
    private let textField = UITextField()

    override func viewDidLoad() {
        super.viewDidLoad()
        textField.keyboardType = .numberPad
        textField.isSecureTextEntry = true
        view.addSubview(textField)
        setupConstraints()
    }
}
I am familiar with providing the main layered app icon for a tvOS project, but I cannot work out how to successfully add support for alternative layered icons in the info.plist and/or asset catalog and/or build settings for a SwiftUI (or Swift) tvOS project.
Whenever I check the result of UIApplication.shared.supportsAlternateIcons it returns false
I have witnessed other tvOS apps such as Plex successfully switch the app icon, so it must be possible to do.
How should the project be configured, what am I missing?
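For reference, the runtime side is the same call as on iOS (available on tvOS as well); the open question above is purely the asset catalog and build configuration that makes supportsAlternateIcons return true. A minimal sketch of the call, with an illustrative icon name:

import UIKit

// "AltIcon" is an illustrative alternate icon name; it must match whatever alternate
// App Icon & Top Shelf Image stack the project actually defines.
func switchIcon() {
    guard UIApplication.shared.supportsAlternateIcons else {
        print("Alternate icons not supported or not configured")
        return
    }
    UIApplication.shared.setAlternateIconName("AltIcon") { error in
        if let error = error {
            print("Failed to switch icon: \(error)")
        }
    }
}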
Hi,
We are going to create a tvOS app with a portrait display (the HDMI screen will be rotated 90 degrees).
It seems there is no rotation setting in tvOS 18, nor does Xcode provide related support.
From our investigation, we might need to rotate each UIKit component 90 degrees in code to achieve it.
Is there any better suggestion?
Thanks.
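If it helps, the brute-force variant of the approach described above is to rotate a single root container view with a transform rather than each component individually; a sketch only, untested against a real portrait installation:

import UIKit

final class PortraitContainerViewController: UIViewController {
    // All app content goes inside this container, which is rotated 90 degrees.
    let contentView = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(contentView)
    }

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        // Swap width and height so content lays out in portrait, then rotate.
        let bounds = view.bounds
        contentView.bounds = CGRect(x: 0, y: 0, width: bounds.height, height: bounds.width)
        contentView.center = CGPoint(x: bounds.midX, y: bounds.midY)
        contentView.transform = CGAffineTransform(rotationAngle: .pi / 2)
    }
}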
In tvOS 18 the onMoveCommand is missing the first press after a view is loaded and every time the direction is changed. It also misses the first press on a button after a focus change. This appears to only impact the newer silver remote and not the older black remote or IR remotes.
With the code below, press any direction 3 times and it will only log twice.
struct ButtonTest: View {
    var body: some View {
        VStack {
            Button {
                debugPrint("button 1")
            } label: {
                Text("Button 1")
            }
            Button {
                debugPrint("button 2")
            } label: {
                Text("Button 2")
            }
            Button {
                debugPrint("button 3")
            } label: {
                Text("Button 3")
            }
        }
        .onMoveCommand(perform: { direction in
            debugPrint("move \(direction)")
        })
        .padding()
    }
}
tvOS 18 does not pass through multichannel audio for streaming apps offering content where it is promoted as available. This is true for devices on which the functionality existed before the 18.0 tvOS update. What's more, the 18.1 public beta did not resolve the issue. All streaming apps appear to be affected. Notably, Home Sharing does not appear to be affected and continues to provide multichannel audio as it did before the 18.0 update.